Jensen’s Inequality in Measure Theory

convex-analysis measure-theory real-analysis

The claim below is quoted from http://www.math.tau.ac.il/~ostrover/Teaching/18125.pdf.

Theorem 3.1 (Jensen's Inequality)

Let $(X,\mathcal{M},\mu)$ be a probability space (a measure space with $\mu(X) = 1$), let $f \in L^1(X, \mu)$ be real-valued, and let $\psi: \mathbb R \to \mathbb R$ be a convex function. Then $$\psi\left(\int_X f\, d\mu\right) \le \int_X (\psi \circ f)\,d\mu.$$

Proof:

Since $\psi$ is convex, at each $x_0 \in \mathbb R$ there exist $a,b \in \mathbb R$ such that $\psi(x_0) = ax_0 + b$ and $\psi(x) \ge ax + b$ for all $x \in \mathbb R$ (here $y = ax + b$ is a supporting line of the epigraph of $\psi$ at $x_0$). Let $x_0 = \int_X f\,d\mu$. Then $$\psi\left(\int_X f\, d\mu\right) = \psi(x_0) = ax_0 + b = a\int_X f\,d\mu + b = \int_X (af+b)\,d\mu \le \int_X (\psi\circ f)\,d\mu,$$ where the last equality uses $\int_X b\,d\mu = b\,\mu(X) = b$ (this is where $\mu(X)=1$ enters), and the final inequality holds because $\psi(f(x)) \ge af(x) + b$ pointwise. Q.E.D.
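As a quick numerical sanity check of the inequality (not part of the proof above), here is a minimal Python sketch on an assumed probability space: $X=[0,1]$ with Lebesgue measure, $f(x)=x$, and the convex function $\psi(t)=e^t$. The interval and functions are illustrative choices, not from the source.

```python
# Numerical check of Jensen's inequality on the probability space ([0,1], Lebesgue).
# Illustrative choices: f(x) = x (in L^1), psi(t) = exp(t) (convex).
import numpy as np
from scipy.integrate import quad

f = lambda x: x
psi = np.exp

lhs = psi(quad(f, 0, 1)[0])                # psi( integral of f dmu )  ~ 1.6487
rhs = quad(lambda x: psi(f(x)), 0, 1)[0]   # integral of (psi o f) dmu ~ 1.7183

print(lhs, rhs, lhs <= rhs)                # expect: True
```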

I have two questions. First, how is the existence of such $a, b$ guaranteed? Second, if $\mu$ is not a probability measure but the Lebesgue measure, or even a general measure, does Jensen's inequality still hold? (I suspect $\mu$ should at least be a finite measure.)

Best Answer

A counterexample with an infinite measure. Let $m$ be the Lebesgue measure on $[1,\infty)$, $f(x)=x^{-1}$, and $\varphi(x)=x^2$. Then

$$\varphi\left(\int f\,dm\right)=\infty \quad\text{and}\quad \int \varphi\circ f\,dm=1.$$
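For completeness, here is a small SymPy sketch that checks the two integrals behind this counterexample, using exactly the setup stated above ($f(x)=x^{-1}$, $\varphi(x)=x^2$ on $[1,\infty)$); it is only a symbolic verification, not part of the answer's argument.

```python
# Symbolic check of the counterexample on [1, oo) with Lebesgue measure.
import sympy as sp

x = sp.symbols('x', positive=True)
f = 1 / x

int_f = sp.integrate(f, (x, 1, sp.oo))          # integral of f dm diverges
int_phi_f = sp.integrate(f**2, (x, 1, sp.oo))   # integral of (phi o f) dm

print(int_f)       # oo  -> phi( integral of f dm ) = oo
print(int_phi_f)   # 1   -> integral of phi(f) dm  = 1
```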