Does the variance of these random variables tend to zero?

convex-analysis, integral-inequality, probability-theory, real-analysis, variance

Let $X$ be a probability space, and let $F:[0,\infty) \to [0,\infty)$ be a $C^2$ strictly convex function. Suppose that $F''$ is an everywhere positive strictly decreasing function, and that $\lim_{x \to \infty} F''(x)=0$.

Let $g_n:X \to [0,\infty)$ be measurable, with constant expectation $\int_X g_n=c>0$, and suppose that

$$\lim_{n \to \infty} \left(\int_X F(g_n)-F\!\left(\int_X g_n\right)\right)=0.$$

Is $\lim_{n \to \infty} \int_X (g_n-c)^2=0$?


A sharpened form of Jensen's inequality implies that
$$
\int_X F(g_n)-F\!\left(\int_X g_n\right) \ge \left(\inf_{x}\frac{F''(x)}{2}\right) \int_X (g_n-c)^2.
$$

This naive estimate does not help, since $\inf_{x} F''(x)=0$.

My intuition is that a small "Jensen gap" forces $g_n$ to attain very high values on a set of large measure, in order to stay in the region where $F''$ is small. This should be incompatible with the constraint $E(g_n)=c$: more precisely, $g_n$ would have to lie below $c$ on a non-negligible set, where $F''$ is bounded from below.

Best Answer

Not necessarily. For simplicity take $c=1$. Take $F(x) = e^{-x}$ for all $x \geq 0$ (so $F''(x)=F(x)$ for all $x \geq 0$). For $n \in \{1, 2, 3, ...\}$ define random variables $Y_n$ by $$ Y_n = \left\{\begin{array}{ll} 1 - \frac{1}{\sqrt{n}} & \mbox{ with prob $1-1/n$}\\ 1+ \frac{n-1}{\sqrt{n}} & \mbox{ with prob $1/n$} \end{array}\right.$$ Then

  • $E[Y_n]=1$ for all $n \in \{1, 2, 3, ...\}$.

  • $\lim_{n\rightarrow\infty} E[F(Y_n)] = F(1)$.

  • $\lim_{n\rightarrow\infty} E[(Y_n-1)^2]=1$.
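The three bullet points can be sanity-checked numerically. This is a sketch; `moments` is a helper name introduced here, and the closed forms it evaluates are exactly the two-point distribution of $Y_n$ from the answer, with $F(x)=e^{-x}$:

```python
import math

def moments(n):
    """Moments of the two-point counterexample Y_n:
    value a = 1 - 1/sqrt(n)       with probability 1 - 1/n,
    value b = 1 + (n - 1)/sqrt(n) with probability 1/n."""
    a, pa = 1 - 1 / math.sqrt(n), 1 - 1 / n
    b, pb = 1 + (n - 1) / math.sqrt(n), 1 / n
    mean = pa * a + pb * b                        # E[Y_n]
    var = pa * (a - 1) ** 2 + pb * (b - 1) ** 2   # E[(Y_n - 1)^2]
    ef = pa * math.exp(-a) + pb * math.exp(-b)    # E[F(Y_n)] with F(x) = exp(-x)
    return mean, var, ef

for n in (10, 10**3, 10**6):
    mean, var, ef = moments(n)
    # mean stays at 1, var = (n-1)/n -> 1, E[F(Y_n)] -> F(1) = e^{-1}
    print(n, mean, var, ef - math.exp(-1))
```

As $n$ grows, the printout shows the mean pinned at $1$, the variance approaching $1$, and the Jensen gap $E[F(Y_n)]-F(1)$ shrinking to $0$, matching the three claims.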


On the other hand, if the $Y_n$ are uniformly bounded, so that $Y_n(\omega) \in [0,M]$ for all $n \in \{1, 2, 3, ...\}$ and all $\omega$ in the sample space, then the claim is true, because $F:[0,M]\rightarrow\mathbb{R}$ is strongly convex with parameter $F''(M)>0$:
$$ F(x) \geq F(c) + F'(c)(x-c) + \frac{F''(M)}{2}(x-c)^2 \quad \forall x \in [0,M].$$
So for all $n \in \{1, 2, 3, ...\}$ we have
$$ F(Y_n) \geq F(c) + F'(c)(Y_n-c) + \frac{F''(M)}{2}(Y_n-c)^2. $$
Taking expectations of both sides and using $E[Y_n]=c$ gives
$$ E[F(Y_n)] \geq F(c) + \frac{F''(M)}{2}E[(Y_n-c)^2] \quad \forall n \in \{1, 2, 3, ...\}.$$
Taking the limit of both sides as $n\rightarrow\infty$ and using $\lim_{n\rightarrow\infty} E[F(Y_n)] = F(c)$ gives
$$ 0 \geq \frac{F''(M)}{2}\lim_{n\rightarrow\infty} E[(Y_n-c)^2], $$
and so $\lim_{n\rightarrow\infty} E[(Y_n-c)^2]=0$.