[Math] Convergence of expected values as random variables converge almost surely

Tags: probability, probability distributions, probability theory

Suppose I have a sequence of random variables $X_n$ that converges to a random variable $X$ almost surely as $n\to\infty$. How can I prove that $\lim_{n\to\infty}\mathbb{E}[X_n]=\mathbb{E}[X]$, where $\mathbb{E}[\cdot]$ denotes the expected value?

Best Answer

In general, as pointed out by Siméon, it is not true that $\mathbb E[X_n]\to\mathbb E[X]$.
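A quick numerical illustration (my own addition, not part of the original answer) of the standard counterexample $X_n = n\,\mathbf{1}_{[0,1/n]}$ on $([0,1],\text{Lebesgue})$: here $X_n \to 0$ almost surely, yet $\mathbb{E}[X_n] = 1$ for every $n$, so the expectations do not converge to $\mathbb{E}[0] = 0$.

```python
import numpy as np

# Standard counterexample: on ([0,1], Lebesgue measure), let
#   X_n = n * 1_{[0, 1/n]}.
# For every omega > 0 we have X_n(omega) = 0 once n > 1/omega,
# so X_n -> 0 almost surely; yet E[X_n] = n * (1/n) = 1 for all n.

rng = np.random.default_rng(0)
U = rng.uniform(size=1_000_000)      # sample points omega ~ Uniform(0,1)

for n in (1, 10, 100, 1000):
    X_n = n * (U <= 1.0 / n)         # X_n(omega) = n * 1_{omega <= 1/n}
    print(f"n = {n:4d}   empirical E[X_n] ~ {X_n.mean():.3f}")
# Each empirical mean stays near 1, while the a.s. limit X = 0 has E[X] = 0.
```

The mass of $X_n$ escapes through ever-taller, ever-narrower spikes, which is exactly the behaviour that uniform integrability rules out.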

However, if the family $\{X_n,n\geqslant 1\}$ is uniformly integrable, then $\mathbb E[X_n]\to\mathbb E[X]$ does hold; this can be proved with a truncation and a $2\varepsilon$-argument.
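A sketch of the truncation argument (standard, filled in here since the answer only names it): write $X_n^K := X_n\,\mathbf{1}_{\{|X_n|\leqslant K\}}$ and $X^K := X\,\mathbf{1}_{\{|X|\leqslant K\}}$ for a truncation level $K>0$, and split

```latex
\begin{align*}
\bigl|\mathbb{E}[X_n]-\mathbb{E}[X]\bigr|
  &\leqslant \mathbb{E}\bigl[|X_n|\mathbf{1}_{\{|X_n|>K\}}\bigr]
   + \bigl|\mathbb{E}[X_n^K]-\mathbb{E}[X^K]\bigr|
   + \mathbb{E}\bigl[|X|\mathbf{1}_{\{|X|>K\}}\bigr].
\end{align*}
```

By uniform integrability (and Fatou's lemma for the limit $X$), one can choose $K$ so that the first and third terms are each $<\varepsilon$ uniformly in $n$; choosing $K$ with $\mathbb P(|X|=K)=0$ gives $X_n^K\to X^K$ almost surely, so the middle term tends to $0$ by bounded convergence ($|X_n^K|\leqslant K$). Hence $\limsup_n|\mathbb E[X_n]-\mathbb E[X]|\leqslant 2\varepsilon$ for every $\varepsilon>0$.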

If the $X_n$ are non-negative, the converse also holds: if $X_n\to X$ almost surely and $\mathbb E[X_n]\to\mathbb E[X]<\infty$, then the family $\{X_n,n\geqslant 1\}$ is uniformly integrable (this is essentially Scheffé's lemma).
