Convergence of the expectation of a bounded random variable

convergence-divergence, expected-value, probability-theory, sequences-and-series

I'm trying to prove the following statement. Let $(X_n)$ be a sequence of random variables such that $\lvert X_n \rvert \leq C$ almost surely, for some constant $C > 0$. If $X_n$ converges in probability to $0$, then $\mathbb{E}[\lvert X_n \rvert]$ converges to $0$.

The converse is straightforward by Markov's inequality, but I don't know how to start on this direction. I know that it suffices to prove that $\forall \epsilon > 0, \exists N \in \mathbb{N}, \forall n \geq N, \mathbb{E}[\lvert X_n \rvert] < \epsilon$.
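(For the record, the converse direction mentioned above goes as follows: by Markov's inequality, for every $\varepsilon > 0$,
$$ \mathbb P(\lvert X_n \rvert \ge \varepsilon) \le \frac{\mathbb E[\lvert X_n \rvert]}{\varepsilon}, $$
so $\mathbb E[\lvert X_n \rvert] \to 0$ forces $X_n \to 0$ in probability.)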

Best Answer

Let $\varepsilon>0$. Since $\lvert X_n\rvert\le C$ almost surely, $$ \mathbb E[\lvert X_n\rvert]=\mathbb E[\lvert X_n\rvert\,\mathbf 1_{\{\lvert X_n\rvert\ge\varepsilon\}}]+\mathbb E[\lvert X_n\rvert\,\mathbf 1_{\{\lvert X_n\rvert<\varepsilon\}}]\le C\,\mathbb P(\lvert X_n\rvert\ge\varepsilon)+\varepsilon. $$ Because $X_n\to 0$ in probability, $\mathbb P(\lvert X_n\rvert\ge\varepsilon)\to 0$, hence $\limsup_{n\to+\infty}\mathbb E[\lvert X_n\rvert]\le\varepsilon$. Since $\varepsilon>0$ was arbitrary, $\limsup_{n\to+\infty}\mathbb E[\lvert X_n\rvert]=0$, and since $\mathbb E[\lvert X_n\rvert]\ge 0$, this means $\mathbb E[\lvert X_n\rvert]$ converges to $0$ as $n\to+\infty$.
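As a quick numerical sanity check (not part of the proof), here is a minimal sketch using a hypothetical example not taken from the question: $X_n = C\cdot\mathrm{Bernoulli}(1/n)$, which satisfies $\lvert X_n\rvert\le C$ and $\mathbb P(\lvert X_n\rvert\ge\varepsilon)=1/n\to 0$, so $X_n\to 0$ in probability. Both the empirical mean of $\lvert X_n\rvert$ and the bound $C\,\mathbb P(\lvert X_n\rvert\ge\varepsilon)+\varepsilon$ are visible shrinking as $n$ grows.

```python
import numpy as np

# Hypothetical example (assumption, not from the question):
# X_n = C * Bernoulli(1/n), so |X_n| <= C a.s. and
# P(|X_n| >= eps) = 1/n -> 0, i.e. X_n -> 0 in probability.
rng = np.random.default_rng(0)
C = 5.0
eps = 0.1

for n in [10, 100, 1000, 10000]:
    # Draw 100k Monte Carlo samples of X_n.
    samples = C * (rng.random(100_000) < 1.0 / n)
    emp_mean = samples.mean()          # Monte Carlo estimate of E[|X_n|]
    bound = C * (1.0 / n) + eps        # C * P(|X_n| >= eps) + eps
    print(f"n={n:6d}  E[|X_n|] ~ {emp_mean:.5f}  bound = {bound:.5f}")
```

Here $\mathbb E[\lvert X_n\rvert] = C/n$ exactly, so the printed estimates should track $C/n$ and stay below the bound, consistent with the inequality in the answer.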