Convergence in probability implies mean squared convergence

convergence-divergence, probability-theory

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, let $(X_n)_{n \in \mathbb{N}}$ be a sequence of $\mathcal{F}$-measurable random variables, and let $X$ be another $\mathcal{F}$-measurable random variable. Suppose $X_n \rightarrow X$ in probability, and additionally that $\mathbb{P}(|X_n|<L) = 1$ for all $n \in \mathbb{N}$, where $L$ is a constant independent of $n$. I have to show that $X_n \rightarrow X$ in the mean squared sense, i.e. $\mathbb{E}(X_n - X)^2 \rightarrow 0$ as $n \rightarrow \infty$. How do I go about this? Thanks.

Best Answer

Convergence in probability: For any $\delta>0$, $\lim_{n\to\infty}\mathbb{P}(|X_n-X|>\delta)=0$.
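
As an aside, the uniform bound $L$ is what makes the result true: without it, convergence in probability does not imply $L^2$ convergence. For instance, $X_n = n\,\mathbf{1}_{\{U \le 1/n\}}$ with $U \sim \mathrm{Uniform}(0,1)$ satisfies $X_n \to 0$ in probability, yet $\mathbb{E}[X_n^2] = n^2 \cdot \tfrac{1}{n} = n \to \infty$. Here is a quick numerical sanity check of that counterexample (the example and all names in it are my own illustration, not part of the original problem):

```python
import numpy as np

# Assumed illustration (not from the original post): without a uniform bound,
# convergence in probability does NOT imply L^2 convergence.
# X_n = n * 1[U <= 1/n] with U ~ Uniform(0,1):  X_n -> 0 in probability,
# but E[X_n^2] = n^2 * P(U <= 1/n) = n -> infinity.
rng = np.random.default_rng(1)
N = 10**6  # Monte Carlo sample size

for n in [10, 100, 1000]:
    U = rng.uniform(0.0, 1.0, size=N)
    Xn = n * (U <= 1.0 / n)                    # unbounded spike of height n
    print(f"n={n:>5}  P(|Xn| > 0.1) ~ {np.mean(np.abs(Xn) > 0.1):.4f}"
          f"  E[Xn^2] ~ {np.mean(Xn**2):.1f}")
```

The tail probability estimate shrinks like $1/n$ while the second-moment estimate grows like $n$, which is exactly the failure mode the boundedness assumption rules out.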

Also, since $\mathbb{P}(|X_n|<L)=1$, we have $|X_n|<L$ almost surely for every $n$. Since convergence in probability implies almost sure convergence along a subsequence, we also get $\mathbb{P}(|X|\le L)=1$, i.e. $|X|\le L$ almost surely, and hence $|X_n-X|\le|X_n|+|X|\le 2L$ almost surely. Now let $\delta>0$. We have $$\mathbb{E}|X_n-X|^2=\int|X_n-X|^2\,d\mathbb{P}=\int_{\{|X_n-X|>\delta\}}|X_n-X|^2\,d\mathbb{P}+\int_{\{|X_n-X|\le\delta\}}|X_n-X|^2\,d\mathbb{P}$$ $$\le 4L^2\cdot\mathbb{P}(|X_n-X|>\delta)+\delta^2\to\delta^2\quad\text{as }n\to\infty.$$
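
If it helps to see the bound $\mathbb{E}|X_n-X|^2\le 4L^2\,\mathbb{P}(|X_n-X|>\delta)+\delta^2$ in action, here is a minimal Monte Carlo sketch for one concrete bounded example; the choice of $X$, $X_n$, and every name in the code are assumptions of mine for illustration, not part of the problem:

```python
import numpy as np

# Assumed example (my own choice, not from the original post):
# X ~ Uniform(-1, 1) and X_n = X * 1[U_n > 1/n], so |X_n| <= 1 =: L surely,
# and X_n differs from X only on an event of probability 1/n,
# hence X_n -> X in probability.
rng = np.random.default_rng(0)
N = 10**6      # Monte Carlo sample size
L = 1.0
delta = 0.05

for n in [10, 100, 1000, 10000]:
    X = rng.uniform(-1.0, 1.0, size=N)
    Un = rng.uniform(0.0, 1.0, size=N)
    Xn = X * (Un > 1.0 / n)                 # X_n equals X except w.p. 1/n

    mse = np.mean((Xn - X) ** 2)            # estimates E|X_n - X|^2
    tail = np.mean(np.abs(Xn - X) > delta)  # estimates P(|X_n - X| > delta)
    bound = 4 * L**2 * tail + delta**2      # the bound from the proof

    print(f"n={n:>6}  E|Xn-X|^2 ~ {mse:.5f}  4L^2*P + delta^2 ~ {bound:.5f}")
```

As $n$ grows, the tail probability term vanishes and the estimated mean squared error drops toward $0$, staying below the proof's bound, which itself settles at $\delta^2$.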

This shows $\limsup_{n\to\infty}\mathbb{E}|X_n-X|^2\leq\delta^2$. Since $\delta>0$ was arbitrary, we conclude that $\mathbb{E}|X_n-X|^2\to0$.