Convergence in Probability for Random Variables


Suppose that for a sequence of random variables $X_1, X_2, X_3, \dots$, we know there exists a random variable $X$ such that $E[|X_n - X|] \to 0$ as $n \to \infty$.

I am trying to wrap my head around why $X_n$ must converge to $X$ in probability as well.

I know that we say a sequence of random variables $X_1, X_2, X_3, \dots$ converges in probability to a random variable $X$ if for every $\epsilon > 0$, $P(|X_n - X| \geq \epsilon) \to 0$ as $n \to \infty$. But I am unsure how the condition on the expectation gives any information about this probability. Any insight would be much appreciated!

Best Answer

Markov's Inequality says that if $X$ is a random variable with only non-negative values, then for any value $a > 0$, $P(X\geq a) \leq \frac{E[X]}{a}$.
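As a quick sanity check of Markov's inequality itself, here is a minimal simulation sketch. The choice of an Exponential(1) variable (so $E[X] = 1$) and the sample size are my own illustrative assumptions, not part of the question:

```python
import random

# Empirical check of Markov's inequality for a non-negative random
# variable.  X ~ Exponential(1), so E[X] = 1 (an assumed example).
random.seed(0)
n_samples = 100_000
samples = [random.expovariate(1.0) for _ in range(n_samples)]
mean = sum(samples) / n_samples  # estimate of E[X]

for a in [0.5, 1.0, 2.0, 4.0]:
    tail = sum(x >= a for x in samples) / n_samples  # estimate of P(X >= a)
    bound = mean / a                                 # Markov bound E[X]/a
    print(f"a={a}: P(X >= a) ~ {tail:.4f} <= E[X]/a ~ {bound:.4f}")
```

In each case the estimated tail probability sits below the Markov bound, though the bound can be quite loose.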

Here $|X_n - X|$ is a non-negative random variable, so applying Markov's inequality to it with $a = \epsilon$ gives

$$P(|X_n-X| \geq \epsilon) \leq \frac{E[|X_n-X|]}{\epsilon}$$

Since the right-hand side tends to $0$ as $n \to \infty$ by assumption, so does the left-hand side, which is exactly convergence in probability.
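The argument above can also be watched numerically. The setup below is my own illustrative assumption: take $X_n = X + Z/n$ with $Z \sim \text{Uniform}(-1, 1)$, so $E[|X_n - X|] = E[|Z|]/n = 1/(2n) \to 0$; note that $X$ cancels in the difference, so it never needs to be sampled:

```python
import random

# Simulation (illustrative setup, not from the original answer):
# X_n = X + Z/n with Z ~ Uniform(-1, 1), so |X_n - X| = |Z|/n and
# E[|X_n - X|] = 1/(2n) -> 0.  We estimate both E[|X_n - X|] and
# P(|X_n - X| >= eps), and compare with the Markov bound E[|X_n - X|]/eps.
random.seed(1)
trials = 200_000
eps = 0.01

results = {}
for n in [1, 10, 100]:
    diffs = [abs(random.uniform(-1, 1)) / n for _ in range(trials)]  # |X_n - X|
    l1 = sum(diffs) / trials                      # estimate of E[|X_n - X|]
    prob = sum(d >= eps for d in diffs) / trials  # estimate of P(|X_n - X| >= eps)
    results[n] = (l1, prob)
    print(f"n={n}: E|X_n-X| ~ {l1:.4f}, P(|X_n-X| >= eps) ~ {prob:.4f}, "
          f"Markov bound ~ {l1 / eps:.4f}")
```

Both the $L^1$ distance and the tail probability shrink as $n$ grows (for $n = 100$ the difference never exceeds $\epsilon = 0.01$, so the estimated probability is exactly $0$), and at every $n$ the probability stays below the Markov bound, just as the inequality promises.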
