[Math] Convergence in probability implies convergence in distribution

convergence-divergence, probability, probability-theory, random-variables, weak-convergence

A sequence of random variables $\{X_n\}$ converges to $X$ in probability if, for every $\varepsilon > 0$,
$$P(|X_n-X| \geq \varepsilon) \rightarrow 0 \quad \text{as } n \to \infty.$$

They converge in distribution if
$$F_{X_n}(x) \rightarrow F_X(x)$$
at every point $x$ where $F_X$ is continuous.
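
For concreteness, here is a quick illustrative example (my own, not part of the original question): take $X_n = X + \tfrac{1}{n}$. Then $P(|X_n - X| \geq \varepsilon) = 0$ whenever $n > 1/\varepsilon$, so $X_n \to X$ in probability, and
$$F_{X_n}(x) = F_X\!\left(x - \tfrac{1}{n}\right) \longrightarrow F_X(x^-) = F_X(x)$$
at every point $x$ where $F_X$ is continuous, so $X_n$ converges to $X$ in distribution as well.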

(There is another, equivalent definition of convergence in distribution in terms of weak convergence; it is spelled out below.)
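
For reference, the weak-convergence characterization (a standard fact, stated here for completeness since the answer below relies on it) is
$$X_n \Longrightarrow X \quad\iff\quad E\,f(X_n) \to E\,f(X) \ \text{ for every bounded continuous } f.$$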

It seems like a very simple result, but I cannot think of a clever proof.

Best Answer

A slicker proof (and, more importantly, one that generalizes) than the one in the Wikipedia article is to observe that $X_n \Longrightarrow X$ if and only if $E f(X_n) \to E f(X)$ for all bounded continuous functions $f$.

Now suppose $X_n \to X$ in probability. Since $f$ is continuous, $f(X_n) \to f(X)$ in probability as well, and since $f$ is bounded, the dominated convergence theorem (which holds with convergence in probability in place of almost sure convergence) gives $E |f(X_n) - f(X)| \to 0$, which implies the result.
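
In case the dominated-convergence step is not obvious, here is one way to fill it in explicitly (my own elaboration, via the standard subsequence criterion):
$$|E f(X_n) - E f(X)| \leq E\,|f(X_n) - f(X)|.$$
Given any subsequence, convergence of $X_n$ to $X$ in probability lets us extract a further subsequence $(n_k)$ along which $X_{n_k} \to X$ almost surely; continuity of $f$ then gives $f(X_{n_k}) \to f(X)$ almost surely, and since $|f(X_{n_k}) - f(X)| \leq 2 \sup |f| < \infty$, dominated convergence yields $E\,|f(X_{n_k}) - f(X)| \to 0$. As every subsequence of $E\,|f(X_n) - f(X)|$ has a further subsequence converging to $0$, the whole sequence converges to $0$, and therefore $E f(X_n) \to E f(X)$, i.e. $X_n \Longrightarrow X$.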
