Convergence of random variables in probability implies weak convergence of the corresponding probability measures

Tags: probability, probability-theory

Prove that if $\xi_n \to \xi$ in probability, then $P_{\xi_n} \Rightarrow P_\xi$; that is, convergence of the random variables in probability implies weak convergence of the corresponding probability measures.

Definition: Let $(X,d)$ be a metric space. $C_b(X)$ denotes the space of bounded continuous functions on $X$. The sequence $(P_n)$ converges weakly to the probability measure $P$ if, for each $f \in C_b(X)$,

$$\lim_{n \to \infty} \int_X f(x)\, dP_n(x)=\int_X f(x)\, dP(x).$$
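(As a quick sanity check of this definition, not part of the question itself, here is a minimal Monte Carlo sketch in Python. The choices $P_n = \mathcal N(1/n,1)$, $P = \mathcal N(0,1)$, and the test function $f=\tanh$ are mine, purely for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # A bounded continuous test function, i.e. a member of C_b(R).
    return np.tanh(x)

# P_n = N(1/n, 1) converges weakly to P = N(0, 1), so we expect
# int f dP_n -> int f dP as n -> infinity.
m = 10**6  # Monte Carlo sample size
target = f(rng.standard_normal(m)).mean()  # estimate of int f dP
for n in [1, 10, 100, 1000]:
    est = f(1.0 / n + rng.standard_normal(m)).mean()  # estimate of int f dP_n
    print(f"n={n:5d}  int f dP_n ≈ {est:+.4f}   int f dP ≈ {target:+.4f}")
```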

How can one prove this statement?

Best Answer

Let $S$ be a metric space with distance $\mathrm d$, and let $\xi_n$, $\xi$ be $S$-valued random variables such that $\lim_{n\rightarrow\infty}\mathbb P(\mathrm d(\xi_n,\xi)\ge\varepsilon)=0$ for all $\varepsilon>0$. Let $f:S\rightarrow\mathbb R$ be bounded and $L$-Lipschitz. Then, for every $\varepsilon>0$, $$\begin{aligned} |\mathbb E[f(\xi_n)]-\mathbb E[f(\xi)]| &\le\mathbb E[|f(\xi_n)-f(\xi)|]\\ &=\mathbb E[\mathbf 1\{\mathrm d(\xi_n,\xi)\le\varepsilon\}|f(\xi_n)-f(\xi)|] +\mathbb E[\mathbf 1\{\mathrm d(\xi_n,\xi)>\varepsilon\}|f(\xi_n)-f(\xi)|]\\ &\le \mathbb E[\mathbf 1\{\mathrm d(\xi_n,\xi)\le\varepsilon\}|f(\xi_n)-f(\xi)|]+2\|f\|_\infty\mathbb P(\mathrm d(\xi_n,\xi)>\varepsilon)\\ &\le \mathbb E[\mathbf 1\{\mathrm d(\xi_n,\xi)\le\varepsilon\}\,L\,\mathrm d(\xi_n,\xi)]+2\|f\|_\infty\mathbb P(\mathrm d(\xi_n,\xi)>\varepsilon)\\ &\le L\varepsilon\,\mathbb P(\mathrm d(\xi_n,\xi)\le\varepsilon)+2\|f\|_\infty\mathbb P(\mathrm d(\xi_n,\xi)>\varepsilon), \end{aligned}$$ using Jensen's inequality with $|\cdot|$, the triangle inequality to obtain $2\|f\|_\infty$, Lipschitz continuity, and the bound $\mathrm d(\xi_n,\xi)\le\varepsilon$ on the first event. Letting $n\to\infty$ and using convergence in probability gives $$\limsup_{n\rightarrow\infty}|\mathbb E[f(\xi_n)]-\mathbb E[f(\xi)]|\le L\varepsilon.$$ Letting $\varepsilon\to 0$ completes the proof.
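(If it helps to see the chain of inequalities numerically, here is a hedged sketch; the coupling $\xi_n=\xi+0.05\,Z$ with standard normals $\xi$, $Z$, the function $f=\tanh$, which is $1$-Lipschitz with $\|f\|_\infty=1$, and $\varepsilon=0.1$ are all my own illustrative choices.)

```python
import numpy as np

rng = np.random.default_rng(1)

# f = tanh is 1-Lipschitz with ||f||_inf = 1 (illustrative choice).
L, f_inf = 1.0, 1.0
f = np.tanh

m = 10**6
xi = rng.standard_normal(m)                 # samples of xi
xi_n = xi + 0.05 * rng.standard_normal(m)   # xi_n close to xi in probability
d = np.abs(xi_n - xi)                       # d(xi_n, xi) on the real line

# Check the final bound from the display above:
# |E f(xi_n) - E f(xi)| <= L*eps*P(d <= eps) + 2*||f||_inf*P(d > eps).
eps = 0.1
lhs = abs(f(xi_n).mean() - f(xi).mean())
rhs = L * eps * (d <= eps).mean() + 2 * f_inf * (d > eps).mean()
print(f"|E f(xi_n) - E f(xi)| ≈ {lhs:.5f}  <=  bound ≈ {rhs:.5f}")
```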

EDIT: We provide a few more details. The triangle inequality yields $\mathbb E[\mathbf 1\{\mathrm d(\xi_n,\xi)>\varepsilon\}|f(\xi_n)-f(\xi)|]\le\mathbb E[\mathbf 1\{\mathrm d(\xi_n,\xi)>\varepsilon\}(|f(\xi_n)|+|f(\xi)|)]$, where we recall that $|x-y|=|x-0+0-y|\le|x-0|+|0-y|=|x|+|y|$. Next, we use $|f(x)|\le\sup_{y}|f(y)|=\|f\|_\infty$, i.e. the absolute value of $f$ is bounded by its uniform (or $\infty$-) norm. Finally, we have $$\begin{aligned} \limsup_{n\rightarrow\infty}|\mathbb E[f(\xi_n)]-\mathbb E[f(\xi)]| &\le\limsup_{n\rightarrow\infty} \bigl(L\varepsilon\,\mathbb P(\mathrm d(\xi_n,\xi)\le\varepsilon)+2\|f\|_\infty\mathbb P(\mathrm d(\xi_n,\xi)>\varepsilon)\bigr)\\ &= L\varepsilon\Bigl(1-\lim_{n\rightarrow\infty}\mathbb P(\mathrm d(\xi_n,\xi)>\varepsilon)\Bigr)+2\|f\|_\infty\lim_{n\rightarrow\infty}\mathbb P(\mathrm d(\xi_n,\xi)>\varepsilon)\\ &=L\varepsilon \end{aligned}$$ by the assumption. Since $\varepsilon>0$ was arbitrary and the left-hand side does not depend on $\varepsilon$, we may let $\varepsilon\to 0$ on both sides to obtain $$\limsup_{n\rightarrow\infty}|\mathbb E[f(\xi_n)]-\mathbb E[f(\xi)]| \le\lim_{\varepsilon\rightarrow 0}L\varepsilon=0.$$ This implies that $\lim_{n\rightarrow\infty}\mathbb E[f(\xi_n)]=\mathbb E[f(\xi)]$. Since this holds for every bounded Lipschitz continuous function $f$, the portmanteau theorem yields weak convergence of the laws, $P_{\xi_n}\Rightarrow P_\xi$ (equivalently, convergence in distribution of the random variables).
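(To see the two-step limit at work, here is one more hedged numerical sketch; the coupling $\xi_n=\xi+Z/n$ with standard normals $\xi$, $Z$ and $f=\tanh$ are again my own choices. For each fixed $\varepsilon$ the estimated bound settles near $L\varepsilon$ as $n$ grows, and shrinking $\varepsilon$ drives it toward $0$.)

```python
import numpy as np

rng = np.random.default_rng(2)
f, L, f_inf = np.tanh, 1.0, 1.0
m = 10**6
xi = rng.standard_normal(m)

# For fixed eps, the bound L*eps*P(d <= eps) + 2*||f||_inf*P(d > eps)
# tends to L*eps as n -> infinity; letting eps -> 0 then kills it.
for eps in [0.5, 0.1, 0.02]:
    for n in [2, 10, 100]:
        xi_n = xi + rng.standard_normal(m) / n   # xi_n -> xi in probability
        d = np.abs(xi_n - xi)
        bound = L * eps * (d <= eps).mean() + 2 * f_inf * (d > eps).mean()
        gap = abs(f(xi_n).mean() - f(xi).mean())
        print(f"eps={eps:4.2f} n={n:4d}  gap ≈ {gap:.5f}  "
              f"bound ≈ {bound:.5f}  L*eps = {L * eps:.2f}")
```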
