Showing that a sequence of random variables converges in probability to a constant $a$

convergence-divergence, probability, random-variables

Suppose that the sequence of random variables $X_1, X_2, \ldots$ converges in probability to the constant $a$, and the function $f(x)$ (defined on $\mathbb{R}$) is continuous at $a$. Is the sequence of random variables $$(f(X_n))_{n=1}^\infty$$ convergent in probability? If yes, what is its limit?

This problem is asking me to check convergence of a sequence of random variables. We know that, for convergence, we need to check that
$$
\lim_{n \to \infty} F_{X_n}(x) = F_X(x)
$$

for all $x$ at which $F_X$ is continuous, right?

What I don't understand is how to show convergence when the limit is a constant $a$ rather than a random variable.

Can someone help me find the right approach to this problem? I appreciate your support and help.

Best Answer

You are mistaken about the definition of convergence in probability. The one you give is the definition of convergence in distribution.

A sequence $(X_n)_n$ of random variables converges in probability to a random variable $X$, iff for all $\varepsilon>0$

$$\lim_{n\to\infty}P\big(|X_n-X| > \varepsilon\big) = 0.$$
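For intuition, here is a small numerical sketch of this definition (the particular sequence, constant, and tolerance are illustrative choices, not part of the original question): take $X_n = a + Z_n/\sqrt{n}$ with standard normal $Z_n$, so that $X_n \to a$ in probability, and estimate $P(|X_n - a| > \varepsilon)$ by Monte Carlo for increasing $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
a, eps, trials = 2.0, 0.1, 100_000

# X_n = a + Z/sqrt(n) with Z ~ N(0, 1), so X_n -> a in probability.
for n in [1, 10, 100, 1_000, 10_000]:
    x_n = a + rng.standard_normal(trials) / np.sqrt(n)
    prob = np.mean(np.abs(x_n - a) > eps)  # Monte Carlo estimate of P(|X_n - a| > eps)
    print(f"n = {n:>6}:  P(|X_n - a| > {eps}) ~ {prob:.4f}")
```

The estimated probabilities shrink toward $0$ as $n$ grows, which is exactly what the definition requires.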

Convergence in probability implies convergence in distribution. The reverse implication does not hold in general, but it does hold when the limit is almost surely constant, which is the case in your example. So you could argue in this specific example via convergence in distribution, but it is not really necessary; a more direct approach follows below.
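To see why the reverse implication fails in general, a standard counterexample is $X \sim N(0,1)$ and $X_n := -X$ for every $n$: each $X_n$ has the same distribution as $X$, so $X_n \to X$ in distribution, yet $P(|X_n - X| > \varepsilon) = P(2|X| > \varepsilon)$ does not tend to $0$. A small simulation sketch of this (the counterexample is standard, but it is not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(1)
eps, trials = 0.5, 100_000

x = rng.standard_normal(trials)  # sample of X ~ N(0, 1)
x_n = -x                         # X_n = -X has the same distribution as X for every n ...
# ... but |X_n - X| = 2|X|, so the probability below does not shrink as n grows.
print("P(|X_n - X| > eps) ~", np.mean(np.abs(x_n - x) > eps))
```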

For the direct approach, we want to show that $f(X_n)\to f(a)$ in probability. Let $\varepsilon>0$ and $k\in\mathbb{N}$ be arbitrary. Define $$A_k:=\{x\in\mathbb{R}\mid \lvert x-a \rvert <1/k \;\land\; \lvert f(x)-f(a) \rvert >\varepsilon \,\}.$$ (Intuitively, $A_k$ is the set of points within $1/k$ of $a$ whose image under $f$ is still far from $f(a)$, i.e. the set on which $k$ is "too small to achieve that $f(x)$ and $f(a)$ are close enough".) By continuity of $f$ at $a$, there is a $\delta>0$ such that $\lvert x-a\rvert<\delta$ implies $\lvert f(x)-f(a)\rvert\le\varepsilon$, so $A_k=\emptyset$ whenever $1/k\le\delta$. Note that $$\{\lvert f(X_n)-f(a)\rvert >\varepsilon\}\subseteq\{\lvert X_n-a\rvert \geq 1/k\}\mathbin{\dot{\cup}}\{X_n\in A_k\},$$ since on the event $\{\lvert f(X_n)-f(a)\rvert>\varepsilon\}$ either $\lvert X_n-a\rvert\geq 1/k$, or $\lvert X_n-a\rvert<1/k$ and hence $X_n\in A_k$. Now fix $k$ large enough that $A_k=\emptyset$. Then $$P\big(\lvert f(X_n)-f(a)\rvert >\varepsilon\big)\leq P\big(\lvert X_n-a\rvert \geq 1/k\big)+P(X_n\in A_k)=P\big(\lvert X_n-a\rvert \geq 1/k\big)\longrightarrow 0$$ as $n\to \infty$, where we have used the hypothesis that $X_n\to a$ in probability. Since $\varepsilon>0$ was arbitrary, $f(X_n)\to f(a)$ in probability.
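As a quick sanity check on the conclusion (again just an illustrative sketch, with choices of $f$, $a$, and $\varepsilon$ that are not part of the original answer): reusing $X_n = a + Z_n/\sqrt{n}$ from above and the continuous function $f(x) = x^2$, the estimated probability $P(|f(X_n)-f(a)|>\varepsilon)$ also shrinks toward $0$.

```python
import numpy as np

rng = np.random.default_rng(2)
a, eps, trials = 2.0, 0.1, 100_000
f = lambda x: x ** 2  # any function continuous at a would work here

for n in [1, 10, 100, 1_000, 10_000]:
    x_n = a + rng.standard_normal(trials) / np.sqrt(n)
    prob = np.mean(np.abs(f(x_n) - f(a)) > eps)  # estimate of P(|f(X_n) - f(a)| > eps)
    print(f"n = {n:>6}:  P(|f(X_n) - f(a)| > {eps}) ~ {prob:.4f}")
```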

You might find it interesting that similar results hold in much greater generality; see the continuous mapping theorem.
