Convergence in probability: $X_n \overset{P}{\rightarrow} X,\ Y_n \overset{P}{\rightarrow} Y \implies X_n Y_n \overset{P}{\rightarrow} XY$

probability theory

According to the Wikipedia article on convergence of random variables, the above property holds. How can we prove it?

I guess one way would be to use a second property stated in the article, namely,
$$X_n \overset{P}{\rightarrow} X,\ Y_n \overset{P}{\rightarrow} Y \implies (X_n, Y_n) \overset{P}{\rightarrow} (X, Y)$$
and then apply the continuous mapping theorem with $g: (x, y) \mapsto xy$. However, the proof of this second property requires a definition of convergence in probability in two dimensions and additional properties of the distance metrics involved.

Can we prove $X_n \overset{P}{\rightarrow} X,\ Y_n \overset{P}{\rightarrow} Y \implies X_n Y_n \overset{P}{\rightarrow} XY$ only using the one-dimensional random variables $X_n, Y_n, X, Y$?

Best Answer

One way I like is to use a subsequence argument.

$X_{n}\xrightarrow{P}X$ if and only if for each subsequence $X_{n_{k}}$ there exists a further subsequence $X_{n_{k_{l}}}\xrightarrow{P}X$.

(The above criterion also holds for convergence in distribution, i.e. weak convergence, and is often used in functional analysis. Note, however, that this subsequence criterion is NOT sufficient for almost sure convergence. For pointwise convergence of real sequences you have surely seen it in real analysis, and indeed the proof of the fact above is essentially the same as the proof for real sequences.)
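
To see why the criterion fails for almost sure convergence, here is the standard "typewriter" counterexample (a standard example, added here for illustration): on $([0,1],\mathcal{B},\lambda)$, let $X_{n}$ run through the indicator functions
$$\mathbf{1}_{[0,1]},\ \mathbf{1}_{[0,\frac{1}{2}]},\ \mathbf{1}_{[\frac{1}{2},1]},\ \mathbf{1}_{[0,\frac{1}{4}]},\ \mathbf{1}_{[\frac{1}{4},\frac{1}{2}]},\ \dots$$
so that the $k$-th block consists of the $2^{k}$ dyadic intervals of length $2^{-k}$. Then $P(|X_{n}|>\epsilon)\to 0$, so $X_{n}\xrightarrow{P}0$ and every subsequence has a further subsequence converging almost surely to $0$; yet $\limsup_{n} X_{n}(\omega)=1$ for every $\omega$, so $X_{n}$ does not converge almost surely.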

We can use this in the following way.

Let $X_{n_{k}}Y_{n_{k}}$ be an arbitrary subsequence of $X_{n}Y_{n}$. Now $X_{n_{k}}\xrightarrow{P} X$ and $Y_{n_{k}}\xrightarrow{P} Y$, and hence there exist subsequences of $X_{n_{k}}$ and $Y_{n_{k}}$ which converge almost surely to $X$ and $Y$. So we can choose a common subsequence $n_{k_{m}}$ of $n_{k}$ such that $X_{n_{k_{m}}}\xrightarrow{a.s.}X$ and $Y_{n_{k_{m}}}\xrightarrow{a.s.} Y$: first pass to a subsequence of $n_{k}$ along which $X$ converges almost surely, then to a further subsequence along which $Y$ converges almost surely as well.
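
(The almost surely convergent subsequences exist because convergence in probability always admits an a.s. convergent subsequence. A quick sketch via Borel–Cantelli, since this step is not spelled out above: choose indices $m_{j}$ with $P(|X_{m_{j}}-X|>2^{-j})\leq 2^{-j}$. Since $\sum_{j}2^{-j}<\infty$, the Borel–Cantelli lemma gives that almost surely $|X_{m_{j}}-X|\leq 2^{-j}$ for all but finitely many $j$, and hence $X_{m_{j}}\xrightarrow{a.s.}X$.)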

Hence $X_{n_{k_{m}}}Y_{n_{k_{m}}}\xrightarrow{a.s.}XY$, and therefore $X_{n_{k_{m}}}Y_{n_{k_{m}}}\xrightarrow{P}XY$.
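
(The last implication uses the standard fact that almost sure convergence implies convergence in probability: if $Z_{m}\to Z$ a.s., then $\mathbf{1}_{\{|Z_{m}-Z|>\epsilon\}}\to 0$ a.s., so $P(|Z_{m}-Z|>\epsilon)=E\left[\mathbf{1}_{\{|Z_{m}-Z|>\epsilon\}}\right]\to 0$ by dominated convergence.)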

Thus each subsequence of $X_{n}Y_{n}$ has a further subsequence that converges in probability to $XY$, and thus the whole sequence converges in probability to $XY$.

Proof of the claim I am using.

One direction is clear: convergence in probability implies convergence in probability along every subsequence.

For the converse, assume that $X_{n}$ does not converge in probability to $X$. Then there exists a subsequence $X_{n_{k}}$ such that $P(|X_{n_{k}}-X|>\epsilon)\geq \delta_{0}$ for some $\epsilon>0$, some $\delta_{0}>0$, and all $k\geq 1$.

By hypothesis, this subsequence has a further subsequence $X_{n_{k_{l}}}\xrightarrow{P}X$, and hence $P(|X_{n_{k_{l}}}-X|>\epsilon)\xrightarrow{l\to\infty} 0$. But this is impossible, since $P(|X_{n_{k}}-X|>\epsilon)\geq \delta_{0}$ for all $k\geq 1$.

Hence, by contradiction, $X_{n}\xrightarrow{P}X$.