[Math] Convergence in probability of product and division of two random variables

Tags: convergence-divergence, probability-theory, random-variables

How can I prove the following:

Let $(X_n)_{n \in \mathbb N}$ and $(Y_n)_{n \in \mathbb N}$ be sequences of random variables, and let $X$ and $Y$ be random variables, all defined on the probability space $(\Omega, \mathcal F, \mathbb P)$. Assume that $X_n$ converges in probability to $X$ and $Y_n$ to $Y$ (also in probability). Then:

If $Y_n \ne 0$ for all $n$ and $Y \ne 0$ almost surely, then $X_n/Y_n$ converges in probability to $X/Y$.

I tried the following:

Let $\epsilon > 0$. We need to show that
\begin{align*}
\mathbb P(|X_n/Y_n - X/Y| > \epsilon) \xrightarrow{n \to \infty} 0.
\end{align*}
Define $Z_n := 1/Y_n$ and $Z := 1/Y$. Both are well defined almost surely, since $Y_n, Y \ne 0$ a.s. We have
\begin{align*}
\mathbb P(|Z_n-Z| > \epsilon) &= \mathbb P(|1/Y_n - 1/Y| > \epsilon) \\
&= \mathbb P(|Y-Y_n|/|Y_nY| > \epsilon) \\
&= \mathbb P(|Y_n-Y| > \epsilon |Y_n||Y|) \\
&\le \mathbb P(|Y_n-Y| > \epsilon (|Y|-|Y_n-Y|)|Y|) \quad \text{(since $|Y_n| \ge |Y|-|Y_n-Y|$)} \\
&= \mathbb P(|Y_n-Y| > \epsilon (|Y|-|Y_n-Y|)|Y|, |Y_n-Y| \le |Y|/2) \\
&\quad + \mathbb P(|Y_n-Y| > \epsilon (|Y|-|Y_n-Y|)|Y|, |Y_n-Y| > |Y|/2) \\
&\le \mathbb P(|Y_n-Y| > \epsilon Y^2/2) + \mathbb P(|Y_n-Y| > |Y|/2).
\end{align*}
Now for any $A > 0$
\begin{align*}
\mathbb P(|Y_n-Y| > \epsilon Y^2/2)
&= \mathbb P(|Y_n-Y| > \epsilon Y^2/2, |Y| \ge 1/A) + \mathbb P(|Y_n-Y| > \epsilon Y^2/2, |Y| < 1/A) \\
&\le \mathbb P\left(|Y_n-Y| > \frac{\epsilon}{2A^2}\right) + \mathbb P(|Y| < 1/A)
\xrightarrow{n \to \infty} \mathbb P(|Y| < 1/A)
\end{align*}
and similarly
\begin{align*}
\mathbb P(|Y_n-Y| > |Y|/2) &= \mathbb P(|Y_n-Y| > |Y|/2, |Y| \ge 1/A) + \mathbb P(|Y_n-Y| > |Y|/2, |Y| < 1/A) \\
&\le \mathbb P\left(|Y_n-Y| > \frac{1}{2A}\right) + \mathbb P(|Y| < 1/A)
\xrightarrow{n \to \infty} \mathbb P(|Y| < 1/A).
\end{align*}
Since $\mathbb P(|Y| < 1/A) \downarrow \mathbb P(Y = 0) = 0$ as $A \to \infty$, and $A > 0$ was arbitrary, we obtain
\begin{align*}
\mathbb P(|Z_n-Z| > \epsilon) \xrightarrow{n \to \infty} 0.
\end{align*}
The result follows since we know that $X_nZ_n$ converges to $XZ$ in probability.

Edit: Is the proof complete now? Is there perhaps a shorter proof? Maybe my estimates are too long, or there is a shorter way to do it.
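As a sanity check (not a substitute for the proof), the claim can be illustrated numerically. The sketch below uses a hypothetical concrete example: $Y = 2 + |N(0,1)|$ is bounded away from $0$, and $X_n$, $Y_n$ are obtained by adding noise of scale $1/\sqrt{n}$, so that $X_n \to X$ and $Y_n \to Y$ in probability; the function `empirical_prob` is an invented name for the Monte Carlo estimate of $\mathbb P(|X_n/Y_n - X/Y| > \epsilon)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_prob(n, eps=0.1, samples=100_000):
    """Monte Carlo estimate of P(|X_n/Y_n - X/Y| > eps) for a toy example."""
    X = rng.normal(size=samples)                  # X ~ N(0, 1)
    Y = 2.0 + np.abs(rng.normal(size=samples))    # Y >= 2, so Y != 0 a.s.
    # X_n = X + noise, Y_n = Y + noise, with noise of scale 1/sqrt(n),
    # so X_n -> X and Y_n -> Y in probability as n grows.
    Xn = X + rng.normal(scale=1 / np.sqrt(n), size=samples)
    Yn = Y + rng.normal(scale=1 / np.sqrt(n), size=samples)
    return np.mean(np.abs(Xn / Yn - X / Y) > eps)

# The estimated probability should shrink toward 0 as n increases.
for n in [1, 10, 100, 1000]:
    print(n, empirical_prob(n))
```

This only checks one particular pair of sequences, of course; the proof above is what covers the general case.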

Best Answer

Isn't this a consequence of the continuous mapping theorem?

Since $Y$ is nonzero almost surely, the function $g(x) = 1/x$ satisfies the condition $\mathbb P[Y \in C^c(g)] = 0$, where $C^c(g)$ denotes the set of discontinuity points of $g$ (which in this case is $\{0\}$). Hence the continuous mapping theorem gives $\frac{1}{Y_n} \to \frac{1}{Y}$ in probability.

Since $X_n \to X$ in probability and $Z_n \to Z$ in probability together imply $X_n Z_n \to XZ$ in probability (this is proved in Resnick's "A Probability Path" using subsequences), the result follows by taking $Z_n = \frac{1}{Y_n}$ and $Z = \frac{1}{Y}$.
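For completeness, here is a sketch of the direct estimate behind that product step (the standard argument; the subsequence proof in Resnick is an alternative route). From
\begin{align*}
|X_n Z_n - XZ| \le |Z_n|\,|X_n - X| + |X|\,|Z_n - Z|
\end{align*}
one gets, for any $\epsilon > 0$ and $M > 1$,
\begin{align*}
\mathbb P(|X_n Z_n - XZ| > \epsilon)
&\le \mathbb P\left(|X_n - X| > \frac{\epsilon}{2M}\right) + \mathbb P(|Z_n - Z| > 1) + \mathbb P(|Z| > M-1) \\
&\quad + \mathbb P\left(|Z_n - Z| > \frac{\epsilon}{2M}\right) + \mathbb P(|X| > M),
\end{align*}
where the event $\{|Z_n| > M\}$ was absorbed into $\{|Z_n - Z| > 1\} \cup \{|Z| > M-1\}$. Letting $n \to \infty$ removes the first, second, and fourth terms; letting $M \to \infty$ removes the remaining two.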