Probability Theory – Convergence of a Sequence of Random Variables and Cauchy Sequences

convergence-divergence, probability-theory, random-variables

Let $(X_n)$ be a sequence of real random variables on $(\Omega,\mathcal A,\mathbb P)$. Then 1. and 2. are equivalent:

  1. There exists a random variable $X$ such that $X_n\to X$ $P$-almost surely as $n\to \infty$.
  2. $\sup_{m>n} |X_m-X_n|\to 0$ in probability as $n\to\infty$.

I tried showing $1.\Rightarrow 2.$:

I know that $X_n\to X$ $P$-almost surely means that $$P(\lim_{n\to\infty} X_n=X)=1,$$ or equivalently $$(*)\quad \lim_{n\to\infty} P(\sup_{m\geq n} |X_m-X|\geq\varepsilon)=0\quad\forall\varepsilon>0.$$
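
(For reference, one standard way to see this equivalence is via countable set operations and continuity of measure: $$\Big\{\lim_{n\to\infty} X_n=X\Big\}=\bigcap_{k\geq 1}\bigcup_{n\geq 1}\bigcap_{m\geq n}\Big\{|X_m-X|\leq\tfrac1k\Big\},$$ and the inner intersections increase in $n$, so by continuity from below $P(\lim_{n\to\infty} X_n=X)=1$ holds iff $\lim_{n\to\infty}P(\sup_{m\geq n}|X_m-X|>\tfrac1k)=0$ for every $k$, and hence for every $\varepsilon>0$, which is $(*)$.)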

$\sup_{m>n} |X_m-X_n|\to 0$ in probability means $$\lim_{n\to\infty} P(\sup_{m>n}|X_m-X_n|\geq\varepsilon)=0\quad\forall\varepsilon>0.$$

This looks like a Cauchy condition in probability, but I don't see how to deduce it from $(*)$.

How can I proceed with this proof? Thanks for any input!

Best Answer

For the first part ($1.\Rightarrow 2.$), use that almost sure convergence implies convergence in probability: by the triangle inequality, we are reduced to showing that $\sup_{m\geqslant n}|X_m-X|\to 0$ in probability, and in this case we actually have almost sure convergence; see the sketch below.
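
In more detail, here is a sketch of that reduction; only the triangle inequality is used. For $m>n$, $$|X_m-X_n|\leqslant |X_m-X|+|X-X_n|\leqslant 2\sup_{k\geqslant n}|X_k-X|,$$ hence $\sup_{m>n}|X_m-X_n|\leqslant 2\sup_{m\geqslant n}|X_m-X|=:2Y_n$. On the set where $X_n\to X$, we have $Y_n\downarrow 0$, so $Y_n\to 0$ almost surely, thus in probability, and $2.$ follows.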

Conversely, assume that $\mathbb P(\sup_{m\geqslant n}|X_m-X_n|\gt\varepsilon)\to 0$ for each positive $\varepsilon$. In particular, $(X_n)_n$ is Cauchy in probability, namely, $$\forall \varepsilon,\delta\gt 0,\ \exists n_0\mbox{ such that if }m,n\geqslant n_0\mbox{, then }\mathbb P(|X_m-X_n|\gt\varepsilon)\lt\delta.$$ Taking $\delta$ of the form $2^{-k}$, we can extract a subsequence converging almost surely to some random variable $X$; then, using the Cauchy property, we can prove that $X_n\to X$ in probability (both steps are sketched below).

Now we go back to the assumption: combined with $X_n\to X$ in probability and the triangle inequality, it gives $\mathbb P(\sup_{m\geqslant n}|X_m-X|\gt \varepsilon)\to 0$ for each positive $\varepsilon$. Define $Y_n:=\sup_{m\geqslant n}|X_m-X|$: the sequence $(Y_n)_n$ is non-increasing and goes to $0$ in probability, hence almost surely, which gives $1.$
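
For completeness, here are sketches of the steps above. Extraction: choose $n_1\lt n_2\lt\cdots$ such that $\mathbb P(|X_m-X_{n_k}|\gt 2^{-k})\lt 2^{-k}$ for all $m\geqslant n_k$. Then $\sum_k\mathbb P(|X_{n_{k+1}}-X_{n_k}|\gt 2^{-k})\lt\infty$, so by the Borel–Cantelli lemma, almost surely $|X_{n_{k+1}}-X_{n_k}|\leqslant 2^{-k}$ for all but finitely many $k$; hence $(X_{n_k})_k$ is almost surely Cauchy and converges to some random variable $X$. Convergence in probability of the whole sequence: $$\mathbb P(|X_n-X|\gt\varepsilon)\leqslant\mathbb P(|X_n-X_{n_k}|\gt\tfrac\varepsilon2)+\mathbb P(|X_{n_k}-X|\gt\tfrac\varepsilon2),$$ and both terms are small once $n$ and $n_k$ are large. Monotone upgrade: if $Y_n\downarrow Y_\infty$ pointwise, then $\mathbb P(Y_\infty\gt\varepsilon)\leqslant\mathbb P(Y_n\gt\varepsilon)\to 0$ for every $\varepsilon\gt 0$, so $Y_\infty=0$ almost surely.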