Equivalence between almost sure convergence and convergence in probability

Tags: convergence-divergence, probability-theory

Here is an exercise from Resnick's A Probability Path (6.17):
"In a discrete probability space, convergence in probability and almost sure convergence are equivalent."

I'm not sure I understand the meaning of the word 'equivalent' here (it sounds like '=' to me). I believe there are only two possible ways to prove it:

  1. Using the fact that $X_n \to X$ i.p. iff every subsequence $\{X_{n_k}\}$ contains a further subsequence converging to $X$ a.s.;

  2. Showing that $P[(|X_n-X|>\epsilon)\ \text{i.o.}]= \lim_{n\to \infty} P[|X_n-X|>\epsilon]=0$ (though Fatou's lemma gives an inequality in the wrong direction here; see the display after this list).
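
(For reference, the set form of Fatou's lemma only gives
$$P\Big[\limsup_{n\to\infty}\,\{|X_n-X|>\epsilon\}\Big]\;\ge\;\limsup_{n\to\infty}\,P[|X_n-X|>\epsilon],$$
so knowing $P[|X_n-X|>\epsilon]\to 0$ only controls the right-hand side; it says nothing about the i.o. probability on the left.)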

However, neither of the two seems possible to me, so I have no idea how to proceed.

Any help would be very much appreciated. Many thanks!

Best Answer

Equivalence of certain conditions means that they imply each other. The direction "almost sure convergence $\Rightarrow$ convergence in probability" is always true, so only the other direction has to be proven.
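
For completeness, here is one standard way to see that easy direction, sketched via continuity of measure from above: the events $B_n=\bigcup_{m\ge n}\{|X_m-X|>\varepsilon\}$ decrease to the event $\{|X_n-X|>\varepsilon\ \text{i.o.}\}$, which is null under almost sure convergence, so
$$P[|X_n-X|>\varepsilon]\;\le\;P(B_n)\;\longrightarrow\;P\Big[\bigcap_{n\ge 1}B_n\Big]\;=\;P[|X_n-X|>\varepsilon\ \text{i.o.}]\;=\;0.$$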

In a discrete probability space, almost sure convergence amounts to pointwise convergence at every singleton of non-zero measure, because every subset of a discrete probability space is a countable union of singletons. So let $\omega$ be an element of the probability space with $P(\{\omega\})=p>0$. If $X_n(\omega)$ did not converge to $X(\omega)$, there would be an $\varepsilon>0$ such that $\vert X(\omega)-X_n(\omega)\vert>\varepsilon$ for infinitely many $n\in\mathbb N$. What does this mean for the probability $P(\vert X-X_n\vert>\varepsilon)$, and thus for convergence in probability?
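
To spell the contradiction out: for such an $\omega$ and $\varepsilon$, the event $\{\vert X-X_n\vert>\varepsilon\}$ contains $\{\omega\}$ for infinitely many $n$, so
$$P(\vert X-X_n\vert>\varepsilon)\;\ge\;P(\{\omega\})\;=\;p\;>\;0\quad\text{for infinitely many }n,$$
and $P(\vert X-X_n\vert>\varepsilon)$ cannot tend to $0$. Hence convergence in probability forces $X_n(\omega)\to X(\omega)$ at every point of positive mass, i.e. almost surely.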