Convergence in probability implies convergence almost surely

convergence-divergence, probability-theory

Let $(X_n)$ be a sequence of independent and positive random variables. Let $S_n = X_1 + \dots + X_n$. Show that $S_n$ converges almost surely if and only if it converges in probability.

The first implication (almost sure convergence implies convergence in probability) is well known. But the converse is nontrivial, since this exercise is an instance of a statement that is false in general: for example, the "typewriter" sequence of indicators of the intervals $[j2^{-k}, (j+1)2^{-k}]$ on $[0,1]$ converges to $0$ in probability but converges at no point.

The way I think I should organize my proof is as follows:

  1. Define $A_k = \{\lvert S_k - S \rvert \geq \epsilon \}$, where $S$ denotes the limit in probability.
  2. Majorize $\mathbb{P}(\lvert S_k - S \rvert \geq \epsilon)$ using Markov's inequality.
  3. Show that the resulting series of upper bounds converges.
  4. Conclude that $\sum_k \mathbb{P}(A_k) < \infty$.
  5. Conclude by the Borel-Cantelli lemma (stated below for reference) that $\mathbb{P}(\limsup_k A_k) = 0$.

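For reference, the first Borel-Cantelli lemma invoked in step 5 states that if $\sum_k \mathbb{P}(A_k) < \infty$, then

$$\mathbb{P}\Big(\limsup_{k \to \infty} A_k\Big) = 0,$$

i.e., almost surely only finitely many of the events $A_k$ occur.
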
However, I am stuck at:

$$\mathbb{P}(\lvert S_k - S \rvert \geq \epsilon) \leq \frac{\mathbb{E}[\lvert S_k - S \rvert]}{\epsilon}.$$

So far I haven't used the fact that $S_n = X_1 + \dots + X_n$, nor that $(X_n)$ is a sequence of positive random variables.

Best Answer

Independence is not required at all. Since the $X_n$ are positive, $(S_n(\omega))$ is a nondecreasing sequence of real numbers for every $\omega$, so $\lim S_n$ exists a.s. (the limit may be $\infty$). But convergence in probability implies convergence a.s. along a subsequence, from which it follows that $\lim S_n$ is finite a.s.
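
To spell out that last step (a brief sketch under the assumptions above, writing $S$ for the limit in probability and $L$ for the pointwise monotone limit): for every $\omega$,

$$S_n(\omega) \uparrow L(\omega) \in [0, \infty],$$

and since $S_n \to S$ in probability, there is a subsequence with $S_{n_k} \to S$ a.s., where $S$ is finite a.s. But $S_{n_k}(\omega) \to L(\omega)$ for every $\omega$, so $L = S$ a.s. Hence $L$ is finite a.s., and the full monotone sequence $(S_n)$ converges a.s. to $S$.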