[Math] Divergent series of random variables

convergence-divergence, probability-theory, random-variables

I've been trying to prove that given a sequence of independent, identically distributed random variables $\{X_n\}_{n \in \mathbb{N}}$ such that $P(X_1 \neq 0)>0$ (and hence $P(X_i \neq 0) > 0$ for all $i$), the series $$\sum_{n \in \mathbb{N}} X_n \ \ \text{is divergent almost surely}.$$

That is, writing $S_n = X_1 + \dots + X_n$, I need to prove that $$P(\omega \mid \exists \varepsilon>0 : \forall N \in \mathbb{N} \ \exists m, n \ge N : |S_m(\omega) - S_n(\omega)| > \varepsilon) = 1,$$ i.e. that almost surely the partial sums fail the Cauchy criterion.

We are dealing with a series of random variables, so I thought I could use the Borel–Cantelli lemma. The series $\sum_{n \in \mathbb{N}} P(X_n \neq 0)$ is divergent, because the variables are identically distributed, so $P(X_n \neq 0) = P(X_1 \neq 0) = c > 0$ for all $n \in \mathbb{N}$.

But I don't think it is helpful, because by Markov's inequality we also have, for $a > 0$:

$$\mathbb P (|X| \ge a) \le \frac{\mathbb E(|X|)}{a}.$$

So it would seem that the series is divergent in $L^1.$ But why is it divergent almost surely?

Could I use the law of large numbers? If so, how? Assuming additionally that $\mathbb{E}|X_1| < \infty$, the strong law gives:

$$\frac{S_n}{n} = \frac{X_1 + \dots + X_n}{n} \to \mathbb{E}X_1 \ \ \text{a.s.}$$

But does that imply that the series is divergent?
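
As a sanity check (not a proof), here is a quick simulation sketch, assuming NumPy; the mean-zero Rademacher distribution is my own illustrative choice. It shows why the LLN alone cannot settle the question: the averages $S_n/n$ settle near $\mathbb{E}X_1 = 0$, while consecutive partial sums always differ by $1$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rademacher variables: P(X = 1) = P(X = -1) = 1/2, so E[X_1] = 0.
n = 100_000
x = rng.choice([-1, 1], size=n)
s = np.cumsum(x)                      # partial sums S_1, ..., S_n
avg = s / np.arange(1, n + 1)         # running averages S_n / n

print("S_n/n at n = 10^5:", avg[-1])  # close to E[X_1] = 0, as the LLN predicts
print("consecutive jumps |S_{n+1} - S_n|:", np.abs(np.diff(s[-5:])))  # always 1
```

Here $S_n/n \to 0$ even though every step moves $S_n$ by exactly $1$; the LLN by itself only decides the case $\mathbb{E}X_1 \neq 0$, where $S_n \approx n\,\mathbb{E}X_1 \to \pm\infty$.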

Could you help me finish that?

Best Answer

Hints:

  1. Since $\mathbb{P}(X_1 \neq 0)>0$, there exists $\epsilon>0$ such that $\mathbb{P}(|X_1|>\epsilon)>0$.
  2. Conclude from $$\sum_{n \geq 1} \mathbb{P}(|X_n|>\epsilon)= \sum_{n \geq 1} \mathbb{P}(|X_1|>\epsilon) =\infty$$ and the Borel-Cantelli lemma that $|X_n(\omega)|>\epsilon$ happens infinitely often for almost all $\omega \in \Omega$.
  3. Deduce that $$\sum_{n \geq 1} X_n $$ is divergent almost surely (recall that if a series converges, its terms must tend to $0$); a numerical sketch follows below.
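
Not a proof, but a minimal numerical sketch of the hint chain (assuming NumPy; the uniform distribution and the threshold $\epsilon = 1/2$ are illustrative choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(1)

# Any i.i.d. sequence with P(X_1 != 0) > 0 works; as a concrete stand-in,
# take X_n uniform on [-1, 1], so P(|X_1| > 1/2) = 1/2 and eps = 1/2 fits hint 1.
eps = 0.5
x = rng.uniform(-1.0, 1.0, size=1_000_000)

hits = np.flatnonzero(np.abs(x) > eps)  # indices n with |X_n| > eps
print("hits so far:", hits.size)        # keeps growing with the sample size
print("largest gap between hits:", np.diff(hits).max())

# Each hit gives |S_n - S_{n-1}| = |X_n| > eps, so the partial sums
# fail the Cauchy criterion along this path, matching hints 2 and 3.
```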