I've been trying to prove that, given a sequence of independent, identically distributed random variables $\{X_n\}_{n \in \mathbb{N}}$ such that $P(X_1 \neq 0)>0$ (and hence $P(X_i \neq 0) > 0$ for all $i$, since the distributions are identical), the series $$\sum_{n \in \mathbb{N}} X_n \ \ \text{is divergent almost surely.}$$
That is, writing $S_n = X_1 + \dots + X_n$, I need to prove that $$P(\{\omega \mid \exists \varepsilon>0 : \forall N \in \mathbb{N} \ \exists m, n \ge N : |S_m(\omega) - S_n(\omega)| > \varepsilon\}) = 1,$$ i.e. that the sequence of partial sums almost surely fails the Cauchy criterion.
We are dealing with a series of random variables, so I thought I could use the second Borel–Cantelli lemma. The series $\sum_{n \in \mathbb{N}} P(X_n \neq 0)$ is divergent, because the variables have identical distributions, so $P(X_n \neq 0) = c > 0$ for every $n \in \mathbb{N}$.
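For concreteness, the Borel–Cantelli step can be written out as follows (a sketch, with $c$ as above):

```latex
% Second Borel--Cantelli: independent events whose probabilities have a
% divergent sum occur infinitely often, almost surely.
\[
  \sum_{n \in \mathbb{N}} P(X_n \neq 0) = \sum_{n \in \mathbb{N}} c = \infty
  \quad\Longrightarrow\quad
  P\Bigl(\limsup_{n \to \infty} \{X_n \neq 0\}\Bigr) = 1,
\]
% i.e. almost every omega satisfies X_n(omega) != 0 for infinitely many n.
```

So independence buys us that, almost surely, infinitely many of the $X_n$ are nonzero.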
But I don't think this is helpful, because by Markov's inequality we also have, for $a > 0$:
$$\mathbb P (|X| \ge a) \le \frac{\mathbb E(|X|)}{a}.$$
So it would seem that the series is divergent in $L^1$. But why is it divergent almost surely?
Could I use the law of large numbers? If so, how? The variables satisfy all necessary conditions for the following to hold:
$$\frac{S_n}{n} = \frac{X_1 + \dots + X_n}{n} \to \mathbb{E}X_1 \ \ \text{a.s.}$$
But does that imply that the series is divergent?
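To see why the LLN alone cannot settle this, here is a quick numerical sketch (my own illustration, not part of the question) with $X_n = \pm 1$ equiprobable: then $\mathbb{E}X_1 = 0$, so the LLN only says $S_n/n \to 0$, yet the partial sums are never Cauchy, because every step moves $S_n$ by exactly $1$.

```python
import random

random.seed(0)

# X_n = +1 or -1 with probability 1/2 each: E[X_1] = 0, but P(X_1 != 0) = 1.
n_steps = 10_000
steps = [random.choice([-1, 1]) for _ in range(n_steps)]

# Partial sums S_1, S_2, ..., S_n.
partial_sums = []
s = 0
for x in steps:
    s += x
    partial_sums.append(s)

# LLN: S_n / n should be close to E[X_1] = 0 for large n.
print("S_n / n =", partial_sums[-1] / n_steps)

# Cauchy criterion fails with eps = 1/2: for every N there are m, n >= N
# with |S_m - S_n| > 1/2, since consecutive partial sums differ by exactly 1.
gaps = [abs(partial_sums[i + 1] - partial_sums[i]) for i in range(n_steps - 1)]
print("min consecutive gap:", min(gaps))  # always 1, so the partial sums never settle
```

So $S_n/n \to 0$ while $S_n$ itself oscillates forever: the LLN is about the averaged sums, not about convergence of the series.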
Could you help me finish the proof?
Best Answer
Hints:
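A sketch of one standard argument (not necessarily the intended hints):

```latex
% Step 1: extract a uniform threshold from P(X_1 != 0) > 0.
Since $P(X_1 \neq 0) > 0$, by continuity of measure there exists
$\varepsilon > 0$ with
\[
  P(|X_1| > \varepsilon) = c' > 0 .
\]
% Step 2: apply the second Borel--Cantelli lemma.
The events $\{|X_n| > \varepsilon\}$ are independent and
$\sum_{n} P(|X_n| > \varepsilon) = \sum_{n} c' = \infty$, so
\[
  P\bigl(|X_n| > \varepsilon \ \text{for infinitely many } n\bigr) = 1 .
\]
% Step 3: a convergent series has terms tending to 0.
If $\sum_{n} X_n(\omega)$ converged, then
$X_n(\omega) = S_n(\omega) - S_{n-1}(\omega) \to 0$; by Step 2 this fails
for almost every $\omega$, so the series diverges almost surely.
```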