$ \frac{X_n}{n}$ does not converge to $0$ almost surely

Tags: borel-cantelli-lemmas, convergence-divergence, probability-theory, self-learning

Suppose that $\sigma_{n}^{2} \geq 0, n \geq 1,$ satisfy $\sum_{n=1}^{\infty} \frac{\sigma_{n}^{2}}{n^{2}}=\infty$ and without loss of generality that $\sigma_{n}^{2} \leq n^{2}$ for all $n \geq 1 .$ Show that there are independent random variables $X_{n}, n \geq 1$ with $E\left[X_{n}\right]=0$ and $\operatorname{Var}\left(X_{n}\right) \leq \sigma_{n}^{2}$ for which $X_{n} / n$ does not converge to 0 a.s., and hence $n^{-1} \sum_{i=1}^{n} X_{i}$ does not converge to 0 a.s.

If we set $P(X_n=n) = P(X_n = -n) = \frac{\sigma_n^2}{2n^{2}}$ and $P(X_n=0)= 1 - \frac{\sigma_n^2}{2n^{2}}$ (valid probabilities, since $\sigma_n^2 \leq n^2$), then

$\sum\limits_{n=1}^{\infty} \frac{{\sigma_n}^2}{2n^{2}} =\infty$

so we get $\sum\limits_{n=1}^{\infty} P(X_n=n) =\sum\limits_{n=1}^{\infty} P(X_n=-n) =\infty$
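For completeness, this construction does satisfy the required moment conditions:

$$E[X_n] = n\cdot\frac{\sigma_n^2}{2n^2} + (-n)\cdot\frac{\sigma_n^2}{2n^2} + 0 = 0, \qquad \operatorname{Var}(X_n) = E[X_n^2] = n^2\left(\frac{\sigma_n^2}{2n^2} + \frac{\sigma_n^2}{2n^2}\right) = \sigma_n^2.$$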

After this point, I believe I have to use the divergence part of the Borel–Cantelli lemma (the second Borel–Cantelli lemma), but I am getting confused:

we get $P(X_n = n \text{ i.o.})=1$.

Why is the above the same as $P(X_n \geq n \text{ i.o.})=1$?

And how do we complete the rest of the proof in detail?

Should I use Cesàro's theorem on averages?

Best Answer

$X_n$ takes only the values $-n$, $n$, and $0$, so the events $\{X_n \geq n\}$ and $\{X_n = n\}$ coincide; hence $P(X_n = n \text{ i.o.}) = P(X_n \geq n \text{ i.o.})$.

For the first part: $\sum_n P(X_n = n)=\infty$, and the $X_n$ are independent, so by the second Borel–Cantelli lemma the event $\limsup_n \{X_n = n\}$ has probability $1$. Thus $X_n/n = 1$ for infinitely many $n$ with probability $1$, and hence $P\left(\frac{X_n}{n} \to 0\right) = 0$.
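As a sanity check, here is a minimal simulation sketch of the construction. It assumes the extreme case $\sigma_n^2 = n^2$ permitted by the hypothesis (so $\sum \sigma_n^2/n^2 = \sum 1$ diverges), which makes $P(X_n = \pm n) = \tfrac12$ each and $P(X_n = 0) = 0$; then $X_n/n$ is $\pm 1$ for every $n$ and visibly cannot converge to $0$.

```python
import random

random.seed(0)

# Assumption: sigma_n^2 = n^2, so p = sigma_n^2 / (2 n^2) = 1/2.
N = 10_000
ratios = []
for n in range(1, N + 1):
    p = 0.5  # P(X_n = n) = P(X_n = -n) = sigma_n^2 / (2 n^2)
    u = random.random()
    if u < p:
        x = n
    elif u < 2 * p:
        x = -n
    else:
        x = 0  # unreachable when p = 1/2, but kept to mirror the construction
    ratios.append(x / n)

# Every ratio is exactly +1 or -1, so |X_n / n| = 1 for all n:
# X_n / n has no chance of tending to 0.
assert all(abs(r) == 1.0 for r in ratios)
```

For general $\sigma_n^2 < n^2$ the ratio would be $0$ most of the time, but the divergence of $\sum P(X_n = n)$ still forces $|X_n/n| = 1$ infinitely often, which is exactly what the second Borel–Cantelli lemma delivers.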

For the second part: write $S_n = \sum_{i=1}^n X_i$. If $\frac{S_n}{n} \to 0$, then $\frac{X_n}{n} = \frac{S_n - S_{n-1}}{n} = \frac{S_n}{n} - \frac{S_{n-1}}{n-1} \cdot \frac{n-1}{n} \to 0$, which we have just shown happens with probability $0$.
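Spelling out the contrapositive as an inclusion of events makes the conclusion immediate:

$$\left\{\frac{S_n}{n} \to 0\right\} \subseteq \left\{\frac{X_n}{n} \to 0\right\} \quad\Longrightarrow\quad P\left(\frac{S_n}{n} \to 0\right) \leq P\left(\frac{X_n}{n} \to 0\right) = 0,$$

so $n^{-1}\sum_{i=1}^{n} X_i$ does not converge to $0$ almost surely.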