Prove that $S_n/n$ does not converge a.s.

almost-everywhere-convergence, divergence, probability-theory

This is an old qualifying exam question of probability theory.


Let $\{X_n\}$ be a sequence of independent random variables with $X_1=0$ and for $k\geq 2$, define $X_k$ as $$\mathbf{P}(X_k=k)=\mathbf{P}(X_k=-k)=\frac{1}{2k\log k},\mathbf{P}(X_k=0)=1-\frac{1}{k\log k}.$$

Let $S_n=\sum_{k=1}^nX_k$. Show that $\dfrac{S_n}{n}$ converges to 0 in probability but not almost surely.


My attempt: For given $\epsilon>0$, by Chebyshev's inequality, $$\mathbf{P}(|S_n|\geq n\epsilon) \leq \frac{\operatorname{Var}(S_n)}{n^2\epsilon^2}=\frac{1}{n^2\epsilon^2}\sum_{k=1}^n\operatorname{Var}(X_k)=\frac{1}{n^2\epsilon^2}\sum_{k=2}^n \frac{k}{\log k}\leq \frac{1}{n^2\epsilon^2}\,(n-1)\frac{n}{\log n}\to0$$ as $n\to\infty$, since $\operatorname{Var}(X_k)=k^2\cdot\frac{1}{k\log k}=\frac{k}{\log k}$ and $\frac{k}{\log k}\leq\frac{n}{\log n}$ for $2\leq k\leq n$ (valid once $n\geq 4$). This proves that $\dfrac{S_n}{n}\to 0$ in probability.
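As a numerical sanity check on the variance computation (not part of the proof), here is a short Python sketch; the function names `var_Xk` and `chebyshev_bound` are my own labels for the quantities above.

```python
import math

def var_Xk(k):
    # E[X_k] = 0 by symmetry, so Var(X_k) = E[X_k^2]
    # = k^2 * P(X_k = k) + (-k)^2 * P(X_k = -k) = k / log k.
    p = 1.0 / (2 * k * math.log(k))
    return (k ** 2) * p + ((-k) ** 2) * p

def chebyshev_bound(n, eps=1.0):
    # Chebyshev: P(|S_n| >= n*eps) <= Var(S_n) / (n*eps)^2,
    # with Var(S_n) = sum of Var(X_k) by independence (X_1 = 0 contributes nothing).
    var_Sn = sum(var_Xk(k) for k in range(2, n + 1))
    return var_Sn / (n * eps) ** 2

# The bound shrinks (roughly like 1/log n), illustrating convergence in probability.
for n in (10, 100, 1000, 10000):
    print(n, chebyshev_bound(n))
```

The decay is slow, of order $1/\log n$, which is exactly why convergence in probability holds while almost-sure convergence fails.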

However, I'm stuck on proving that $\dfrac{S_n}{n}$ does not converge to 0 a.s. I'd like to apply the Borel-Cantelli lemma here, but the partial sums $S_n$ are not independent, so the fact that $\sum_{k=2}^\infty \frac{1}{k\log k}=\infty$ does not by itself tell us anything about convergence. Does anyone have ideas?

Thanks in advance!

Best Answer

If $\frac{S_n}{n}$ converges a.s., it can only converge to $0$, since it already converges to $0$ in probability. In that case $$\frac{X_n}{n}=\frac{S_n-S_{n-1}}{n}=\frac{S_n}{n}-\frac{n-1}{n}\cdot\frac{S_{n-1}}{n-1}\to 0 \quad\text{a.s.}$$ But $\mathbf{P}\{X_n=n \text{ i.o.}\}=1$ by the second Borel-Cantelli lemma, because $\sum_n \mathbf{P}\{X_n=n\}=\sum_{n\geq 2}\frac{1}{2n\log n}=\infty$ and the events $\{X_n=n\}$ are independent. On the event $\{X_n=n \text{ i.o.}\}$ we have $\frac{X_n}{n}=1$ infinitely often, contradicting $\frac{X_n}{n}\to 0$ a.s.
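To see concretely that the divergence hypothesis of the second Borel-Cantelli lemma holds, the integral test gives $\sum_{n=2}^N \frac{1}{2n\log n}\geq \frac12\big(\log\log(N+1)-\log\log 2\big)\to\infty$. A minimal Python sketch checking this (the helper names are mine):

```python
import math

def prob_sum(N):
    # Partial sum of P(X_n = n) = 1/(2 n log n) over n = 2, ..., N.
    return sum(1.0 / (2 * n * math.log(n)) for n in range(2, N + 1))

def integral_lower_bound(N):
    # 1/(x log x) is decreasing on [2, inf), so by the integral test the
    # partial sum dominates (1/2) * (log log(N+1) - log log 2) -> infinity.
    return 0.5 * (math.log(math.log(N + 1)) - math.log(math.log(2)))

# Partial sums keep growing (like (1/2) log log N), so the series diverges.
for N in (100, 10_000, 1_000_000):
    print(N, prob_sum(N), integral_lower_bound(N))
```

The growth is doubly logarithmic, so the divergence is extremely slow; this is the same slow decay that makes the counterexample work.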
