Probability Theory – $\limsup$ of Random Variables Converging to $\infty$

independence, probability, probability theory

Revisiting my probability theory book, I found an exercise in the chapter on the Kolmogorov zero-one law which says:

One proves the following: Let $(X_n)_{n \in \mathbb{N}}$ be independent random variables with $\mathbf{P}(X_n = -1) = \mathbf{P}(X_n = 1) = \frac{1}{2}$, and let $S_n = X_1 + \dots + X_n$, $n \in \mathbb{N}$. Then $\limsup_{n \to \infty} S_n = \infty$ a.s.

Why can this be true and – more specifically – why does $\limsup_{n \to \infty} S_n = \infty$ hold almost surely when the individual random variables have equal chance of cancelling each other out?

Best Answer

First, there is no contradiction with the fact that "the random variables have equal chance of cancelling each other out". You don't have that $S_n$ almost surely tends to infinity, only that its limsup is infinite; this just means that the walk exceeds every fixed bound infinitely often. By the way, the same argument shows that $\liminf S_n=-\infty$ a.s., so again, there is no contradiction.
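To get a concrete feel for this (it is only an illustration, not part of the proof), here is a minimal simulation sketch of the walk; the function name `walk` and the choice of 100,000 steps are my own:

```python
import random

def walk(n_steps, seed=0):
    """Simulate a symmetric +/-1 random walk and return the partial sums S_1, ..., S_n."""
    rng = random.Random(seed)
    s, path = 0, []
    for _ in range(n_steps):
        s += rng.choice((-1, 1))  # each step is +1 or -1 with probability 1/2
        path.append(s)
    return path

path = walk(100_000)
# A typical long path strays far both above and below 0, even though
# the steps "cancel out on average":
print("max so far:", max(path), " min so far:", min(path))
```

Running this for longer and longer horizons, the running maximum keeps creeping upward and the running minimum downward, which is exactly the $\limsup = +\infty$, $\liminf = -\infty$ behaviour with no contradiction between the two.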

First, you show that $P(\limsup S_n=+\infty)$ is 0 or 1. Indeed, the event $\{\limsup S_n=+\infty\}$ is the same as the event $\{\limsup (S_n-S_k)=+\infty\}$, since $S_k$ is a fixed finite quantity. But $S_n-S_k=X_{k+1}+\dots+X_{n}$, so this event is measurable with respect to $\mathcal{F}_{k+1}=\sigma (X_{k+1},X_{k+2},\dots)$. This is true for all $k$, so the event lies in the tail $\sigma$-algebra $\bigcap_k \mathcal{F}_{k+1}$, and by independence and the 0-1 law its probability is 0 or 1.
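The key point – that an event built from $S_j - S_k$ for $j > k$ ignores the first $k$ steps – can be checked exactly for a finite horizon by brute force. The sketch below (function name and parameters are my own) groups all $2^n$ sign paths by their first $k$ signs and computes, within each group, the conditional probability that $\max_{k<j\le n}(S_j - S_k) \ge m$; every group gives the same number:

```python
from itertools import product

def cond_prob_by_prefix(n, k, m):
    """Group all 2^n sign paths by their first k signs; within each group,
    compute P(max_{k<j<=n} (S_j - S_k) >= m).  Because S_j - S_k depends
    only on X_{k+1}, ..., X_j, every group yields the same probability."""
    counts = {}
    for signs in product((-1, 1), repeat=n):
        partial = [0]                      # partial[j] = S_j, with S_0 = 0
        for x in signs:
            partial.append(partial[-1] + x)
        hit = max(partial[j] - partial[k] for j in range(k + 1, n + 1)) >= m
        hits, total = counts.get(signs[:k], (0, 0))
        counts[signs[:k]] = (hits + hit, total + 1)
    return {prefix: hits / total for prefix, (hits, total) in counts.items()}

probs = cond_prob_by_prefix(10, 3, 2)
print(set(probs.values()))  # a single value: the event ignores the first k steps
```

Of course, for a finite horizon this is just independence of the increments; the point of the 0-1 law is that the same invariance, valid for *every* $k$, forces the limsup event to have probability 0 or 1.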

Observe then that since the random variables are symmetric, the two events $\{\limsup S_n=+\infty\}$ and $\{\liminf S_n=-\infty\}$ have the same probability (which is 0 or 1). To conclude, you just need to prove that the random walk is almost surely unbounded. This gives $P(\{\limsup S_n=+\infty\} \cup \{\liminf S_n=-\infty\})=1$; if both events had probability 0, the union would too, so at least one – and hence, by symmetry, both – must have probability 1, which is the result you want.
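The symmetry step can be verified exactly for small horizons: flipping every sign ($X_n \mapsto -X_n$) is a measure-preserving bijection on paths, so $P(\max_{k \le n} S_k \ge m) = P(\min_{k \le n} S_k \le -m)$. A small enumeration sketch (function names and the parameters $n = 12$, $m = 3$ are my own choices):

```python
from itertools import product

def prob_max_at_least(n, m):
    """Exact P(max_{k<=n} S_k >= m), by enumerating all 2^n sign paths."""
    hits = 0
    for signs in product((-1, 1), repeat=n):
        s, running_max = 0, 0
        for x in signs:
            s += x
            running_max = max(running_max, s)
        hits += running_max >= m
    return hits / 2 ** n

def prob_min_at_most(n, m):
    """Exact P(min_{k<=n} S_k <= -m), by the same enumeration."""
    hits = 0
    for signs in product((-1, 1), repeat=n):
        s, running_min = 0, 0
        for x in signs:
            s += x
            running_min = min(running_min, s)
        hits += running_min <= -m
    return hits / 2 ** n

# Sign-flipping pairs the paths up, so these two numbers agree exactly:
print(prob_max_at_least(12, 3), prob_min_at_most(12, 3))
```

This is only the finite-horizon shadow of the symmetry argument, but it shows why the two limit events must carry the same probability.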