A wrong corollary of Kolmogorov’s $0-1$ Law

probability theory

I have either broken probability theory or, more likely, I am confused.

Let $X_{n}$ be a sequence of i.i.d. random variables.

Then, by Kolmogorov's $0$-$1$ Law, the probability that $S_{n}=\sum_{k=1}^n X_{k}$ converges lies in $\{0,1\}$.
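A sketch of why this event is in the scope of the $0$-$1$ Law: convergence of a series is unaffected by its first finitely many terms, so for every $n$,

$$\{S_{m} \text{ converges as } m\to\infty\} = \Big\{\sum_{k=n}^{\infty} X_{k} \text{ converges}\Big\} \in \sigma(X_{n}, X_{n+1}, \dots),$$

hence the event lies in $\bigcap_{n} \sigma(X_{n}, X_{n+1}, \dots)$, the tail $\sigma$-field, and Kolmogorov's Law assigns it probability $0$ or $1$.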

Furthermore, the random variables $\limsup_{n} S_{n}$ and $\liminf_{n} S_{n}$ are measurable with respect to the tail $\sigma$-field generated by the $X_{n}$, and hence a.s. constant (a consequence of the $0$-$1$ Law) – is this true?

Now if $S_{n}$ converges a.s., then $\limsup_{n} S_{n} = \liminf_{n} S_{n}$ a.s.

Does this mean that if $S_{n}$ converges it does so to a constant?

It can't…

Best Answer

By the Hewitt–Savage $0$-$1$ law, $\limsup S_n$ is a.s. equal to a constant $C\in[-\infty,\infty]$, because $\limsup S_n$ is measurable with respect to the exchangeable $\sigma$-algebra: permuting finitely many of the $X_k$ leaves $S_n$ unchanged for all large $n$. (This is where the question's reasoning slips: $\limsup S_n$ need not be measurable with respect to the tail $\sigma$-field, since changing $X_1$ alone shifts every $S_n$, so Kolmogorov's Law does not apply directly.) Since $(X_2,X_3,\dots)\overset{d}{=}(X_1,X_2,\dots)$, we have $S_{n+1}-X_1=\sum_{k=2}^{n+1}X_k\overset{d}{=}S_n$, and hence $C-X_1=\limsup_n (S_{n+1}-X_1)\overset{d}{=}\limsup_n S_n=C$; equality in distribution with the constant $C$ forces $C=C-X_1$ a.s. There are only two ways this can hold: (1) $|C|<\infty$ and $X_1=0$ a.s., or (2) $C=\pm\infty$. The same argument applies to $\liminf S_n$. So yes: if $S_n$ converges, it converges to a constant, but this can only happen in the degenerate case $X_1=0$ a.s.
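As a quick numerical illustration (a sketch, not a proof), the two cases of the dichotomy can be seen in simulation: for nondegenerate steps such as standard normals, the running max and min of $S_n$ drift toward $\pm\infty$ (case (2)), while for the degenerate step $X\equiv 0$ the partial sums are the constant $0$ (case (1)). The function name `running_extremes` is just for this demo.

```python
import random

random.seed(1)

def running_extremes(step, n_steps=20000):
    """Track the running max and min of the partial sums S_n
    for i.i.d. increments drawn from step()."""
    s = 0.0
    hi = float("-inf")
    lo = float("inf")
    for _ in range(n_steps):
        s += step()
        hi = max(hi, s)
        lo = min(lo, s)
    return hi, lo

# Nondegenerate i.i.d. steps: limsup S_n = +inf, liminf S_n = -inf (case 2).
# Over several independent paths, the extremes wander far from 0
# (typical scale ~ sqrt(n_steps) for standard normal increments).
paths = [running_extremes(lambda: random.gauss(0.0, 1.0)) for _ in range(5)]
print("gaussian paths:", paths)

# Degenerate steps X = 0 a.s.: S_n = 0 for all n, so C = 0 is finite (case 1).
print("zero steps:", running_extremes(lambda: 0.0))
```

Of course, a finite simulation cannot certify a $\limsup$; it only shows the qualitative picture behind the two cases.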