Almost Sure Convergence of a Series of Random Variables

measure-theory, probability, probability-theory, statistics

Let $\psi(x) = x^2$ when $|x| \leq 1$ and $\psi(x) = |x|$ when $|x| \geq 1$. Show that if $X_1, X_2, \dots$ are independent with $\mathbb{E} X_n = 0$ and $\sum_{n=1}^\infty \mathbb{E} \psi(X_n) < \infty$, then $\sum_{n=1}^\infty X_n$ converges a.s.

This is Durrett exercise 2.5.6.

My thoughts:

We know by Kolmogorov's two-series theorem that if $\sum_{n=1}^\infty \text{Var}\,X_n < \infty$ (with the $X_n$ having mean zero), then the series converges almost surely. Here $\psi$ is in some sense a "pseudo-variance", but I'm not sure where to get started. Any help would be much appreciated.
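As a quick illustration of the variance criterion mentioned above (my own example, not from Durrett): take independent $X_n = Z_n/n$ with $Z_n = \pm 1$ fair signs, so $\mathbb{E}X_n = 0$ and $\sum \text{Var}\,X_n = \sum 1/n^2 < \infty$, and the partial sums should settle down along a single sample path:

```python
import numpy as np

# X_n = Z_n / n with Z_n = +/-1 equally likely: E X_n = 0 and
# sum Var X_n = sum 1/n^2 < infinity, so sum X_n converges a.s.
rng = np.random.default_rng(0)

N = 200_000
signs = rng.choice([-1.0, 1.0], size=N)
terms = signs / np.arange(1, N + 1)
partial_sums = np.cumsum(terms)

# The tail beyond n = 100_000 has variance less than 1/100_000,
# so the partial sums barely move after that point.
drift = abs(partial_sums[-1] - partial_sums[99_999])
print(f"S_100000 = {partial_sums[99_999]:.6f}, S_200000 = {partial_sums[-1]:.6f}")
print(f"|S_200000 - S_100000| = {drift:.2e}")
```

The tail standard deviation past $n = 10^5$ is about $2 \times 10^{-3}$, so the drift printed at the end is tiny on essentially every run.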

Best Answer

Let $Y_n = X_n I_{\{|X_n| \leq 1\}}$. First observe that $\sum \mathbb{E}Y_n^2 < \infty$ and $\sum \mathbb{E}\big[|X_n| I_{\{|X_n| > 1\}}\big] < \infty$, since both series are dominated by $\sum \mathbb{E}\psi(X_n)$. From the second bound and Markov's inequality, $\sum P(|X_n| > 1) < \infty$, so by the Borel–Cantelli lemma, with probability $1$ we have $X_n = Y_n$ for all sufficiently large $n$; hence it suffices to show that $\sum Y_n$ converges a.s. Since $\mathbb{E}X_n = 0$, we have $\mathbb{E}Y_n = -\mathbb{E}\big[X_n I_{\{|X_n| > 1\}}\big]$, so $\sum |\mathbb{E}Y_n| \leq \sum \mathbb{E}\big[|X_n| I_{\{|X_n| > 1\}}\big] < \infty$, and also $\sum \text{Var}(Y_n) \leq \sum \mathbb{E}Y_n^2 < \infty$. By Kolmogorov's two-series theorem, $\sum Y_n$ converges almost surely. Combining these two facts, $\sum X_n$ converges almost surely.
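As a sanity check (my own illustration, not part of the answer), here is a small simulation of a case where the $\psi$-series converges but the plain variance series diverges, so the truncation step is genuinely needed: $X_n = \pm n$ with probability $1/(2n^3)$ each and $0$ otherwise. Then $\mathbb{E}\psi(X_n) = 1/n^2$ (summable) while $\text{Var}\,X_n = 1/n$ (not summable), yet the partial sums still settle, as the theorem predicts:

```python
import numpy as np

rng = np.random.default_rng(1)

# X_n = +n or -n each with probability 1/(2 n^3), else 0.
# E psi(X_n) = 1/n^2 (summable); Var X_n = 1/n (not summable).
N = 5_000
n = np.arange(1, N + 1)
p = 1.0 / (2 * n**3)
u = rng.random(N)
x = np.where(u < p, n, np.where(u < 2 * p, -n, 0)).astype(float)
partial_sums = np.cumsum(x)

# Analytic check that this example separates the two criteria:
sum_psi = np.sum(1.0 / n**2)   # bounded (tends to pi^2/6)
sum_var = np.sum(1.0 / n)      # grows like log N
print(f"sum E psi(X_n) up to N: {sum_psi:.4f}")
print(f"sum Var X_n up to N:    {sum_var:.4f}")

# Past n = 1000 the terms are 0 with overwhelming probability
# (sum of P(X_n != 0) over n > 1000 is about 5e-7), so the path
# is eventually constant in a typical run.
print(f"S_1000 = {partial_sums[999]:.0f}, S_5000 = {partial_sums[-1]:.0f}")
```

This matches the structure of the proof: only finitely many terms escape the truncation at $1$, and the truncated series converges by the two-series theorem.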
