[Math] Sums of independent random variables converging almost surely

convergence-divergence, measure-theory, probability-theory, random-variables, sequences-and-series

I am working through Achim Klenke's text "Probability Theory," and I came across the following interesting exercise:

Let $(X_i)_{i\in\mathbb{N}}$ be independent, square-integrable random variables with $\mathbb{E}(X_i)=0$ for all $i$. Suppose that $\sum_{i=1}^\infty \mathbb{E}(X_i^2)<\infty$. Conclude that there exists a real random variable $X$ with $\sum_{i=1}^n X_i \xrightarrow{n\to\infty} X$ almost surely.

I attempted to prove this via Borel–Cantelli: I tried to show that $\mathbb{P}\left(\left\{\omega : \sum_{i=n}^\infty X_i(\omega) \xrightarrow{n\to\infty} 0\right\}\right) = 1$, since the series converges if and only if its tails tend to zero. In the details of Borel–Cantelli, though, for a fixed $\epsilon > 0$ this requires showing that $\mathbb{P}\left(\left|\sum_{i=n}^\infty X_i\right| > \epsilon \text{ i.o.}\right) = 0$. An application of Chebyshev's inequality, together with independence, then gives

$$\mathbb{P}\left(\left|\sum_{i=n}^\infty X_i\right|>\epsilon\right) \leq \frac{1}{\epsilon^2}\sum_{i=n}^\infty \mathbb{E}(X_i^2)<\infty.$$ But this bound need not be summable over $n$: take $X_i = \pm 1/i$ with probability $1/2$ each, so that $\mathbb{E}(X_i^2) = 1/i^2$.
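Concretely, in that example swapping the order of summation shows the tail bounds fail to be summable:

$$\sum_{n=1}^\infty \sum_{i=n}^\infty \frac{1}{i^2} = \sum_{i=1}^\infty \sum_{n=1}^{i} \frac{1}{i^2} = \sum_{i=1}^\infty \frac{1}{i} = \infty.$$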

I imagine either Chebyshev's inequality isn't strong enough here, or the entire approach is off. Suggestions?

Best Answer

We can use this answer to see that it suffices to check convergence in probability, which is essentially what the estimate in the question already does: for series of independent random variables, convergence in probability of the partial sums implies (and is in fact equivalent to) almost sure convergence, by Lévy's equivalence theorem. Since independence and $\mathbb{E}(X_i) = 0$ give $\mathbb{E}\left[\left(\sum_{i=n+1}^m X_i\right)^2\right] = \sum_{i=n+1}^m \mathbb{E}(X_i^2) \to 0$ as $m, n \to \infty$, the partial sums are Cauchy in $L^2$, hence Cauchy in probability, so they converge in probability and therefore almost surely.
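If a sanity check helps, here is a minimal simulation sketch (an illustration, not part of the proof) of the $X_i = \pm 1/i$ example from the question; the variable names, seed, and sample sizes are arbitrary choices. The simulated tail oscillations of the partial sums shrink as $n$ grows, consistent with almost sure convergence.

```python
import numpy as np

# Illustration only (not a proof): simulate X_i = +-1/i with probability
# 1/2 each, so E[X_i] = 0 and sum_i E[X_i^2] = sum_i 1/i^2 < infinity.
rng = np.random.default_rng(seed=0)

n_terms = 100_000
signs = rng.choice([-1.0, 1.0], size=n_terms)    # the random signs
terms = signs / np.arange(1, n_terms + 1)        # X_i = +-1/i
partial_sums = np.cumsum(terms)                  # S_1, S_2, ..., S_N

# If S_n converges, the tail oscillation sup_{m > n} |S_m - S_n| must shrink.
for n in (10, 100, 1_000, 10_000):
    tail_osc = np.abs(partial_sums[n:] - partial_sums[n - 1]).max()
    print(f"n = {n:>6}:  max_m |S_m - S_n| (simulated) = {tail_osc:.5f}")
```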
