A.S. convergence of sum of square-integrable independent random variables with summable variances

borel-cantelli-lemmas, convergence-divergence, probability, probability-theory, real-analysis

I'm working on the following exercise from Achim Klenke's "Probability Theory: A Comprehensive Course" (exercise 6.1.4):

Let $X_1, X_2, \ldots$ be independent, square integrable, centered random variables with $\sum_{i=1}^\infty \mathbf{Var}[X_i] < \infty$. Show that there exists a square integrable $X$ with $X = \lim_{n \to \infty} \sum_{i=1}^n X_i$ almost surely.

By independence (Bienaymé) and Chebyshev's inequality, for $m < n$,
$$
\mathbf P\left[|S_n - S_m| > \epsilon\right] \leq \epsilon^{-2} \mathbf{Var}\left[ \sum_{i=m+1}^n X_i\right] = \epsilon^{-2} \sum_{i=m+1}^n \mathbf{Var}\left[X_i\right] \xrightarrow{m,n \to \infty} 0,
$$

whence $(S_n)_{n \in \mathbb N}$ is Cauchy in probability, and thus $S_n \xrightarrow{\mathbf P} X$ for some random variable $X$. Using a similar strategy, we can in fact show that $S_n \to X$ in $L^2$, so $X$ is square integrable.
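This convergence is easy to see numerically. A minimal sketch (an illustration, not a proof), using the hypothetical choice $X_i = \pm 1/i$ with equal probability, so that $\mathbf E[X_i] = 0$ and $\mathbf{Var}[X_i] = 1/i^2$ is summable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X_i = +-1/i with equal probability, so that
# E[X_i] = 0 and Var[X_i] = 1/i^2, which is summable.
N = 100_000
i = np.arange(1, N + 1)
X = rng.choice([-1.0, 1.0], size=N) / i
S = np.cumsum(X)  # partial sums S_1, ..., S_N

# If S_n converges, the tail fluctuation sup_{k >= n} |S_k - S_n|
# should shrink as n grows.
for n in (10, 100, 1_000, 10_000):
    print(n, np.max(np.abs(S[n - 1:] - S[n - 1])))
```

The printed tail fluctuations shrink roughly like $1/\sqrt{n}$, consistent with $\sum_{i>n} \mathbf{Var}[X_i] = O(1/n)$.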

Now, to prove almost sure convergence, I'd like to use the following result (Corollary 6.13 in Klenke):

Let $(E,d)$ be a separable metric space. Let $f, f_1, f_2, \ldots$ be measurable maps $\Omega \to E$. Then the following statements are equivalent.

(i)$\quad f_n \to f$ in measure as $n \to \infty$.

(ii)$\quad$For any subsequence of $(f_n)_{n \in \mathbb N}$, there exists a sub-subsequence that converges to $f$ almost everywhere.

and somehow use the fact that we're working with a sum of centered random variables to show that in fact every subsequence converges a.s. But I'm not sure how to do this since our $X_i$ are not nonnegative. I tried reconstructing the proof of this theorem, but I've only been able to show once again that there are a.e. convergent subsequences.

My other thought was to apply the Borel-Cantelli lemma to the events $B_n(\epsilon) := \left\{ |X - S_n| > \epsilon\right\}$ and prove that $\limsup_{n \to \infty} B_n(\epsilon) =: B(\epsilon)$ has probability $0$, but I don't know how to estimate the probability of $B_n(\epsilon)$. Chebyshev doesn't seem available to us since, strictly speaking, we don't know what $X$ looks like, only that $S_n$ converges in $L^2$ to it. Even if we could say $X - S_n = \sum_{i=n+1}^\infty X_i$, the above approximation using Chebyshev with $|X - S_n|$ instead of $|S_m - S_n|$ would work out to
$$
\mathbf P\left[|X - S_n| > \epsilon\right] \leq \epsilon^{-2} \sum_{i=n+1}^\infty \mathbf{Var}[X_i]
$$

which, after exchanging the order of summation, would sum to $\epsilon^{-2} \sum_{n=1}^\infty (n-1)\mathbf{Var}[X_n]$, and I don't see why this series converges.
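Indeed, this series need not converge under the hypotheses. A quick numerical sketch, using the hypothetical choice $\mathbf{Var}[X_n] = 1/n^2$, which is summable, while $n\,\mathbf{Var}[X_n] = 1/n$ gives the harmonic series:

```python
import numpy as np

# Hypothetical variances Var[X_n] = 1/n^2: the variance series is summable
# (its partial sums tend to pi^2/6), but n * Var[X_n] = 1/n is the harmonic
# series, whose partial sums grow without bound.
for N in (10**2, 10**4, 10**6):
    n = np.arange(1, N + 1, dtype=float)
    var = 1.0 / n**2
    print(N, var.sum(), (n * var).sum())
```

So the crude term-by-term Borel-Cantelli bound is too lossy here.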

Any thoughts on how to prove $S_n \to X$ almost surely?

Best Answer

By Kolmogorov's maximal inequality, $$ \mathsf{P}\!\left(\sup_{n\le k\le m}|S_k-S_n|> \epsilon\right)\le \epsilon^{-2}\sum_{i=n+1}^{m}\mathbf{Var}[X_i], $$ so, letting $m\to\infty$, $$ \lim_{n\to\infty}\mathsf{P}\!\left(\sup_{k\ge n}|S_k-S_n|> \epsilon\right)=0. $$ Hence the set on which the sequence $\{S_n\}$ is not Cauchy, $$ N=\bigcup_{\epsilon>0}\bigcap_{n\ge 1}\left\{\sup_{j,k\ge n}|S_j-S_k|>\epsilon\right\} =\bigcup_{m\ge 1}\bigcap_{n\ge 1}\left\{\sup_{j,k\ge n}|S_j-S_k|>1/m\right\}, $$ is a null set ($\because \sup_{j,k\ge n}|S_j-S_k|\le 2\sup_{k\ge n}|S_k-S_n|$, and the union over all $\epsilon>0$ reduces to a countable union because the sets increase as $\epsilon$ decreases). So you define $X:=\lim_{n\to\infty} S_n\,1_{N^c}$; square integrability of $X$ then follows from the $L^2$ convergence (or Fatou's lemma).
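The first display is where Kolmogorov's maximal inequality does the work: $\mathsf{P}\left(\sup_{n\le k\le m}|S_k - S_n| > \epsilon\right) \le \epsilon^{-2}\sum_{i=n+1}^m \mathbf{Var}[X_i]$. A small Monte Carlo sketch of that bound, again with the hypothetical choice $X_i = \pm 1/i$ (the empirical frequency should sit below the bound, up to sampling noise):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example: X_i = +-1/i, so Var[X_i] = 1/i^2. Estimate
# P(sup_{n <= k <= N} |S_k - S_n| > eps) by Monte Carlo and compare it
# with the Kolmogorov maximal-inequality bound eps^-2 * sum_{i=n+1}^N Var[X_i].
trials, N, n, eps = 1000, 2000, 50, 0.2
i = np.arange(1, N + 1)
X = rng.choice([-1.0, 1.0], size=(trials, N)) / i
S = np.cumsum(X, axis=1)  # row t holds S_1, ..., S_N for trial t
sup_dev = np.max(np.abs(S[:, n - 1:] - S[:, n - 1:n]), axis=1)
p_hat = float(np.mean(sup_dev > eps))
bound = eps**-2 * float(np.sum(1.0 / i[n:]**2))
print(p_hat, "<=", bound)
```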