$\frac{1}{n}\sum_{k=1}^n(X_k-\mathbb E[X_k])$ converges a.s. to $0$

probability, probability-theory

Let $X_1,X_2,\dots$ be a sequence of independent random variables, such that the series $$\sum_{n=1}^\infty\frac{\operatorname{Var}(X_n)}{n^2}$$ converges. Show that as $n\to\infty$, $$\frac{1}{n}\sum_{k=1}^n(X_k-\mathbb E[X_k])$$ converges almost surely to $0$.
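Not a proof, of course, but the statement is easy to sanity-check numerically. The example below is my own choice (not part of the problem): take $X_k$ uniform on $[-k^{1/4},k^{1/4}]$, so that $\operatorname{Var}(X_k)=k^{1/2}/3$ and $\sum_n\operatorname{Var}(X_n)/n^2$ converges, and watch the centered averages shrink along one simulated path:

```python
import random

def running_centered_average(n_terms, seed=0):
    """Simulate one path of (1/n) * sum_{k=1}^n (X_k - E[X_k]), where
    X_k is uniform on [-k**0.25, k**0.25].  Then Var(X_k) = k**0.5 / 3,
    so sum_n Var(X_n)/n**2 = (1/3) * sum_n n**(-1.5) converges."""
    rng = random.Random(seed)
    total = 0.0
    averages = []
    for k in range(1, n_terms + 1):
        a = k ** 0.25
        x = rng.uniform(-a, a)  # E[X_k] = 0, so X_k is already centered
        total += x
        averages.append(total / k)
    return averages

avgs = running_centered_average(100_000)
# The running averages should drift toward 0 as n grows.
print(avgs[99], avgs[9_999], avgs[99_999])
```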


There is a quick solution via the martingale convergence theorem: we have that $$Y_n=\sum_{k=1}^n\frac{X_k-\mathbb E[X_k]}{k}$$ is a martingale, and $\sup_n\mathbb E[|Y_n|]\leq\sup_n\mathbb E[Y_n^2]^{1/2}$ is finite by the hypothesis on the series, so $Y_n$ converges almost surely to some random variable $Y$, and we can finish with Kronecker's lemma.

I'm interested though in any approaches avoiding the heavy machinery of the martingale convergence theorem. I feel like defining the $Y_n$'s as I did above could be fruitful. For example, the Kolmogorov inequality gives the bound
$$\mathbb P\left(\max_{1\leq i\leq n}|Y_i|>\varepsilon\right)\leq\frac{1}{\varepsilon^2}\mathbb E\left[Y_n^2\right]=\frac{1}{\varepsilon^2}\cdot\sum_{k=1}^n\frac{\operatorname{Var}(X_k)}{k^2}.$$
But I'm unsure what to make of all this.

Best Answer

A proof along the lines you started can be derived from Kolmogorov's inequality.

For $n < m$, Kolmogorov's inequality applied to the martingale $Y_k-Y_n$, $n \leq k \leq m$, gives that, for all $\varepsilon > 0$, $$ P\Big(\max_{n \leq k \leq m} |Y_k - Y_n| > \varepsilon\Big) \leq \frac{1}{\varepsilon^2} \sum_{k = n+1}^m \frac{\operatorname{Var}(X_k)}{k^2}. $$
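The second moment appearing on the right-hand side is just the variance of the increment, computed directly from independence (the cross terms vanish):

```latex
\mathbb{E}\big[(Y_m - Y_n)^2\big]
  = \operatorname{Var}\left( \sum_{k=n+1}^{m} \frac{X_k - \mathbb{E}[X_k]}{k} \right)
  = \sum_{k=n+1}^{m} \frac{\operatorname{Var}(X_k)}{k^2}.
```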

Letting $m \rightarrow \infty$ and using $\sup_{n,m \geq l} |Y_m - Y_n| \leq 2\sup_{k \geq l} |Y_k - Y_l|$, we get, for all $\varepsilon > 0$,

$$ P\Big(\inf_l \sup_{n,m \geq l} |Y_m - Y_n| > \varepsilon\Big) \leq \lim_{l \rightarrow \infty} P\Big(\sup_{n,m \geq l} |Y_m - Y_n| > \varepsilon\Big) \leq \lim_{l \rightarrow \infty} \frac{4}{\varepsilon^2} \sum_{k > l} \frac{\operatorname{Var}(X_k)}{k^2} = 0, $$

since the tail of a convergent series goes to $0$.

Now take $\varepsilon_p \rightarrow 0$. Define $$ \Omega_p = \{ \inf_l \sup_{n,m \geq l} |Y_m - Y_n| \leq \varepsilon_p \}, $$ and $\Omega' = \bigcap_p \Omega_p$. Then $P(\Omega') = 1$ and, for every $\omega \in \Omega'$, $(Y_n(\omega))$ is a Cauchy sequence and therefore converges. Kronecker's lemma then finishes the proof, as before.
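For completeness, here is the deterministic form of Kronecker's lemma used in the last step, applied pathwise on the full-probability event where $Y_n$ converges:

```latex
\textbf{Kronecker's lemma.} If $0 < b_n \uparrow \infty$ and $\sum_{k \geq 1} x_k$
converges, then
\[
  \frac{1}{b_n} \sum_{k=1}^{n} b_k x_k \longrightarrow 0 .
\]
With $b_k = k$ and $x_k = \frac{X_k(\omega) - \mathbb{E}[X_k]}{k}$, this gives
\[
  \frac{1}{n} \sum_{k=1}^{n} \big( X_k(\omega) - \mathbb{E}[X_k] \big)
  \longrightarrow 0
\]
for every $\omega$ at which $Y_n(\omega)$ converges, i.e.\ almost surely.
```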

Comment

This does not really "avoid the martingale machinery", though. The argument replaces the martingale convergence theorem with Kolmogorov's inequality, which is itself a maximal inequality for martingales.

It does not seem easy to get away from using the martingale property in one way or another.

The alternative argument proposed in a previous answer uses the fact that a series of independent summands that converges in probability must also converge almost surely. The standard proof of this fact is a stopping-time argument, which also uses the stronger independence property.