Probability Theory – Almost Sure Convergence of Sum of Normal Random Variables

convergence-divergence, law-of-large-numbers, probability-theory

From Resnick's A Probability Path, Exercise 7.7.14:

Suppose $\{X_n, n \ge 1\}$ are independent, normally distributed with $E(X_n) = \mu_n$
and Var$(X_n)=\sigma^2_n$. Show that $\sum_n X_n$ converges almost surely iff $\sum_n \mu_n$ converges and $\sum_n \sigma^2_n < \infty$.

I think I got the sufficiency portion of the proof:

Suppose $\sum_n \mu_n$ converges and $\sum_n \sigma^2_n < \infty$.

Then $\sum_{j=1}^\infty \operatorname{Var}(X_j) = \sum_{j=1}^\infty \sigma^2_j < \infty$, and so by the Kolmogorov Convergence Criterion, $\sum_{j=1}^\infty (X_j-E(X_j))$ converges almost surely. Since $\sum_{j=1}^\infty E(X_j) = \sum_{j=1}^\infty \mu_j$ converges by assumption, $\sum_{j=1}^\infty X_j = \sum_{j=1}^\infty (X_j-E(X_j)) + \sum_{j=1}^\infty \mu_j$ converges almost surely.
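Not part of the proof, but here is a quick numerical sanity check of the sufficiency direction, with illustrative choices $\mu_n = (-1)^n/n$ (a convergent alternating series) and $\sigma^2_n = 1/n^2$ (summable) that are my own picks, not from the exercise:

```python
import numpy as np

# Sanity check (not a proof): with mu_n = (-1)^n / n and sigma_n^2 = 1/n^2,
# both hypotheses hold, so each sample path of S_n = X_1 + ... + X_n
# should settle down to a limit.
rng = np.random.default_rng(0)
N = 20_000
n = np.arange(1, N + 1)
mu = (-1.0) ** n / n
sigma = 1.0 / n          # standard deviations, so Var(X_n) = 1/n^2

num_paths = 5
X = rng.normal(loc=mu, scale=sigma, size=(num_paths, N))
S = np.cumsum(X, axis=1)  # partial sums S_n along each path

# How much each path still oscillates over the second half of the range:
tail_osc = S[:, N // 2:].max(axis=1) - S[:, N // 2:].min(axis=1)
print(tail_osc)  # small values, consistent with a.s. convergence
```

The tail oscillation shrinks because the variance of the tail $\sum_{j>m} X_j$ is $\sum_{j>m} 1/j^2 \to 0$.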

Going the other direction is what is giving me difficulty:

Suppose $\sum_n X_n$ converges a.s.

Then the partial sums $S_n = \sum_{j=1}^n X_j$ converge almost surely. I know that each partial sum is normal, with $S_n \sim N\left(\sum_{j=1}^n \mu_j, \sum_{j=1}^n \sigma^2_j\right)$. So I know there is a subtlety I am missing here. Do I know that the random variable that is the almost sure limit is necessarily normal with finite mean and variance? If so, is that enough for the proof?

Best Answer

Define $S_n:=\sum_{j=1}^nX_j$ and let $\varphi_n$ be the characteristic function of $S_n$. Using independence and normality, we have $$\varphi_n(t)=\exp\left(it\sum_{j=1}^n\mu_j\right)\cdot\exp\left(-\frac{t^2}2\sum_{j=1}^n\sigma_j^2\right).$$ Since $S_n$ converges almost surely, it converges in distribution, so $(\varphi_n(t))_{n\geqslant 1}$ converges for each $t$ to $\varphi(t)$, where $\varphi$ is a continuous function (the characteristic function of the limiting distribution).
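As a side check of the closed form for $\varphi_n$, one can compare it against a Monte Carlo estimate of $E[e^{itS_n}]$; the choices $\mu_j = 1/j^2$ and $\sigma_j^2 = 1/j^2$ below are illustrative picks of mine, not from the problem:

```python
import numpy as np

# Monte Carlo check of phi_n(t) = exp(i t sum mu_j) * exp(-t^2/2 sum sigma_j^2)
# for illustrative parameters mu_j = sigma_j^2 = 1/j^2.
rng = np.random.default_rng(1)
n = 50
j = np.arange(1, n + 1)
mu, var = 1.0 / j**2, 1.0 / j**2

# 100k independent copies of S_n, each a sum of n independent normals:
S_n = rng.normal(mu, np.sqrt(var), size=(100_000, n)).sum(axis=1)

t = 1.3
empirical = np.exp(1j * t * S_n).mean()              # estimate of E[exp(itS_n)]
exact = np.exp(1j * t * mu.sum()) * np.exp(-t**2 / 2 * var.sum())
print(abs(empirical - exact))  # small Monte Carlo error
```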

If $\sum_{j\geqslant 1}\sigma_j^2$ were divergent, then $|\varphi_n(t)|=\exp\left(-\frac{t^2}2\sum_{j=1}^n\sigma_j^2\right)\to 0$ for every $t\neq 0$, so $\varphi(t)=0$ for $t\neq 0$ while $\varphi(0)=1$, contradicting the continuity of $\varphi$ at $0$. Hence $\sum_{j\geqslant 1}\sigma_j^2<\infty$, and the second factor converges to a nonzero limit. Define $s_n:=\sum_{j=1}^n\mu_j$. Dividing, we get that for each $t$ the sequence $(e^{its_n})_{n\geqslant 1}$ is convergent, and hence $(s_n)_{n\geqslant 1}$ is convergent (this last step uses the classical fact that if $e^{its_n}$ converges for every $t$, then $s_n$ converges: an unbounded or oscillating $(s_n)$ would give some $t$ at which $e^{its_n}$ fails to converge).
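The contradiction can be seen numerically: taking the illustrative divergent choice $\sigma_j^2 = 1/j$ (my pick), $|\varphi_n(t)| = \exp\left(-\frac{t^2}{2}\sum_{j=1}^n \frac1j\right)$ drains to $0$ for any fixed $t \neq 0$, while $\varphi_n(0)=1$ always, so the pointwise limit is discontinuous at $0$ and cannot be a characteristic function:

```python
import numpy as np

# With sigma_j^2 = 1/j (divergent), |phi_n(t)| = exp(-t^2/2 * H_n) where
# H_n is the n-th harmonic number. At t = 1 this tends to 0, yet
# phi_n(0) = 1 for all n: the limit would be discontinuous at 0.
for n in (10, 100, 1000, 10000):
    H_n = (1.0 / np.arange(1, n + 1)).sum()  # partial harmonic sum
    print(n, np.exp(-0.5 * 1.0**2 * H_n))    # |phi_n(1)|, shrinking toward 0
```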