Proving a sequence converges almost surely given some conditions

convergence-divergence, inequality, probability, probability theory, random variables

Let $X_{1}, X_{2}, \ldots$ be a sequence of i.i.d. random variables
for which $E(X_i) = 0$ and $E(X_i^2) = V < \infty$. Prove that for any
$\gamma> 1/2$, the series $\sum_{i\geq 1} X_i/i^{\gamma}$ converges
almost surely.

I am not sure how to approach this problem, and I would really appreciate your help. I have the following theorem, which may be useful:

Three Series Theorem: Let $X_{i}$ be a sequence of independent random
variables, and for $c > 0$ let $X_i^{(c)} = X_i \mathbf{1}_{\{|X_i| \leq c\}}$
denote the truncation of $X_i$ at level $c$. If for some $c > 0$ each of the
three series below converges, then the series $\sum_{i=1}^{\infty} X_{i}$
converges almost surely.

1) $\sum_{i=1}^{\infty} E(X_i^{(c)})$

2) $\sum_{i=1}^{\infty} \text{Var}(X_i^{(c)})$

3) $\sum_{i=1}^{\infty} P(|X_i| \geq c)$

But I have tried with this theorem for a long time with no luck. I would greatly appreciate your help in solving this problem.

Best Answer

You're better off using the two-series theorem: if $Y_n$ is a sequence of independent random variables such that $\sum_{n=1}^\infty E[Y_n]$ converges and $\sum_{n=1}^\infty \operatorname{Var}(Y_n) < \infty$, then $\sum_{n=1}^\infty Y_n$ converges almost surely.

In our case, take $Y_n = \frac{X_n}{n^\gamma}$; these are independent because the $X_n$ are. We have $E[Y_n] = 0$, so the mean series trivially converges. Moreover, since $E[X_n] = 0$ gives $\operatorname{Var}(X_n) = E[X_n^2] = V$, we get $\operatorname{Var}(Y_n) = \frac{V}{n^{2\gamma}}$. But since $\gamma > \frac 12$ we have $2\gamma > 1$, so $\sum_{n=1}^\infty \frac{V}{n^{2\gamma}} < \infty$ by comparison with a convergent $p$-series. Therefore the variance series is also finite.
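Spelled out in one display (a worked restatement of the two checks above, using only $\operatorname{Var}(aX) = a^2\operatorname{Var}(X)$ and the $p$-series test):

$$
\operatorname{Var}(Y_n)
  = \operatorname{Var}\!\left(\frac{X_n}{n^{\gamma}}\right)
  = \frac{\operatorname{Var}(X_n)}{n^{2\gamma}}
  = \frac{E[X_n^2] - \big(E[X_n]\big)^2}{n^{2\gamma}}
  = \frac{V}{n^{2\gamma}},
\qquad
\sum_{n=1}^{\infty} \frac{V}{n^{2\gamma}}
  = V \sum_{n=1}^{\infty} \frac{1}{n^{2\gamma}} < \infty
\quad \text{since } 2\gamma > 1.
$$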

Now the result follows. Note that the variables being identically distributed was not required; independence, mean zero, and the uniform variance bound suffice. Also, Kolmogorov's two-series theorem is usually discussed before the three-series theorem, which is the one you referred to in your question.
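As a sanity check (not part of the proof), here is a minimal simulation sketch; the choices of standard normal $X_i$ (so $V = 1$) and $\gamma = 0.75$ are mine, just to have a concrete instance. Each sampled path of partial sums should visibly settle toward a limit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate partial sums of sum_{i>=1} X_i / i^gamma for i.i.d. mean-zero
# X_i with finite variance. Here: standard normal X_i (E[X_i] = 0, V = 1)
# and gamma = 0.75 > 1/2 -- both assumptions for illustration only.
n = 100_000
gamma = 0.75
x = rng.standard_normal(n)                      # X_1, ..., X_n
terms = x / np.arange(1, n + 1) ** gamma        # X_i / i^gamma
partial_sums = np.cumsum(terms)                 # S_k = sum_{i<=k} X_i / i^gamma

# The tail of the partial-sum sequence stabilizes along this sample path,
# consistent with almost-sure convergence of the series.
print(partial_sums[[999, 9_999, 99_999]])       # S_1000, S_10000, S_100000
```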
