[Math] Convergence of sum of random variables

convergence-divergence, probability-theory, random-variables

Let $X_n$, $n\geq 0$, be i.i.d. random variables such that $\mathbb E(X_1)=0$ and $0<\mathbb E(|X_1|^2)<\infty$. Given that $\alpha >\frac{1}{2}$, I need to show that $$S_n=\sum_{k=1}^{n}\frac{X_k}{k^\alpha}$$

converges almost surely.

Can I have a hint for how to do this?


To show that $$S_n=\sum_{k=1}^{n}\frac{X_k}{k^\alpha}$$

converges almost surely, we use Kolmogorov's Three-Series Theorem, which reduces the problem to verifying the following three conditions (a sketch of checking them for this series is given after the list):

a.) For some $A>0$, $\displaystyle \sum_{k=1}^{\infty}\mathbb P\left(\left|\frac{X_k}{k^{\alpha}}\right|\geq A\right)$ converges;

b.) with $\displaystyle Y_{k}:=\frac{X_k}{k^{\alpha}}1_{\{\left|\frac{X_{k}}{k^{\alpha}}\right|\leq A\}}$, the series $\displaystyle \sum_{k=1}^{\infty}\mathbb E[Y_{k}]$ converges;

c.) $\displaystyle \sum_{k=1}^{\infty}\mathrm{Var}(Y_{k})$ converges.
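One way to verify these conditions for this particular series (a sketch, using Chebyshev's inequality and the elementary bound $|x|\,1_{\{|x|>A\}}\leq x^{2}/A$): for (a),
$$\mathbb P\left(\left|\frac{X_k}{k^{\alpha}}\right|\geq A\right)=\mathbb P\left(|X_k|\geq A k^{\alpha}\right)\leq \frac{\mathbb E(|X_1|^{2})}{A^{2}k^{2\alpha}};$$
for (b), since $\mathbb E\left(\frac{X_k}{k^{\alpha}}\right)=0$,
$$\left|\mathbb E[Y_{k}]\right|=\left|\mathbb E\left[\frac{X_k}{k^{\alpha}}1_{\{|X_k/k^{\alpha}|>A\}}\right]\right|\leq \frac{\mathbb E(|X_1|^{2})}{A\,k^{2\alpha}};$$
for (c),
$$\mathrm{Var}(Y_{k})\leq \mathbb E[Y_{k}^{2}]\leq \frac{\mathbb E(|X_1|^{2})}{k^{2\alpha}}.$$
Each bound is a constant multiple of $k^{-2\alpha}$ with $2\alpha>1$, so all three series converge.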

Best Answer

The following theorem gives a more direct route.

Kolmogorov's One-Series Theorem: suppose $(X_n)$ is a sequence of independent real random variables. If $\sum \mathrm{Var}(X_n)<\infty$, then $\sum \left(X_n-\mathbb E(X_n)\right)$ converges almost surely.

In your case the summands $\frac{X_k}{k^{\alpha}}$ are independent with mean zero, so it suffices to check that $$\sum \mathrm{Var}\left(\frac{X_k}{k^{\alpha}}\right)=\sum \frac{\mathrm{Var}(X_k)}{k^{2\alpha}}=\sum \frac{\mathrm{Var}(X_1)}{k^{2\alpha}}<\infty.$$ Since $\mathbb E(X_1)=0$, we have $\mathrm{Var}(X_1)=\mathbb E(|X_1|^2)<\infty$, and because $2\alpha >1$ the series converges. Hence $S_n$ converges almost surely.
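As an informal numerical illustration, here is a minimal simulation sketch, assuming the $X_k$ are i.i.d. standard normal and taking $\alpha=0.6$ (these specific choices are only for demonstration). Each sampled path of the partial sums should settle near a finite value as $n$ grows:

```python
import numpy as np

# Simulation sketch (illustration only): assume i.i.d. standard normal X_k,
# so E(X_k) = 0 and Var(X_k) = 1 < infinity, and take alpha = 0.6 > 1/2.
# For each sampled path, the partial sums S_n = sum_{k<=n} X_k / k^alpha
# should stabilise around a finite limit as n grows.

rng = np.random.default_rng(seed=0)
alpha = 0.6
n = 100_000
k = np.arange(1, n + 1)

for path in range(3):
    X = rng.standard_normal(n)      # i.i.d. X_1, ..., X_n
    S = np.cumsum(X / k**alpha)     # partial sums S_1, ..., S_n
    print(f"path {path}: S_1e3={S[999]:+.4f}  S_1e4={S[9999]:+.4f}  S_1e5={S[-1]:+.4f}")
```

Different seeds give different limiting values, as expected: the almost-sure limit of $S_n$ is itself a random variable.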
