$\sum_{n=1}^{\infty} X_n$ converges almost surely

Tags: convergence-divergence, probability-theory, random-variables, self-learning

Question:

Let $\{ X_n,\ n \in \mathbb{N}\}$ be a sequence of real-valued random variables, and suppose that $E(X_n) = 3^{-n}$ and $\operatorname{Var}(X_n) = 2^{-n}$. Prove that the series $\sum_{n=1}^{\infty} X_n$ converges almost surely (a.s.).

My solution:

By Chebyshev's inequality,

$P(|X_n - 3^{-n}| > \varepsilon) \le 2^{-n}/\varepsilon^2$ for every $\varepsilon > 0$.

Define $A_n = \{ \omega : |X_n(\omega) - 3^{-n}| > \varepsilon\}$, so that

$\lim_{n\to\infty} P(A_n) = \lim_{n\to\infty} P(|X_n - 3^{-n}| > \varepsilon) \le \lim_{n\to\infty} 2^{-n}/\varepsilon^2 = 0$.

i.e. $\lim_{n\to\infty} P(A_n) \le 0$.

Since a probability lies between $0$ and $1$, $\lim_{n\to\infty} P(A_n) = 0$.


At this point, I am stuck.

$\sum P(A_n) = 0$. Then,

$\sum |X_n - 3^{-n}| = 0$

….

And I cannot continue to find the convergence a.s.
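As a sanity check before the proof, one can simulate the partial sums. The problem fixes only $E(X_n) = 3^{-n}$ and $\operatorname{Var}(X_n) = 2^{-n}$, not the distributions, so this sketch assumes independent $X_n \sim \mathcal{N}(3^{-n}, 2^{-n})$ purely for illustration:

```python
import numpy as np

# Hedged sketch: the problem specifies only the means and variances,
# not the distributions. Assume independent X_n ~ Normal(3^{-n}, 2^{-n})
# for illustration and watch the partial sums S_N = X_1 + ... + X_N
# settle down across many sample paths.
rng = np.random.default_rng(42)
paths, N = 1000, 60
n = np.arange(1, N + 1)
X = rng.normal(3.0**-n, np.sqrt(2.0**-n), size=(paths, N))
S = np.cumsum(X, axis=1)  # partial sums along each path

# The tail increment |S_60 - S_30| should be tiny on every path, since
# the remaining terms' total standard deviation sum_{n>30} 2^{-n/2} is
# on the order of 2^{-15}.
tail_spread = np.abs(S[:, -1] - S[:, 29]).max()
print(tail_spread)
```

On every simulated path the partial sums stabilize quickly, which is consistent with (but of course does not prove) almost-sure convergence.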

Best Answer

$E X_n^{2} = \operatorname{Var}(X_n) + (E X_n)^{2} = \frac 1{2^{n}}+\left(\frac 1 {3^{n}}\right)^{2}$. Using the inequality $E|X_n| \leq \sqrt {E X_n^{2}}$, you can easily check that $\sum_n E|X_n| <\infty$. But this implies that $E\left(\sum_n |X_n|\right) <\infty$ (by monotone convergence, expectation and the infinite sum can be interchanged for nonnegative terms), so $\sum_n |X_n| <\infty$ almost surely. Hence, the series $\sum_n X_n$ converges absolutely with probability $1$.
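The "easily check" step can be written out. Since $\sqrt{a+b} \le \sqrt{a} + \sqrt{b}$ for $a, b \ge 0$, the bound splits into two geometric series:

$$E|X_n| \le \sqrt{E X_n^{2}} = \sqrt{2^{-n} + 9^{-n}} \le 2^{-n/2} + 3^{-n},$$

so

$$\sum_{n \ge 1} E|X_n| \le \sum_{n \ge 1} \left(\tfrac{1}{\sqrt{2}}\right)^{n} + \sum_{n \ge 1} \tfrac{1}{3^{n}} = \frac{1}{\sqrt{2}-1} + \frac{1}{2} < \infty.$$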