Probability – Complete Convergence Equivalent to Convergence a.s. Under Independence

measure-theory, probability

A sequence of random variables $X_1,X_2,\ldots$ converges completely to $X$ if $$\sum_{n=1}^{\infty} P(|X_n-X|>\epsilon)<\infty \quad\forall\epsilon > 0.$$ Show that if the $X_n$ are independent, then complete convergence is equivalent to convergence a.s.
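As a quick numerical sanity check of the definition (this setup is my own, not from the question): take $X_n$ Bernoulli with $P(X_n=1)=p_n$ and $X=0$, so for any $0<\epsilon<1$ we have $P(|X_n-X|>\epsilon)=p_n$, and complete convergence holds iff $\sum p_n<\infty$.

```python
import math

# Illustration: X_n = 1 with probability p_n, else 0, and X = 0.
# For 0 < eps < 1, P(|X_n - X| > eps) = p_n, so complete convergence
# of X_n to 0 holds exactly when sum(p_n) converges.

def tail_sum(p, N):
    """Partial sum of P(|X_n - X| > eps) up to n = N."""
    return sum(p(n) for n in range(1, N + 1))

# p_n = 1/n^2: the series converges (to pi^2/6 ~ 1.6449), so X_n -> 0 completely.
print(tail_sum(lambda n: 1 / n**2, 10**6), math.pi**2 / 6)

# p_n = 1/n: the harmonic series diverges (H_N ~ log N + 0.577), so complete
# convergence fails even though P(|X_n| > eps) = 1/n -> 0 (convergence in probability).
print(tail_sum(lambda n: 1 / n, 10**6))
```

The second case is the key distinction: convergence in probability only needs the individual tail probabilities to vanish, while complete convergence needs them summable.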

I showed that complete convergence implies convergence a.s. using the Borel-Cantelli lemma, but I'm not sure how to show the converse using independence. This is what I have so far:

WLOG, $X=0$. Then $$X_n\rightarrow_{a.s.} 0 \implies X_n\rightarrow_{p} 0,$$ so

$$\forall\epsilon >0, P(\mid X_n\mid > \epsilon)\rightarrow 0$$
$$P(\limsup_{n\rightarrow\infty} \mid X_n\mid > \epsilon)=0$$

Since the $X_n$ are independent, the events $\{|X_n| > \epsilon\}$ are independent. Can I then use Borel-Cantelli (ii) to conclude $\sum P(|X_n| >\epsilon) < \infty$?

Best Answer

Your idea is fine, and only a small step is needed to finish it. I'll write out the proof anyway.


Suppose that you have the a.s. convergence but not the complete convergence. Then there exists some $\epsilon > 0$ with $$ \sum_{n=1}^{\infty} P(|X_n-X|>\epsilon)=\infty. $$

Since the $X_n$ are independent, the events $\{|X_n-X|>\epsilon\}$ are independent, so for such $\epsilon$ the second Borel-Cantelli lemma gives $$ P(\limsup_{n\rightarrow\infty} \{|X_n-X| > \epsilon\})=1. $$

But from a.s. convergence you have $\displaystyle P(\lim_{n\rightarrow\infty} X_n=X)=1$, which means that for all $\epsilon>0$, $\displaystyle P(\limsup_{n\rightarrow\infty} \{|X_n-X| > \epsilon\})=0$ — a contradiction. Therefore $X_n$ must converge completely to $X$.
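The dichotomy driving this proof can be seen numerically (the simulation setup here is my own, purely illustrative): for independent events $A_n=\{|X_n-X|>\epsilon\}$ with $P(A_n)=p_n$, Borel-Cantelli II says infinitely many $A_n$ occur a.s. when $\sum p_n=\infty$, while Borel-Cantelli I says only finitely many occur when $\sum p_n<\infty$.

```python
import random

# Simulate independent events A_n with P(A_n) = p_n and count how many
# occur up to n = N, averaged over several independent runs. The expected
# count is sum_{n<=N} p_n, so the divergent case keeps accumulating
# exceedances (infinitely often), while the convergent case levels off.

def mean_exceedances(p, N, runs, seed=0):
    """Average number of n <= N with A_n occurring, over independent runs."""
    rng = random.Random(seed)
    total = 0
    for _ in range(runs):
        total += sum(1 for n in range(1, N + 1) if rng.random() < p(n))
    return total / runs

N, runs = 2000, 200

# Divergent case p_n = 1/n: expected count grows like log N without bound,
# matching P(limsup A_n) = 1 from Borel-Cantelli II.
print(mean_exceedances(lambda n: 1 / n, N, runs))

# Convergent case p_n = 1/n^2: expected count stays near pi^2/6 ~ 1.64,
# matching "only finitely many A_n a.s." from Borel-Cantelli I.
print(mean_exceedances(lambda n: 1 / n**2, N, runs))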

Related Question