[Math] Almost Sure convergence of sum of independent random variables

convergence-divergence, independence, probability-theory

Let $\{X_j\}_{j=1}^{\infty}$ be independent random variables such that $\sum E(|X_j|) < \infty$. How can I show that $\sum X_j$ converges almost surely?
Can I argue simply that for every $\epsilon > 0$ there exists $N$ such that for all $j, k > N$, $E(|X_j - X_k|) < \epsilon$, and then proceed exactly as in

how to show that convergence in probability implies convergence a.s. in this case?

Best Answer

Here is a simple proof. By the monotone convergence theorem (the partial sums $\sum_{j \le n} |X_j|$ are nonnegative and increasing), $$ \sum_j E|X_j| = E \big[ \sum_{j} |X_j| \big]. $$ It follows from the assumption that $E \big[ \sum_j |X_j| \big] < \infty$. Any random variable with finite expectation is finite almost surely, so $\sum_j |X_j| < \infty$ almost surely. Since absolute convergence of a series implies convergence, $\sum_j X_j$ converges almost surely.
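
To spell out the step that finite expectation forces almost sure finiteness (a short sketch of my own, using only Markov's inequality), apply Markov's inequality to the nonnegative random variable $Y = \sum_j |X_j|$: for every $t > 0$, $$ P\Big( \sum_j |X_j| > t \Big) \le \frac{E \big[ \sum_j |X_j| \big]}{t}. $$ Since $\big\{ \sum_j |X_j| = \infty \big\} \subseteq \big\{ \sum_j |X_j| > t \big\}$ for every $t$, letting $t \to \infty$ gives $P\big( \sum_j |X_j| = \infty \big) = 0$, i.e. $\sum_j |X_j| < \infty$ almost surely.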