[Math] convergence of the expectation of a sum of infinitely many random variables

convergence-divergence, expectation, independence, measure-theory, probability-theory

If $X_n$ are independent random variables such that $\sum \mathbb{E} X_n$ exists and $\sum X_n$ converges a.s. (almost surely), must it be that $\mathbb{E} \sum X_n = \sum \mathbb{E} X_n$?

If $X_n \ge 0$, then this is immediate from the monotone convergence theorem, since the partial sums $\sum_{n=1}^{N} X_n$ are non-decreasing in $N$ (spelled out below the list). Also, since the $X_n$ are independent and $\sum X_n$ converges a.s., Kolmogorov's three-series theorem gives us the following three facts:

1) $\sum_n \mathbb{P}(|X_n| > 1) < \infty$;

2) $\sum_n \mathbb{E} Y_n$ converges, where $Y_n = X_n \mathbb{1}_{\{|X_n| \le 1\}}$;

3) $\sum_n \operatorname{Var}(Y_n) < \infty$.
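Spelled out, the monotone convergence step for the non-negative case reads (writing $S_N = \sum_{n=1}^{N} X_n$, my notation):

$$\mathbb{E}\Big[\sum_{n=1}^{\infty} X_n\Big] = \mathbb{E}\Big[\lim_{N \to \infty} S_N\Big] = \lim_{N \to \infty} \mathbb{E}[S_N] = \lim_{N \to \infty} \sum_{n=1}^{N} \mathbb{E} X_n = \sum_{n=1}^{\infty} \mathbb{E} X_n,$$

where the second equality is the monotone convergence theorem applied to the non-decreasing sequence $S_N \ge 0$.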

I am stuck on this problem and don't know how to proceed. Any help appreciated.

Best Answer

It holds if $E[|\sum X_i|] < \infty$ or $E[\sum |X_i|] < \infty$.
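For the second condition, a quick sketch (no independence needed): with dominating function $G = \sum_n |X_n|$, we have $|\sum_{n=1}^{N} X_n| \le G$ for every $N$ and $E[G] < \infty$, so dominated convergence gives

$$E\Big[\sum_{n=1}^{\infty} X_n\Big] = \lim_{N \to \infty} E\Big[\sum_{n=1}^{N} X_n\Big] = \lim_{N \to \infty} \sum_{n=1}^{N} E[X_n] = \sum_{n=1}^{\infty} E[X_n].$$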


Without such an integrability condition, the identity can fail to even make sense. Consider $X_1, X_2, \ldots$ on $(\Omega, \mathscr F, \mathbb P) = ([0,1], \mathscr B([0,1]), \lambda)$, where

$$X_n = 2^n 1_{A_n} - 2^{n} 1_{B_n} + 0 \cdot 1_{A_n^c \cap B_n^c},$$ where $\lambda(A_n) = \lambda(B_n) = \frac{1}{2^{n+1}}$ and the sets $A_1, B_1, A_2, B_2, \ldots$ are pairwise disjoint, i.e. $A_i \cap A_j = B_i \cap B_j = \emptyset$ for $i \neq j$ and $A_i \cap B_j = \emptyset$ for all $i, j$.
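For concreteness (my choice, not forced by the argument), one can take consecutive subintervals of $[0,1)$:

$$A_1 = \big[0, \tfrac{1}{4}\big), \quad B_1 = \big[\tfrac{1}{4}, \tfrac{1}{2}\big), \quad A_2 = \big[\tfrac{1}{2}, \tfrac{5}{8}\big), \quad B_2 = \big[\tfrac{5}{8}, \tfrac{3}{4}\big), \quad \ldots$$

Each of $A_n, B_n$ has length $2^{-(n+1)}$, and the total length is $\sum_{n=1}^{\infty} 2 \cdot 2^{-(n+1)} = 1$, so these intervals exactly tile $[0,1)$.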

Since the sets are pairwise disjoint, each $\omega$ lies in at most one $A_n$ or $B_n$, so at most one term of $\sum_n X_n(\omega)$ is nonzero. Hence the series converges everywhere; in particular

$$\sum_{n=1}^{\infty} X_n < \infty \quad \lambda\text{-a.s.}$$

Moreover $E[X_n] = 2^n \lambda(A_n) - 2^n \lambda(B_n) = \tfrac{1}{2} - \tfrac{1}{2} = 0$ for every $n$, so

$$\sum_{n=1}^{\infty} E[X_n] = \sum_{n=1}^{\infty} 0 = 0.$$

But $E[\sum_{n=1}^{\infty} X_n]$ is undefined: both the positive part and the negative part of the sum have infinite expectation (each equals $\sum_n 2^n \cdot 2^{-(n+1)} = \sum_n \tfrac{1}{2} = \infty$). Indeed,

$$E\Big[\Big|\sum_{n=1}^{\infty} X_n\Big|\Big] = E\Big[\Big|\sum_{n=1}^{\infty} \big(2^n 1_{A_n} - 2^{n} 1_{B_n}\big)\Big|\Big]$$

$$= E\Big[\sum_{n=1}^{\infty} \big(2^n 1_{A_n} + 2^{n} 1_{B_n}\big)\Big],$$

where the second equality holds pointwise: the supports are disjoint, so at each $\omega$ at most one indicator is nonzero, and the absolute value of the sum equals the sum of the absolute values.

Since $2^n 1_{A_n} + 2^{n} 1_{B_n} \ge 0$, Tonelli (equivalently, monotone convergence applied to the partial sums) lets us swap the sum and the expectation:

$$= \sum_{n=1}^{\infty} E\big[2^n 1_{A_n} + 2^{n} 1_{B_n}\big] = \sum_{n=1}^{\infty} 2^n \cdot \frac{2}{2^{n+1}} = \sum_{n=1}^{\infty} 1 = \infty.$$
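As a numeric sanity check, here is a small sketch in Python (using the concrete interval construction above; `partial_sums` is my helper name, with exact arithmetic via `fractions.Fraction`) showing that $\sum_{n \le N} E[X_n]$ stays at $0$ while $E[|\sum_{n \le N} X_n|] = N$ grows without bound:

```python
from fractions import Fraction

def partial_sums(N):
    """Exact values of sum_{n<=N} E[X_n] and E[|sum_{n<=N} X_n|]
    for the counterexample, using lambda(A_n) = lambda(B_n) = 2^-(n+1).
    Disjoint supports mean |sum X_n| = sum (2^n 1_{A_n} + 2^n 1_{B_n})."""
    sum_of_expectations = Fraction(0)
    expectation_of_abs_sum = Fraction(0)
    for n in range(1, N + 1):
        p = Fraction(1, 2 ** (n + 1))                    # lambda(A_n) = lambda(B_n)
        sum_of_expectations += 2**n * p - 2**n * p       # E[X_n] = 0 for every n
        expectation_of_abs_sum += 2**n * p + 2**n * p    # each n contributes exactly 1
    return sum_of_expectations, expectation_of_abs_sum

for N in (1, 10, 100):
    s, a = partial_sums(N)
    print(f"N={N}: sum of E[X_n] = {s}, E[|S_N|] = {a}")
# N=1:   sum of E[X_n] = 0, E[|S_N|] = 1
# N=10:  sum of E[X_n] = 0, E[|S_N|] = 10
# N=100: sum of E[X_n] = 0, E[|S_N|] = 100
```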


I think the argument above is unaffected by whether or not the random variables are independent. (The $X_n$ in this example are in fact not independent, since their supports are disjoint.)