Define $S_n:=\sum_{j=1}^nX_j$ and let $\varphi_n$ be the characteristic function of $S_n$. Since the $X_j$ are independent with $X_j\sim\mathcal N(\mu_j,\sigma_j^2)$, we have
$$\varphi_n(t)=\exp\left(it\sum_{j=1}^n\mu_j\right)\cdot\exp\left(-\frac{t^2}2\sum_{j=1}^n\sigma_j^2\right).$$
Since $S_n$ converges almost surely, it converges in distribution, so $(\varphi_n(t))_{n\geqslant 1}$ converges for each $t$ to $\varphi(t)$, where $\varphi$ is a continuous function (the characteristic function of the limiting distribution).
If $\sum_{j\geqslant 1}\sigma_j^2$ were divergent, then $|\varphi_n(t)|\to 0$ for every $t\neq 0$, so $\varphi(t)=0$ for $t\neq 0$ while $\varphi(0)=1$, contradicting the continuity of $\varphi$ at $0$. Hence $\sum_{j\geqslant 1}\sigma_j^2<\infty$. Define $s_n:=\sum_{j=1}^n\mu_j$. For each $t$ the sequence $(e^{its_n})_{n\geqslant 1}$ converges, and a standard argument shows that convergence of $e^{its_n}$ for every $t$ forces $(s_n)_{n\geqslant 1}$ to converge.
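As a quick numerical sanity check of this dichotomy (a sketch under assumed variance sequences of our choosing, $\sigma_j^2=1/j^2$ versus $\sigma_j^2=1/j$; these are not part of the problem), note that the modulus $|\varphi_n(t)|=\exp\bigl(-\tfrac{t^2}{2}\sum_{j\le n}\sigma_j^2\bigr)$ is deterministic, so we can evaluate it directly:

```python
import math

def abs_cf(t, sigma2s):
    # |phi_n(t)| = exp(-t^2/2 * sum of variances); the mean term has modulus 1
    return math.exp(-t**2 / 2 * sum(sigma2s))

n = 10_000
convergent = [1 / j**2 for j in range(1, n + 1)]  # sum -> pi^2/6 < infinity
divergent  = [1 / j    for j in range(1, n + 1)]  # partial sums -> infinity

# Convergent variances: |phi_n(1)| stays bounded away from 0 (limit exp(-pi^2/12)).
# Divergent variances: |phi_n(1)| decays to 0, so the limit cannot be a
# characteristic function continuous at 0.
print(abs_cf(1.0, convergent))
print(abs_cf(1.0, divergent))
```

The first value stabilizes near $e^{-\pi^2/12}\approx 0.44$, while the second shrinks toward $0$ as $n$ grows, matching the contradiction argument above.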
The interchange $E\left[\sum X_i\right] = \sum E[X_i]$ holds if $E\left[\left|\sum X_i\right|\right] < \infty$ or $E\left[\sum |X_i|\right] < \infty$.
Consider $X_1, X_2, \ldots$ on $(\Omega, \mathscr F, \mathbb P) = ([0,1], \mathscr B([0,1]), \lambda)$ where
$$X_n = 2^n \mathbf 1_{A_n} - 2^{n} \mathbf 1_{B_n},$$ with $\lambda(A_n) = \lambda(B_n) = \frac{1}{2^{n+1}}$ and the sets $A_1, B_1, A_2, B_2, \ldots$ pairwise disjoint.
We have:
$$\sum_{n=1}^{\infty} X_n < \infty \ \lambda\text{-a.s.}$$
(at each $\omega$, at most one term of the series is nonzero, since the sets are pairwise disjoint)
$$\sum_{n=1}^{\infty} E[X_n] = \sum_{n=1}^{\infty} 0 = 0$$
But $E\left[\sum_{n=1}^{\infty} X_n\right]$ is undefined, because
$$E \left[\left| \sum_{n=1}^{\infty} X_n \right|\right]$$
$$= E \left[\left|\sum_{n=1}^{\infty} \left(2^n \mathbf 1_{A_n} - 2^{n} \mathbf 1_{B_n}\right)\right|\right] $$
$$= E \left[\sum_{n=1}^{\infty} \left(|2^n| \mathbf 1_{A_n} + |{-2^{n}}| \mathbf 1_{B_n}\right)\right] \quad \text{(at most one term is nonzero at each } \omega\text{)} $$
$$= E \left[\sum_{n=1}^{\infty} \left(2^n \mathbf 1_{A_n} + 2^{n} \mathbf 1_{B_n}\right)\right] $$
Note that $2^n \mathbf 1_{A_n} + 2^{n} \mathbf 1_{B_n} \ge 0$, so by Tonelli's theorem:
$$= \sum_{n=1}^{\infty} E \left[2^n \mathbf 1_{A_n} + 2^{n} \mathbf 1_{B_n}\right] = \sum_{n=1}^{\infty} 2^n \left(\lambda(A_n) + \lambda(B_n)\right) = \sum_{n=1}^{\infty} 1 = \infty.$$
Note that this counterexample does not rely on the random variables being independent.
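The bookkeeping in the counterexample can be checked exactly with rational arithmetic. The helper names below are ours; the only inputs from the construction are the measures $\lambda(A_n)=\lambda(B_n)=2^{-(n+1)}$:

```python
from fractions import Fraction

def mean_Xn(n):
    # E[X_n] = 2^n * lam(A_n) - 2^n * lam(B_n), with lam(A_n) = lam(B_n) = 2^{-(n+1)}
    lam = Fraction(1, 2 ** (n + 1))
    return 2 ** n * lam - 2 ** n * lam   # exactly 0 for every n

def mean_abs_Xn(n):
    # E[|X_n|] = 2^n * (lam(A_n) + lam(B_n)) = 2^n * 2^{-n} = 1
    lam = Fraction(1, 2 ** (n + 1))
    return 2 ** n * lam + 2 ** n * lam   # exactly 1 for every n

# Sum of E[X_n] is 0, while partial sums of E[|X_n|] grow without bound.
print(sum(mean_Xn(n) for n in range(1, 51)))      # 0
print(sum(mean_abs_Xn(n) for n in range(1, 51)))  # 50
```

Using `Fraction` avoids any floating-point rounding: each $E[X_n]$ is exactly $0$ and each $E[|X_n|]$ is exactly $1$, so the partial sums of $E[|X_n|]$ grow linearly, confirming $E\left[\left|\sum X_n\right|\right]=\infty$.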
Best Answer
We consider $Y_{m}:=\sum_{n=1}^{m}(X_{n}-\mu_{n})$ and note that $\left\{Y_{m}\right\}_{m\geq 1}$ is a martingale with respect to $\mathcal{F}_{m}:=\sigma(X_{1},\ldots,X_{m})$. Indeed, using the independence of $X_{m+1}$ from $\mathcal{F}_{m}$, we have that \begin{align} \mathbb{E}[Y_{m+1}\mid\mathcal{F}_{m}]&=\mathbb{E}[Y_{m}+(X_{m+1}-\mu_{m+1})\mid\mathcal{F}_{m}]\\ &=\mathbb{E}[Y_{m}\mid\mathcal{F}_{m}]+\mathbb{E}[X_{m+1}\mid\mathcal{F}_{m}]-\mathbb{E}[\mu_{m+1}\mid\mathcal{F}_{m}]\\ &=Y_{m}+\mathbb{E}[X_{m+1}]-\mu_{m+1}\\ &=Y_{m}+\mu_{m+1}-\mu_{m+1}\\ &=Y_{m}. \end{align}
We also have that $\left\{Y_{m}\right\}_{m\geq 1}$ is bounded in $L^{2}$. To prove this we use the fact that martingale increments are orthogonal in $L^2$, so $\mathbb{E}[Y_{m}^{2}]=\sum_{k=1}^{m}\mathbb{E}[(Y_{k}-Y_{k-1})^{2}]$ (with the convention $Y_{0}:=0$).
In that sense, note that \begin{align} \sum_{m=1}^{\infty}\mathbb{E}[(Y_{m}-Y_{m-1})^{2}] &=\sum_{m=1}^{\infty}\mathbb{E}[(X_{m}-\mu_{m})^{2}] \\ &=\sum_{m=1}^{\infty}\sigma_{m}^{2} <\infty. \end{align} So $\left\{Y_{m}\right\}_{m\geq 1}$ is bounded in $L^{2}$, hence bounded in $L^{1}$, and by the martingale convergence theorem $\lim_{m\rightarrow\infty}Y_{m}$ exists a.s.; that is, $\sum_{n\geq 1}(X_{n}-\mu_{n})$ converges a.s. (Since bounded in $L^{2}$ implies uniformly integrable, the convergence also holds in $L^{1}$.) If moreover $\sum_{n\geq 1}\mu_{n}$ converges, it follows that $\sum_{n\geq 1}X_{n}$ converges a.s.
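The $L^2$ bound $\mathbb E[Y_m^2]=\sum_{n\le m}\sigma_n^2$ can be sketched by Monte Carlo. Everything below is an illustration with parameters of our choosing ($\mu_n=0$, $\sigma_n=1/n$, so $\sum\sigma_n^2=\pi^2/6<\infty$), not part of the problem:

```python
import math
import random

random.seed(0)

m, trials = 500, 2000
sigmas = [1 / n for n in range(1, m + 1)]  # sigma_n = 1/n, sum of squares < pi^2/6

# Monte Carlo estimate of E[Y_m^2], where Y_m = sum_{n<=m} (X_n - mu_n)
# and here mu_n = 0, X_n ~ N(0, sigma_n^2) independent.
second_moment = 0.0
for _ in range(trials):
    y = sum(random.gauss(0, s) for s in sigmas)
    second_moment += y * y
second_moment /= trials

# Orthogonality of increments predicts E[Y_m^2] = sum of sigma_n^2.
theory = sum(s * s for s in sigmas)
print(second_moment, theory)
```

The empirical second moment agrees with $\sum_{n\le m}\sigma_n^2\approx 1.64$ up to sampling error, and stays below the uniform bound $\pi^2/6$ no matter how large $m$ is taken, which is exactly the $L^2$-boundedness used above.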