[Math] Showing that the series of random variables $X_n / n$ converges almost surely under some conditions

probability, probability-theory

I'm working on a problem from Chow and Teicher's book on Probability Theory, page 123, #6(ii):

If $X_n, n\geq 1$, are i.i.d. $\mathcal{L}_1$ r.v.s, then $\sum (X_n / n)$ converges a.c. if $E|X_1|\log^+ |X_1| <\infty$ and $EX_1 = 0$.

The most relevant theorem I've been considering says that if $X_n$ are i.i.d. and $\mathcal{L}_p$ for $0<p<2$, then $\sum \left(\dfrac{X_n}{n^{1/p}} - E\left(\dfrac{X_nI_{\{|X_n|\leq n^{1/p}\}}}{n^{1/p}}\right)\right)$ converges a.c. Since $p = 1$ here, it would suffice to show that the (deterministic) series $\sum E\left(\dfrac{X_nI_{\{|X_n|\leq n\}}}{n}\right)$ converges to complete the exercise. However, I'm having trouble incorporating the condition $E|X_1|\log^+ |X_1| <\infty$. I see how $EX_1 = 0$ implies that the summands satisfy $E\left(\dfrac{X_nI_{\{|X_n|\leq n\}}}{n}\right) = -E\left(\dfrac{X_nI_{\{|X_n|> n\}}}{n}\right)$, but so far I've been unable to get anything resembling a logarithmic series by manipulating these summands.
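To spell out the reduction I have in mind (assuming I'm applying the theorem correctly), since the $X_n$ are identically distributed,

\begin{align} E\left(\dfrac{X_nI_{\{|X_n|\leq n\}}}{n}\right) = -E\left(\dfrac{X_nI_{\{|X_n|> n\}}}{n}\right) = -\frac{1}{n}\,E\left(X_1I_{\{|X_1|> n\}}\right), \end{align}

so it would be enough to show that $\sum_{n\geq 1} \dfrac{1}{n}\,E\left(|X_1|I_{\{|X_1|> n\}}\right) < \infty$, which would give absolute convergence of the series of expectations.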

Perhaps I'm missing something obvious? Any help is greatly appreciated.

Best Answer

Actually, there is a more elementary proof. We can indeed conclude once we show the convergence of $\sum_{n\geqslant 1}n^{-1}\mathbb E[|X_1|\chi_{\{|X_1|\geqslant n\}}]$. Define $A_k:=\{k\leqslant |X_1|< k+1\}$ for $k\geqslant 1$. Then,

\begin{align} \sum_{n\geqslant 1}n^{-1}\mathbb E[|X_1|\chi_{\{|X_1|\geqslant n\}}]&= \sum_{n\geqslant 1}n^{-1}\sum_{k\geqslant n}\mathbb E[|X_1|\chi_{A_k}]\\ &=\sum_{k\geqslant 1}\mathbb E[|X_1|\chi_{A_k}]\sum_{n=1}^kn^{-1}\\ &\leqslant \sum_{k\geqslant 1}(1+\log k)\,\mathbb E[|X_1|\chi_{A_k}]\\ &\leqslant \sum_{k\geqslant 1}\mathbb E\left[\left(|X_1|+|X_1|\log^+|X_1|\right)\chi_{A_k}\right]\\ &\leqslant \mathbb E|X_1|+\mathbb E\left[|X_1|\log^+|X_1|\right], \end{align} where the second equality interchanges the order of summation (all terms are nonnegative), the first inequality uses $\sum_{n=1}^kn^{-1}\leqslant 1+\log k$, and the second uses $\log k\leqslant \log^+|X_1|$ on $A_k$ for $k\geqslant 1$. The right-hand side is finite because $X_1\in\mathcal L_1$ and $|X_1|\log^+|X_1|$ is integrable.
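To tie this back to the reduction sketched in the question (as I read it): since $EX_1 = 0$, $\{|X_1|> n\}\subseteq\{|X_1|\geqslant n\}$, and the $X_n$ are identically distributed,

\begin{align} \sum_{n\geqslant 1}\left|\mathbb E\left(\dfrac{X_nI_{\{|X_n|\leqslant n\}}}{n}\right)\right| = \sum_{n\geqslant 1}n^{-1}\left|\mathbb E\left[X_1\chi_{\{|X_1|> n\}}\right]\right| \leqslant \sum_{n\geqslant 1}n^{-1}\mathbb E\left[|X_1|\chi_{\{|X_1|\geqslant n\}}\right]<\infty, \end{align}

so the centering series converges absolutely, and the theorem quoted in the question then gives the a.c. convergence of $\sum (X_n/n)$.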
