[Math] Showing almost sure divergence

measure-theory, probability-theory

I'm doing some exercises as preparation for an upcoming exam, but I'm stuck on this one:

Assume that $X_1,X_2,\dots$ is an i.i.d. sequence such that $X_1 \sim \mathcal{N}(\xi, \sigma^2)$, with $\xi>0$. Define
$$
S_n=\sum_{k=1}^n \frac{X_k}{k}
$$
Show that $S_n \to \infty$ almost surely as $n$ tends to infinity.

I'm not aware of any theorem that makes this quick to show, but I'm trying something in this direction:
$$
S_n \stackrel{a.s.}{\to} \infty \iff \forall \varepsilon>0 : P(S_n>\varepsilon \ \text{eventually})=1 \iff \forall \varepsilon>0 : P(S_n \leq \varepsilon \ \text{i.o.})=0
$$
$$
\Leftarrow \quad \forall \varepsilon>0 : \sum_{n=1}^\infty P(S_n \leq \varepsilon) <\infty
$$
And from here I don't know where to go, because I have no closed-form expression for the above probabilities, or even any idea whether the series actually converges.

Any tips/tricks/solutions are welcome.

Best Answer

Hint: $X_k/k \sim \mathcal{N}(\xi/k, \sigma^2/k^2)$. Then, $S_n \sim \mathcal{N}(\xi\sum_{k=1}^n\frac{1}{k},\sigma^2 \sum_{k=1}^n \frac{1}{k^2})$. The mean grows to infinity, but the variance is at most $\sigma^2 \frac{\pi^2}{6}$.
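A possible way to turn this hint into a full argument, connecting to the Borel–Cantelli approach in the question (a sketch; it uses the standard Gaussian tail bound $\Phi(-x) \leq e^{-x^2/2}$ for $x \geq 0$): write $\mu_n = \xi\sum_{k=1}^n \frac{1}{k}$ and $\sigma_n^2 = \sigma^2\sum_{k=1}^n \frac{1}{k^2} \leq \sigma^2\frac{\pi^2}{6}$. For fixed $\varepsilon>0$ and $n$ large enough that $\xi\log n > \varepsilon$,
$$
P(S_n \leq \varepsilon) = \Phi\!\left(\frac{\varepsilon-\mu_n}{\sigma_n}\right) \leq \exp\!\left(-\frac{(\mu_n-\varepsilon)^2}{2\sigma_n^2}\right) \leq \exp\!\left(-\frac{(\xi\log n-\varepsilon)^2}{\sigma^2\pi^2/3}\right),
$$
using $\mu_n \geq \xi\log n$. The right-hand side decays faster than any power of $n$, so $\sum_n P(S_n \leq \varepsilon) < \infty$, and Borel–Cantelli gives $P(S_n \leq \varepsilon \ \text{i.o.}) = 0$ for every $\varepsilon > 0$, i.e. $S_n \to \infty$ almost surely.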
