What can be concluded about the probability: $\mathbb{P}(\sum\limits_{n\ge1}\frac{X_n}{n} \quad\text{converges})$

probability, probability distributions, probability theory, self-learning

Let $\{X_n\}_{n\ge1}$ be a sequence of i.i.d. random variables with $\mathbb{P}(X_n=1)=\mathbb{P}(X_n=-1)=\frac{1}{2}$. What can be concluded about the probability:
\begin{align}
\mathbb{P}(\sum\limits_{n\ge1}\frac{X_n}{n} \quad\text{converges})?
\end{align}

I believe I have a solution that doesn't use the fact that they are identically distributed at all, so I am wondering whether the following is correct.

Note that since $\{X_n\}_{n\ge1}$ is an independent sequence of random variables, $\{\frac{X_n}{n}\}_{n\ge1}$ is also an independent sequence of random variables. Now note that:
\begin{align}
A:&=\Big\{\sum\limits_{n\ge1}\frac{X_n}{n} \quad\text{converges}\Big\}\\
&=\Big\{\sum\limits_{n\ge k}\frac{X_n}{n} \quad\text{converges}\Big\} \quad\text{for all}\ k\ge 1\\
&\in \sigma\Big(\frac{X_k}{k},\frac{X_{k+1}}{k+1},\frac{X_{k+2}}{k+2},\dots\Big) \quad\text{for all}\ k\ge 1
\end{align}

Hence, $A \in \bigcap\limits_{k\ge1}\sigma\big(\frac{X_k}{k},\frac{X_{k+1}}{k+1},\frac{X_{k+2}}{k+2},\dots\big)=\tau$, the tail $\sigma$-algebra, and therefore by Kolmogorov's 0-1 law: $\mathbb{P}(A)=0\ \text{or}\ 1$.

Am I missing something, or is this solution correct? If it is, I don't see a need to even mention the distribution of the $X_n$'s.

Best Answer

With the help of the comments, I will finish the solution below:

Note that for any $c>1$ (writing $c$ for the truncation constant, since $A$ already denotes the event above):

$(i)\quad \sum\limits_{n\ge1}\mathbb{P}\big(\frac{|X_n|}{n}>c\big)\le\sum\limits_{n\ge1}\mathbb{P}(|X_n|>c)=\sum\limits_{n\ge1}0=0<\infty$, since $|X_n|=1<c$ almost surely.

$(ii)\quad \sum\limits_{n\ge1}\mathbb{E}\big(\frac{X_n}{n}\mathbb{1}_{\{|\frac{X_n}{n}|\le c\}}\big)=\sum\limits_{n\ge1}\mathbb{E}\big[\frac{X_n}{n}\big]=\sum\limits_{n\ge1}0=0<\infty$, because $\big|\frac{X_n}{n}\big|\le 1<c$, so the indicator equals $1$ almost surely.

$(iii)\quad \text{Note that}: \mathbb{E}\frac{X_n^2}{n^2}=\big(\frac{-1}{n}\big)^2\big(\frac{1}{2}\big)+\big(\frac{1}{n}\big)^2\big(\frac{1}{2}\big)=\frac{2}{2n^2}=\frac{1}{n^2}$

Thus, $\sum\limits_{n\ge1}\text{Var}\big(\frac{X_n}{n}\big)=\sum\limits_{n\ge1}\big(\mathbb{E}\frac{X_n^2}{n^2}-\big(\mathbb{E}\frac{X_n}{n}\big)^2\big)=\sum\limits_{n\ge1}\frac{1}{n^2}<\infty$
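As a purely numerical aside (not needed for the proof), one can check the three series above directly. The snippet below is a minimal Python/NumPy sketch assuming the truncation level $c=2$; the variable names and the cutoff of $10^6$ terms are just illustrative choices.

```python
import numpy as np

# Hypothetical numeric check of the three series, assuming truncation level c = 2.
# Series (i) and (ii) vanish term by term; series (iii) is sum 1/n^2 -> pi^2/6.
n = np.arange(1, 1_000_001)

tail_probs = np.zeros_like(n, dtype=float)    # P(|X_n|/n > c) = 0, since |X_n| = 1 < c
trunc_means = np.zeros_like(n, dtype=float)   # E[(X_n/n) 1{|X_n/n| <= c}] = E[X_n]/n = 0
variances = 1.0 / n.astype(float) ** 2        # Var(X_n/n) = 1/n^2

print(tail_probs.sum(), trunc_means.sum())    # both 0.0
print(variances.sum(), np.pi ** 2 / 6)        # partial sum is already close to pi^2/6
```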

And therefore, by Kolmogorov's Three-Series Theorem, we can in fact say that: \begin{align} \mathbb{P}\big(\sum\limits_{n\ge1}\frac{X_n}{n} \quad \text{converges} \big)=1 \end{align}
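Not part of the argument, but a quick simulation makes the conclusion plausible. The sketch below (a hypothetical Python/NumPy illustration; the path count, horizon `N`, and seed are arbitrary choices) draws a few independent sample paths of the partial sums $S_N=\sum_{n\le N}\frac{X_n}{n}$ and shows that each path changes very little between $N/10$ and $N$ terms, consistent with $\mathbb{P}(A)=1$.

```python
import numpy as np

# Illustrative simulation (not a proof): sample paths of S_N = sum_{n<=N} X_n / n
# with i.i.d. Rademacher signs X_n, P(X_n = 1) = P(X_n = -1) = 1/2.
rng = np.random.default_rng(0)

N = 100_000   # number of terms per path
paths = 5     # number of independent sample paths

for p in range(paths):
    x = rng.choice([-1.0, 1.0], size=N)              # i.i.d. signs
    partial = np.cumsum(x / np.arange(1, N + 1))     # partial sums S_1, ..., S_N
    # The tail of the series beyond N/10 terms is typically tiny,
    # as almost-sure convergence suggests.
    print(f"path {p}: S_{N//10} = {partial[N//10 - 1]:+.4f}, "
          f"S_{N} = {partial[-1]:+.4f}, "
          f"tail change = {partial[-1] - partial[N//10 - 1]:+.5f}")
```

Each run settles near some random limit, and the limits differ across paths; that is exactly what almost-sure convergence of the series (rather than convergence to a fixed constant) predicts.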