[Math] Proving almost sure convergence

Tags: almost-everywhere, convergence-divergence, independence, law-of-large-numbers, probability-theory

Assume the random variables $X_1, X_2, \dots$ are i.i.d. with finite mean and finite variance. Define a random variable:

\begin{align}
Y_n = \frac{X_n}{n}
\end{align}
Show that $Y_n \to 0$ almost surely.

To converge to some value almost surely implies:
\begin{align}
\mathbb{P}\left( \lim_{n \to \infty}{Y_n} = 0\right) = 1
\end{align}
By the way $Y_n$ is defined, this is equivalent to:
\begin{align}
\mathbb{P}\left( \lim_{n\to \infty}{\frac{X_n}{n}} = 0\right) =1
\end{align}

But the limit of $\frac{1}{n}$ as $n \to \infty$ is "obviously" $0$, and intuitively I would think that if the expectation of the numerator is finite, then the numerator takes some finite value. As $n$ runs off to $\infty$, it would look like a finite number (it doesn't really matter which number, so long as it is finite) divided by a number growing without bound. So I would think $\lim_{n\to \infty} \frac{X_n}{n}$ behaves like $\lim_{n \to \infty} \frac{1}{n}$.

EDIT: but as pointed out in the comments, $X_n$ is a different random variable for each $n$, and the sequence $X_1, X_2, \dots$ need not converge at all, so this intuition is not a proof.
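A quick numerical illustration of the two points above (not a proof), taking standard normal $X_n$ as one assumed example of an i.i.d. sequence with finite mean and variance: the $X_n$ themselves keep fluctuating and never settle down, yet $Y_n = X_n / n$ is driven to $0$ by the growing denominator.

```python
import numpy as np

# Simulate X_1, ..., X_N i.i.d. standard normal (an assumed example
# distribution with finite mean and variance).
rng = np.random.default_rng(0)
n = np.arange(1, 100_001)
x = rng.standard_normal(n.size)
y = x / n  # Y_n = X_n / n

# X_n does not converge: its spread stays around 1 for all n ...
print(np.abs(x[-1000:]).max())
# ... but Y_n is squeezed toward 0 as n grows.
print(np.abs(y[-1000:]).max())
```

The sample path of $X_n$ has roughly constant spread throughout, while $|Y_n|$ over the last thousand indices is on the order of $10^{-5}$.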

Thank you.

Best Answer

$$\mathbb{E}\left(\sum_{n=1}^\infty Y_n^2\right)<\infty\Rightarrow \mathbb{P}\left(\sum_{n=1}^\infty Y_n^2<\infty\right)=1\Rightarrow \mathbb{P}\left(Y_n\to 0\right)=1.$$
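Filling in why the hypothesis of the first implication holds: writing $\mu = \mathbb{E}[X_1]$ and $\sigma^2 = \operatorname{Var}(X_1)$, both finite by assumption,
\begin{align}
\mathbb{E}\left(\sum_{n=1}^\infty Y_n^2\right) = \sum_{n=1}^\infty \frac{\mathbb{E}\left[X_n^2\right]}{n^2} = \left(\sigma^2 + \mu^2\right)\sum_{n=1}^\infty \frac{1}{n^2} < \infty,
\end{align}
where the interchange of expectation and sum is justified by Tonelli's theorem (the terms are nonnegative). The first implication then follows because a nonnegative random variable with finite expectation is finite almost surely, and the second because the terms of a convergent series must tend to $0$, so $Y_n^2 \to 0$ and hence $Y_n \to 0$.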
