Example of i.d. random variables s.t. $\frac{X_n}{n} \not\rightarrow 0 ~~~a.s.$

convergence-divergence, examples-counterexamples, probability-distributions, probability-theory

I am looking for a counterexample to the following statement:

Let $(X_n)_{n \in \mathbb{N}}$ be a sequence of identically distributed random variables. Then

$$ \frac{X_n}{n} \rightarrow 0 ~~~a.s.$$


The statement is true when the $(X_n)_{n \in \mathbb{N}}$ are i.i.d. and all $X_n \in L^1$.
Also, the statement is true for convergence in probability since

$$\forall \varepsilon > 0: \quad \mathbb{P}\left(\left\vert \frac{X_n}{n} \right\vert > \varepsilon\right) = \mathbb{P}(\vert X_n \vert > \varepsilon n) = \mathbb{P}(\vert X_1 \vert > \varepsilon n) \rightarrow 0,$$

where the second equality uses that the $X_n$ are identically distributed.
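(For completeness, a remark that is not in the original question: the final limit needs no integrability, only that $X_1$ is real-valued, because the events $\{\vert X_1 \vert > \varepsilon n\}$ decrease to $\{\vert X_1 \vert = \infty\}$, so by continuity from above of the probability measure

$$\mathbb{P}(\vert X_1 \vert > \varepsilon n) \longrightarrow \mathbb{P}(\vert X_1 \vert = \infty) = 0.)$$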

My first try was to modify the usual example of a sequence that converges in probability but not almost surely (e.g., the typewriter sequence). However, that construction forces the random variables to have different distributions.

Any tips or pointers would be greatly appreciated.

Best Answer

Consider an i.i.d. sequence $\{X_n\}$. Since $$ \sum_{n\ge 1}\mathsf{P}(|X_1|>n)\ge \int_{1}^{\infty} \mathsf{P}(|X_1|>x)\, dx \ge \mathsf{E}|X_1|-1, $$ and the events $\{|X_n|>n\}$, $n\ge 1$, are independent, the second Borel–Cantelli lemma implies that $$ \mathsf{P}(|X_n|>n\text{ i.o.})=1 $$ whenever $\mathsf{E}|X_1|=\infty$. On that event $\limsup_{n} |X_n|/n \ge 1$, so $X_n/n$ does not converge to $0$ almost surely. Hence any i.i.d. sequence with $\mathsf{E}|X_1|=\infty$, for instance standard Cauchy variables, gives the desired counterexample.
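As a quick illustration (not part of the original answer), here is a minimal simulation sketch in Python/NumPy. It assumes standard Cauchy variables as a convenient infinite-mean example and simply counts, along one sample path, how often $|X_n|/n$ exceeds $1$; the sample size `N = 10**6` and the seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N = 10**6
# i.i.d. standard Cauchy draws: E|X_1| = infinity, so by the Borel-Cantelli
# argument above, |X_n| > n should keep recurring no matter how far we look.
x = rng.standard_cauchy(N)
n = np.arange(1, N + 1)

exceed = np.flatnonzero(np.abs(x) > n) + 1  # indices n with |X_n|/n > 1
print("exceedances up to N =", N, ":", exceed.size)
print("largest such n observed:", exceed[-1] if exceed.size else "none")
```

A finite simulation can only illustrate, not prove, the "infinitely often" behaviour, but along a typical path the exceedances keep reappearing at ever larger $n$, which is exactly what the lemma predicts.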
