Show that $\frac1n\max\limits_{1\le i \le n } X_i\to0$ almost surely, with no independence assumption

Tags: convergence-divergence, probability-theory, self-learning

This is self-study; I encountered this problem on a past examination paper.

Let $X_1,X_2,\dots$ be a sequence of identically distributed random variables with $E|X_1| < \infty$, and let $Y_n = \frac1n\max_{1 \le i \le n} X_i$. Show that $Y_n \overset{a.s.}{\to} 0$.

This problem would be fairly straightforward in the i.i.d. case: one can write down the distribution of the sample maximum and use the first Borel–Cantelli lemma to show that the convergence holds almost surely.

Because this is an examination question, I suspect one hint is to use Markov's inequality, given the $E|X_1| < \infty$ assumption.

Any clues are greatly appreciated.

Best Answer

Here is an easy argument.

Wlog assume $X_n\geq 0$ (else replace $X_n$ with $|X_n|$ throughout the proof below).
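A quick justification for this reduction, assuming the non-negative case is proved: since $\max_{1\le i\le n} X_i \le \max_{1\le i\le n}|X_i|$ and $\max_{1\le i\le n} X_i \ge X_1 \ge -|X_1| \ge -\max_{1\le i\le n}|X_i|$, one has $$\Big|\max_{1\le i\le n} X_i\Big| \le \max_{1\le i\le n} |X_i|,$$ so $\frac1n\max_{1\le i\le n}|X_i| \to 0$ a.s. forces $\frac1n\max_{1\le i\le n} X_i \to 0$ a.s.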

First note that $X_n/n \to 0$ almost surely, by Borel–Cantelli and the fact that for any $\epsilon>0$ one has $$\sum_n \Bbb P(X_n\geq\epsilon n) = \sum_n \Bbb P(X_1\geq \epsilon n) =\Bbb E\bigg[ \sum_n 1_{\{X_1\geq\epsilon n\}} \bigg] \leq 1+\epsilon^{-1}\Bbb E|X_1|.$$
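To spell out the counting bound (with the sums taken over $n \ge 1$ and $X_1 \ge 0$ as assumed): $$\sum_{n\ge 1} 1_{\{X_1\geq\epsilon n\}} = \#\{n\ge 1 : \epsilon n \le X_1\} = \lfloor X_1/\epsilon \rfloor \le \epsilon^{-1}X_1,$$ and taking expectations (Tonelli) gives $\Bbb E\big[\sum_{n\ge 1} 1_{\{X_1\geq\epsilon n\}}\big] \le \epsilon^{-1}\Bbb E|X_1| \le 1+\epsilon^{-1}\Bbb E|X_1| < \infty$. The first Borel–Cantelli lemma (which needs no independence) then gives $\Bbb P(X_n \ge \epsilon n \text{ i.o.}) = 0$; intersecting over $\epsilon = 1/k$, $k\in\Bbb N$, yields $X_n/n \to 0$ almost surely.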

Now if $x_n$ is any (deterministic) sequence of non-negative real numbers such that $x_n/n \to 0$, then $\frac1n \max_{1\leq i \leq n} x_i \to 0$. Indeed, this can be proved by noting that for any $N \leq n$ one has $\frac1n \max_{1\leq i \leq n} x_i \leq \frac{1}{n} \max_{1\leq i \leq N} x_i +\max_{N < i \leq n} \frac{x_i}{i}$. From here it's an elementary real-analysis argument; a sketch of the remaining step is below.
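For completeness, a sketch of that argument: fix $\epsilon > 0$ and choose $N$ with $x_i/i < \epsilon$ for all $i > N$. For $n > N$ the displayed inequality gives $$\frac1n \max_{1\leq i \leq n} x_i \;\le\; \frac1n \max_{1\leq i \leq N} x_i + \epsilon,$$ and the first term on the right tends to $0$ as $n \to \infty$ because the numerator is a fixed finite number. Hence $\limsup_n \frac1n\max_{1\le i\le n} x_i \le \epsilon$ for every $\epsilon > 0$, so the limit is $0$. Applying this pathwise with $x_i = X_i(\omega)$ on the almost-sure event where $X_n/n \to 0$ proves $Y_n \to 0$ almost surely.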

Independence is not needed.