Almost Sure Convergence – Maximum of Sequence of Random Variables

Tags: convergence-divergence, probability theory

Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables drawn from a distribution $F$ with exponential tails, and denote $Y_n = \max (X_1, \dots , X_n)$. How can we prove the following:
$$ \lim_{n \rightarrow \infty} \frac{Y_n}{\log n} = c $$
almost surely for some constant $c$.

Also, how can we determine the value of $c$? For example, what is $c$ when $F$ is a Gamma distribution (or another common distribution)?

This result seems standard, as indicated in the question linked here, but I could not find a proof.

Best Answer

For a standard exponential, we know that $Z_n=Y_n-\log(n)$ converges in distribution to a nondegenerate limit (a Gumbel). This implies that $\frac{Z_n}{\log(n)}$ converges in distribution to the constant $0$, and convergence in distribution to a constant implies convergence in probability, so $\frac{Y_n}{\log(n)} \to 1$ in probability. Upgrading this to almost sure convergence takes one more step, typically a Borel-Cantelli argument.
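Concretely, here is a sketch of that Borel-Cantelli step, assuming $X_i \sim \operatorname{Exp}(1)$ so that $P(X_1 > x) = e^{-x}$. For any $\epsilon > 0$,
$$ \sum_{n \ge 1} P\big(X_n > (1+\epsilon)\log n\big) = \sum_{n \ge 1} n^{-(1+\epsilon)} < \infty, $$
so by Borel-Cantelli only finitely many $X_n$ exceed $(1+\epsilon)\log n$, which forces $\limsup_n \frac{Y_n}{\log n} \le 1+\epsilon$ almost surely. In the other direction, by independence
$$ P\big(Y_n \le (1-\epsilon)\log n\big) = \big(1 - n^{-(1-\epsilon)}\big)^n \le e^{-n^{\epsilon}}, $$
which is also summable, so $\liminf_n \frac{Y_n}{\log n} \ge 1-\epsilon$ almost surely. Letting $\epsilon \downarrow 0$ gives $\frac{Y_n}{\log n} \to 1$ almost surely.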

So the almost-sure limit here comes out of extreme value theory combined with a Borel-Cantelli argument as above. In general, for a distribution with an unbounded upper tail that decays faster than any power law (the Gumbel domain of attraction), $\frac{Y_n-b_n}{a_n}$ converges in distribution to a Gumbel for suitable normalizing sequences $a_n$ and $b_n$. For a Gamma distribution, whose tail is exponential up to a polynomial factor, $\sim x^{\alpha-1}e^{-x/\theta}$, one can work out that $a_n=\theta$ and that $b_n$ to leading order in $n$ is $\theta\log(n)$. So for a Gamma with PDF $\frac{1}{\Gamma(\alpha) \theta^\alpha}x^{\alpha-1}e^{-x/\theta}$, $\frac{Y_n}{\log(n)}$ converges almost surely to $\theta$.
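As a quick numerical sanity check (a simulation sketch, not part of the proof), the following Python snippet tracks the running maximum of Gamma draws and prints $Y_n/\log n$ at a few values of $n$. The shape $\alpha = 3$ and scale $\theta = 2$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (arbitrary) parameters: shape alpha, scale theta.
alpha, theta = 3.0, 2.0

# Draw a long i.i.d. Gamma sequence and form the running maximum Y_n.
n_total = 10_000_000
samples = rng.gamma(shape=alpha, scale=theta, size=n_total)
running_max = np.maximum.accumulate(samples)

# Y_n / log(n) should drift toward theta (= 2.0 here) as n grows.
for n in (10**3, 10**4, 10**5, 10**6, 10**7):
    print(f"n = {n:>8}: Y_n / log(n) = {running_max[n - 1] / np.log(n):.3f}")
```

Note that the printed ratios approach $\theta$ only logarithmically: $b_n$ carries a $\theta(\alpha-1)\log\log n$ correction, so the ratio overshoots $\theta$ and decays very slowly relative to $\log n$.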
