[Math] Convergence in Distribution of the maximum of a sequence.

analysis, measure-theory, probability, probability-theory

I've come across this problem which has completely stumped me. It goes as follows:

Let $(X_n)$ be a sequence of independent and identically distributed exponential random variables with parameter $\lambda$. Let $M_n$ denote $\max\{X_1,\ldots,X_n\}$. Show there exists a random variable $Z$ such that $M_n - \frac{1}{\lambda} \log (n)$ converges in distribution to $Z$.

Now this problem seems really hard, so I tried proving convergence in probability instead, hoping that would imply convergence in distribution, but I couldn't get very far. Moreover, computing the expectations also seems non-trivial, since it is hard to integrate $M_n - \frac{1}{\lambda} \log n$ directly.

Any suggestions and ways to approach this problem would be greatly appreciated as I've been stuck on this for a long time. Should I try using the Skorohod equivalent statement?

Thanks.

Best Answer

Unfortunately we do not have convergence in probability, so that approach is not going to succeed. The basic trick is to note that $M_n\le a$ if and only if $X_i\le a$ for each $i=1,\ldots,n$ and then use independence: $$P(M_n-\tfrac1\lambda\log n\le x)=P(M_n\le\tfrac1\lambda\log n+x)=\prod_{i=1}^nP(X_i\le\tfrac1\lambda\log n+x).$$ Now, remembering that $P(X_i\le a)=1-e^{-\lambda a}$ for $a\ge 0$ (which holds here once $n\ge e^{-\lambda x}$), we find $$P(M_n-\tfrac1\lambda\log n\le x)=\left(1-e^{-\lambda(\frac1\lambda\log n+x)}\right)^n=\left(1-\frac{e^{-\lambda x}}n\right)^n\to e^{-e^{-\lambda x}}$$ as $n\to\infty$, by the standard limit $(1-\tfrac an)^n\to e^{-a}$. Hence $M_n-\tfrac1\lambda\log n\to Z$ in distribution, where $P(Z\le x)=e^{-e^{-\lambda x}}$, i.e. $Z$ follows a Gumbel distribution.
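If it helps to see the limit concretely, here is a minimal simulation sketch (not part of the original answer; the rate `lam`, the sample size `n`, and the number of trials are arbitrary choices) that compares the empirical CDF of $M_n - \frac{1}{\lambda}\log n$ with the limiting CDF $e^{-e^{-\lambda x}}$:

```python
# Numerical check: simulate M_n - (1/lambda) * log(n) and compare its
# empirical CDF with the claimed Gumbel limit F(x) = exp(-exp(-lambda * x)).
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0          # rate parameter lambda (arbitrary choice)
n = 10_000         # number of exponentials per maximum
trials = 5_000     # number of simulated maxima

# Each row is an i.i.d. Exp(lam) sample of length n; take the row-wise max
# and subtract the centering term log(n)/lambda.
samples = rng.exponential(scale=1 / lam, size=(trials, n))
shifted_max = samples.max(axis=1) - np.log(n) / lam

# Compare empirical and limiting CDFs on a small grid of x values.
xs = np.linspace(-2, 4, 7)
empirical = [(shifted_max <= x).mean() for x in xs]
limit = np.exp(-np.exp(-lam * xs))

for x, emp, lim in zip(xs, empirical, limit):
    print(f"x = {x:5.2f}   empirical = {emp:.4f}   limit = {lim:.4f}")
```

The two columns should agree to within Monte Carlo error, which is a quick sanity check on the computation above.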
