[Math] Expected value of exponential of maximum of iid exponential random variables

expectation, probability theory

Suppose that we have $X_1,\ldots,X_n$ iid random variables, each with distribution $\mathrm{Exp}(1)$. Also we have the random variable $M_n=\max(X_1,X_2,\ldots,X_n)$ (where $n$ is not random) with CDF $F_{M_n}(x)=(1-e^{-x})^n,$ $x\geq 0$. We are asked to calculate $\mathbb{E}(e^{M_n})$.
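The stated CDF follows from independence: $P(M_n\le x)=P(X_1\le x)^n=(1-e^{-x})^n$. A quick Monte Carlo sanity check of this formula (a sketch; the choices $n=3$, $x=1$, the seed, and the sample size are arbitrary):

```python
import math
import random

# Sanity check: the CDF of M_n = max(X_1, ..., X_n) for iid Exp(1)
# variables is P(M_n <= x) = (1 - e^{-x})^n.
random.seed(0)
n, x, trials = 3, 1.0, 200_000

hits = 0
for _ in range(trials):
    # max of n iid Exp(1) draws
    m = max(random.expovariate(1.0) for _ in range(n))
    if m <= x:
        hits += 1

empirical = hits / trials
theoretical = (1 - math.exp(-x)) ** n
print(empirical, theoretical)  # should agree to about two decimal places
```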

So I imagine that, because we know the distribution of $M_n$, we just have to calculate

$$\mathbb{E}(e^{M_n})=\int_0^\infty e^{\max(X_1,\ldots,X_n)}F_{M_n} \, dx$$

but how should we handle $e^{\max(X_1,\ldots,X_n)}$?

Best Answer

What you wrote as "the distribution" is actually the CDF $$ F_{M_n}(x) = (1-e^{-x})^n,$$ so you can differentiate to get the PDF $$ f_{M_n}(x) = ne^{-x}(1-e^{-x})^{n-1}.$$ So the expectation of $e^{M_n}$ is $$ E(e^{M_n}) = \int_0^\infty e^x ne^{-x}(1-e^{-x})^{n-1}dx = n\int_0^\infty (1-e^{-x})^{n-1} dx. $$ This integral diverges, since the integrand tends to $1$ as $x\to\infty$, so the expectation is infinite.
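To see the divergence concretely, one can truncate the integral at $T$ and watch the value grow roughly like $nT$ as $T$ increases, since the integrand approaches $1$. A numeric sketch (the choice $n=5$, the truncation points, and the step count are all arbitrary):

```python
import math

def truncated_integral(n, T, steps=20_000):
    """Midpoint-rule approximation of n * integral_0^T (1 - e^{-x})^(n-1) dx."""
    h = T / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += (1 - math.exp(-x)) ** (n - 1)
    return n * total * h

n = 5
for T in (10, 100, 1000):
    # grows without bound, roughly like n * T for large T
    print(T, truncated_integral(n, T))
```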

Edit

Come to think of it, there's a simpler argument. We obviously have $M_n\ge X_1$ for all $n$, so $E(e^{M_n}) \ge E(e^{X_1})$, and $$ E(e^{X_1}) = \int_0^\infty e^x e^{-x} dx = \int_0^\infty 1\, dx = \infty.$$

Edit 2

In response to a comment, this is the moment-generating function $$ M_{M_n}(s) = E(e^{sM_n})$$ evaluated at $s=1.$ Both the maximum and the underlying exponential have a finite MGF in a neighborhood of the origin; it only becomes infinite when $s\ge 1.$ For $s<1,$ substituting $u=e^{-x}$ (so $du=-e^{-x}\,dx$) gives $$ E(e^{sM_n}) = \int_0^\infty e^{sx}ne^{-x}(1-e^{-x})^{n-1}dx \\ = n\int_0^1 u^{-s}(1-u)^{n-1} du \\= nB(1-s,n) = \frac{\Gamma(1-s)\Gamma(n+1)}{\Gamma(n+1-s)},$$ which diverges like $ \frac{n}{1-s}$ as $s\to 1^-,$ since $\Gamma(1-s)\sim \frac{1}{1-s}$ and $\Gamma(n+1)/\Gamma(n+1-s)\to \Gamma(n+1)/\Gamma(n) = n$ there.
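The closed form can be checked numerically against the defining integral using only the standard library's `math.gamma` (a sketch; the values $n=5$, $s=0.5$, and the truncation/step parameters are arbitrary):

```python
import math

def mgf_formula(n, s):
    """Closed form: E[e^{s M_n}] = Gamma(1-s) Gamma(n+1) / Gamma(n+1-s), for s < 1."""
    return math.gamma(1 - s) * math.gamma(n + 1) / math.gamma(n + 1 - s)

def mgf_numeric(n, s, T=60.0, steps=200_000):
    """Midpoint-rule approximation of integral_0^T e^{sx} n e^{-x} (1-e^{-x})^(n-1) dx."""
    h = T / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += math.exp(s * x) * n * math.exp(-x) * (1 - math.exp(-x)) ** (n - 1)
    return total * h

n, s = 5, 0.5
print(mgf_formula(n, s), mgf_numeric(n, s))  # the two should agree closely
```

The truncation at $T=60$ is safe here because the tail integrand decays like $ne^{-(1-s)x}$, which is negligible well before that point when $s=0.5$.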