Probability – Expectation of the Maximum of Gaussian Random Variables

Tags: normal-distribution, order-statistics, probability

Is there an exact or good approximate expression for the expectation, variance, or other moments of the maximum of $n$ independent, identically distributed Gaussian random variables when $n$ is large?

If $F$ is the cumulative distribution function for a standard Gaussian and $f$ is the probability density function, then the CDF for the maximum is (from the study of order statistics) given by

$$F_{\rm max}(x) = F(x)^n$$

and the PDF is

$$f_{\rm max}(x) = n F(x)^{n-1} f(x)$$

so it's certainly possible to write down integrals which evaluate to the expectation and other moments, but it's not pretty. My intuition tells me that the expectation of the maximum would be proportional to $\log n$, although I don't see how to go about proving this.
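
For what it's worth, those integrals are straightforward to evaluate numerically. Here is a minimal sketch (assuming NumPy and SciPy are available; the choice $n = 1000$ and the sample count are illustrative) that computes $\mathbb{E}[\max_i X_i] = \int x \, n F(x)^{n-1} f(x)\, dx$ by quadrature and checks it against a Monte Carlo estimate:

```python
import numpy as np
from scipy import integrate, stats

# Illustrative sketch: expectation of the maximum of n i.i.d. standard
# Gaussians, via the order-statistic density f_max(x) = n F(x)^(n-1) f(x).
n = 1000  # illustrative choice

def f_max(x):
    # density of the maximum of n i.i.d. standard normals
    return n * stats.norm.cdf(x) ** (n - 1) * stats.norm.pdf(x)

# expectation by numerical quadrature (the integrand is negligible
# outside a moderate range, so finite limits suffice)
mean_quad, _ = integrate.quad(lambda x: x * f_max(x), -10, 10)

# Monte Carlo sanity check
rng = np.random.default_rng(0)
mc = rng.standard_normal((20_000, n)).max(axis=1).mean()

print(f"quadrature: {mean_quad:.4f}  Monte Carlo: {mc:.4f}")
```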

Best Answer

How precise an answer are you looking for? Giving (upper) bounds on the maximum of i.i.d. Gaussians is easier than precisely characterizing its moments. Here is one way to go about this (another would be to combine a tail bound on Gaussian RVs with a union bound).

Let $X_i$ for $i = 1,\ldots,n$ be i.i.d. $\mathcal{N}(0,\sigma^2)$.

Define $$ Z = \max_{i} X_i. $$

By Jensen's inequality, for any $t > 0$,

$$\exp \{t\mathbb{E}[ Z] \} \leq \mathbb{E} \exp \{tZ\} = \mathbb{E} \max_i \exp \{tX_i\} \leq \sum_{i = 1}^n \mathbb{E} [\exp \{tX_i\}] = n \exp \{t^2 \sigma^2/2 \}$$

where the middle inequality bounds the maximum of nonnegative quantities by their sum, and the last equality follows from the Gaussian moment generating function, $\mathbb{E}[\exp\{tX_i\}] = \exp\{t^2 \sigma^2/2\}$ (a bound for sub-Gaussian random variables also follows by this same argument).
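
As a quick sanity check, here is a small Monte Carlo sketch of the chain $\exp\{t\mathbb{E}[Z]\} \leq \mathbb{E}\exp\{tZ\} \leq n \exp\{t^2\sigma^2/2\}$ (assuming NumPy; the values of $n$, $\sigma$, and $t$ below are purely illustrative):

```python
import numpy as np

# Illustrative Monte Carlo check of
#   exp(t E[Z]) <= E exp(tZ) <= n exp(t^2 sigma^2 / 2);
# the parameter choices below are hypothetical.
rng = np.random.default_rng(1)
n, sigma, t = 100, 1.0, 1.5
X = rng.normal(0.0, sigma, size=(100_000, n))
Z = X.max(axis=1)

lhs = np.exp(t * Z.mean())             # exp(t E[Z]), smallest by Jensen
mid = np.exp(t * Z).mean()             # E exp(tZ)
rhs = n * np.exp(t**2 * sigma**2 / 2)  # sum of Gaussian MGFs

print(f"{lhs:.1f} <= {mid:.1f} <= {rhs:.1f}")
```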

Taking logarithms and dividing by $t$, this gives

$$\mathbb{E}[Z] \leq \frac{\log n}{t} + \frac{t \sigma^2}{2} $$

Now set $t = \frac{\sqrt{2 \log n}}{\sigma}$, the value that minimizes the right-hand side, to get

$$\mathbb{E}[Z] \leq \sigma \sqrt{ 2 \log n} $$
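
A minimal simulation comparing this bound against simulated maxima (assuming NumPy, with $\sigma = 1$; the replicate count is illustrative):

```python
import numpy as np

# Illustrative check of E[Z] <= sigma * sqrt(2 log n), with sigma = 1.
rng = np.random.default_rng(2)
for n in (10, 100, 1_000, 10_000):
    Z = rng.standard_normal((1_000, n)).max(axis=1)  # 1,000 replicates
    bound = np.sqrt(2 * np.log(n))
    print(f"n = {n:>6}:  mean of Z = {Z.mean():.3f},  bound = {bound:.3f}")
```

The simulated means sit visibly below the bound, but the bound is tight to leading order: $\mathbb{E}[Z] / (\sigma \sqrt{2 \log n}) \to 1$ as $n \to \infty$, with the gap coming from lower-order correction terms.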
