[Math] Mean and Variance of maximum of random variables

st.statistics

Given a set of random variables $x_1, x_2, \dots, x_n$ with known means and variances $(\mu_1, \sigma_1^2), (\mu_2, \sigma_2^2), \dots, (\mu_n, \sigma_n^2)$, how can we compute the mean and variance of the distribution of the maximum of $x_1, x_2, \dots, x_n$? We may assume the $x_i$ are Gaussian.

Best Answer

For independently distributed $x_i$'s, each with cumulative distribution $$F_i(x_i)=\tfrac{1}{2}+\tfrac{1}{2}\operatorname{Erf}\left[(x_i-\mu_i)/(\sigma_i\sqrt 2)\right],$$ the cumulative distribution of the maximum is given by $$P(\max_i \,x_i<X_{\max})=\prod_{i=1}^n P(x_i<X_{\max})=\prod_{i=1}^n F_i(X_{\max}).$$

For small $n$ you can now calculate moments of $X_{\max}$ by integration, $$E(X_{\max}^p)=\int_{-\infty}^\infty x^p\,\frac{d}{dx}\left(\prod_{i=1}^n F_i(x)\right)dx.$$ There is unlikely to be a closed-form answer for arbitrary $n$; in fact, even the $n=2$ integral seems problematic (Mathematica fails to evaluate it). If you take the $\mu_i$'s and $\sigma_i$'s to be identical, progress can be made: for $n=2$ I find $$E(X_{\max})=\mu+\sigma/\sqrt\pi,\qquad \operatorname{Var}(X_{\max})=(1-1/\pi)\,\sigma^2.$$
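For concreteness, here is a minimal numerical sketch of that moment integral (my addition, not part of the original answer). It uses the density of the maximum, $f_{\max}(x)=\sum_i \phi_i(x)\prod_{j\neq i}F_j(x)$ with $\phi_i$ the normal pdf, obtained by differentiating the product of CDFs; the function name `max_moment` is made up for illustration.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def max_moment(p, mus, sigmas):
    """p-th moment of max of independent N(mu_i, sigma_i^2) variables."""
    mus, sigmas = np.asarray(mus, float), np.asarray(sigmas, float)

    def f_max(x):
        # Density of the max: d/dx of prod_i F_i(x), expanded by the
        # product rule into sum_i pdf_i(x) * prod_{j != i} cdf_j(x).
        pdfs = norm.pdf(x, mus, sigmas)
        cdfs = norm.cdf(x, mus, sigmas)
        return sum(pdfs[i] * np.prod(np.delete(cdfs, i))
                   for i in range(len(mus)))

    val, _ = quad(lambda x: x**p * f_max(x), -np.inf, np.inf)
    return val

# Check against the n = 2 identical closed form E = mu + sigma/sqrt(pi):
mu, sigma = 1.0, 2.0
m1 = max_moment(1, [mu, mu], [sigma, sigma])
m2 = max_moment(2, [mu, mu], [sigma, sigma])
print(m1, mu + sigma / np.sqrt(np.pi))         # both ~2.128
print(m2 - m1**2, (1 - 1/np.pi) * sigma**2)    # both ~2.727
```

The final two lines reproduce the closed-form mean and variance quoted above, which is a useful sanity check before trusting the routine with unequal $\mu_i$'s and $\sigma_i$'s.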

Perhaps you are satisfied with a large-$n$ approximation. For identical $\mu_i$'s and $\sigma_i$'s it is given by the Fisher–Tippett–Gnedenko theorem (see for example this MSE posting): suitably centered and rescaled, $X_{\max}$ converges to a Gumbel distribution. I have found one paper that generalizes this to arbitrary $\mu_i$'s and $\sigma_i$'s, *On the distribution of the maximum of n independent normal random variables: iid and inid cases*, but I have difficulty parsing their result (a rescaled Gumbel distribution).
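A sketch of that large-$n$ approximation in the iid case (again my addition, not from the post): I assume the standard normalizing constants for normal maxima, $b_n=\sqrt{2\ln n}-\frac{\ln\ln n+\ln 4\pi}{2\sqrt{2\ln n}}$ and $a_n=1/\sqrt{2\ln n}$, so that $X_{\max}\approx \mu+\sigma(b_n+a_n G)$ with $G$ standard Gumbel.

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant = Gumbel mean

def gumbel_approx(n, mu=0.0, sigma=1.0):
    """Approximate mean/variance of the max of n iid N(mu, sigma^2)."""
    c = np.sqrt(2 * np.log(n))
    b_n = c - (np.log(np.log(n)) + np.log(4 * np.pi)) / (2 * c)  # location
    a_n = 1 / c                                                  # scale
    mean = mu + sigma * (b_n + EULER_GAMMA * a_n)
    var = (sigma * a_n) ** 2 * np.pi**2 / 6   # Gumbel variance is pi^2/6
    return mean, var

# Compare with a Monte Carlo estimate for n = 1000 standard normals:
n = 1000
rng = np.random.default_rng(0)
samples = rng.standard_normal((10_000, n)).max(axis=1)
print(gumbel_approx(n))               # approx mean ~3.27, var ~0.12
print(samples.mean(), samples.var())  # empirical values, close but not equal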

There is more in that reference that I do not understand. They give the inequality $$\frac{1}{n}\sum_i\mu_i\leq E(X_{\max})\leq \frac{1}{n}\sum_i\mu_i+\frac{n-1}{n}\max_i\,\mu_i,$$ which contradicts the $n=2$ result given above: for $\mu_1=\mu_2=0$ the upper bound would force $E(X_{\max})\leq 0$, whereas the exact value is $\sigma/\sqrt\pi>0$.
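A quick Monte Carlo check of that contradiction (my addition): with $n=2$, $\mu_1=\mu_2=0$, $\sigma_1=\sigma_2=1$, the quoted bound would give $E(X_{\max})\le 0$, while simulation and the closed form both give $1/\sqrt\pi\approx 0.564$.

```python
import numpy as np

rng = np.random.default_rng(1)
# Max of two independent standard normals, averaged over many draws:
x = rng.standard_normal((1_000_000, 2)).max(axis=1)
print(x.mean(), 1 / np.sqrt(np.pi))  # both ~0.564, violating the bound of 0
```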
