Variance of maximum

Let $(X_i)_{i\leq n}$ be random variables, which may be dependent. Is it true that
\begin{align*}
\text{Var}(\max_i X_i) \leq \sum_{i=1}^n \text{Var}(X_i)?
\end{align*}

I have tried integrating the tail probability,
\begin{align*}
\text{Var}(\max_i X_i) &= E\left( \max_i X_i - E \max_i X_i \right)^2 \\
&= \int_0^\infty P\left(\left(\max_i X_i - E \max_j X_j\right)^2 > t \right) dt \\
&\leq \int_0^\infty P\left(\bigcup_{i=1}^n \left\{\left(X_i - E \max_j X_j\right)^2 > t\right\} \right) dt \\
&\leq \sum_{i=1}^n \int_0^\infty P\left(\left(X_i - E \max_j X_j\right)^2 > t\right) dt \\
&= \sum_{i=1}^n E\left(X_i - E \max_j X_j\right)^2,
\end{align*}

but this is not quite what we need: since $E\left(X_i - E\max_j X_j\right)^2 = \text{Var}(X_i) + \left(EX_i - E\max_j X_j\right)^2 \geq \text{Var}(X_i)$, this bound is weaker than the claimed one. It seems that if the $X_i$ are independent, then I shouldn't have lost anything in the union bound. Does anyone know of a counterexample or a proof?
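As a quick numerical sanity check (not a proof), one can estimate both sides by Monte Carlo for a dependent example; the equicorrelated Gaussian setup below is my own choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# A dependent example of my own choosing: equicorrelated Gaussians
# with correlation 0.5, so Var(X_i) = 1 for each i.
n, trials = 5, 200_000
cov = 0.5 * np.ones((n, n)) + 0.5 * np.eye(n)
X = rng.multivariate_normal(np.zeros(n), cov, size=trials)

var_max = X.max(axis=1).var()  # empirical Var(max_i X_i)
sum_var = X.var(axis=0).sum()  # empirical sum of Var(X_i)

print(f"Var(max X_i) ~ {var_max:.3f}, sum Var(X_i) ~ {sum_var:.3f}")
assert var_max <= sum_var  # the conjectured inequality holds in this example
```

One run of course only checks a single joint distribution; it cannot rule out a counterexample elsewhere.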

Best Answer

Your solution looks good. One can get the same bound using the Efron–Stein inequality, which assumes the $X_i$ are independent. Let $Y_k$ be an independent copy of $X_k$, independent of everything else. Then \begin{align} \operatorname{Var}(\max_i X_i) &\le \frac{1}{2}\sum_{k=1}^n E[(\max_i X_i - \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\})^2] \\ &\le \frac{1}{2} \sum_{k=1}^n E[(X_k - Y_k)^2]. \end{align} To justify the last inequality, note that

  • if $\max_i X_i > \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\}$, then $k = i^*$ (where $X_{i^*} = \max_i X_i$), and thus $X_k = \max_i X_i > \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\} \ge Y_k$;
  • if $\max_i X_i < \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\}$, then $Y_k = \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\} \ge \max_i X_i \ge X_k$.

In either case $(\max_i X_i - \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\})^2 \le (X_k - Y_k)^2$.
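Since each $Y_k$ is an independent copy of $X_k$, we have $E(X_k - Y_k) = 0$, so the Efron–Stein bound reduces exactly to the conjectured sum of variances:

```latex
\frac{1}{2}\sum_{k=1}^n E\bigl[(X_k - Y_k)^2\bigr]
  = \frac{1}{2}\sum_{k=1}^n \bigl(\operatorname{Var}(X_k) + \operatorname{Var}(Y_k)\bigr)
  = \sum_{k=1}^n \operatorname{Var}(X_k).
```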