Upper bound on the expectation of the min-max of Gaussian random variables

Tags: expected value, normal distribution, order statistics, probability, statistics

Define a random $n \times m$ matrix $X$ whose entries are independent standard Gaussians, i.e., $X_{ij} \sim N(0,1)$ i.i.d. for all $i,j$. My objective is to find an upper bound (as a function of $m$ and $n$) for $$\mathbb{E}\left[\min_{1 \leq j \leq m} \max_{1 \leq i \leq n}X_{ij} \right]$$

I know that the expectation of the maximum of $n$ standard Gaussians has an upper bound of order $\sqrt{\log n}$, and in numerical simulations I can see that when $m=n$ this min-max expectation diverges to infinity as well, but I am struggling to find a sharp upper bound.
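For reference, the divergence at $m=n$ can be reproduced with a quick Monte Carlo sketch (the function name and trial count are my own choices, not from any library):

```python
import numpy as np

rng = np.random.default_rng(0)

def min_max_expectation(n, m, trials=200):
    """Monte Carlo estimate of E[min_j max_i X_ij] for an n x m matrix
    of i.i.d. N(0,1) entries."""
    vals = np.empty(trials)
    for t in range(trials):
        X = rng.standard_normal((n, m))
        # max over rows i within each column j, then min over the m column maxima
        vals[t] = X.max(axis=0).min()
    return vals.mean()

for n in (10, 100, 1000):
    est = min_max_expectation(n, n)
    print(f"n = m = {n:4d}: estimate = {est:.3f}, sqrt(log n) = {np.log(n) ** 0.5:.3f}")
```

The estimates grow slowly with $n$, consistent with a $\sqrt{\log n}$-type rate, though a simulation of course cannot distinguish the exact order.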

Best Answer

Probably the most trivial upper bound is that

$$\mathsf E \min_{j\le m}\max_{i\le n} X_{ij}\lesssim\sqrt{\log n}.$$

In fact, this is the best you can hope for as long as $m$ is not too large compared to $n$ (in particular, this holds if $m=n$). The reason is simply that the maximum of $n$ Gaussians is strongly concentrated around its expectation. More quantitatively, note that for small $\varepsilon>0$,

$$P\left(\max_{i\le n} X_{i1}\le \sqrt{\varepsilon\log n}\right)\lesssim (1-n^{-\varepsilon/2})^n\sim e^{-n^{1-\varepsilon/2}}.$$
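As a sanity check, the left-hand side can be computed exactly as $\Phi(\sqrt{\varepsilon\log n})^n$, where $\Phi$ is the standard normal CDF. A sketch using only the standard library (the choice $\varepsilon=0.5$ is purely illustrative, and the constants in the exponent of the approximation above are not exact, only the qualitative super-exponential decay):

```python
import math

def gaussian_cdf(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def p_max_below(n, eps):
    """Exact P(max of n i.i.d. N(0,1) variables <= sqrt(eps * log n)) = Phi(t)^n."""
    t = math.sqrt(eps * math.log(n))
    return gaussian_cdf(t) ** n

for n in (10, 100, 1000):
    print(f"n = {n:4d}: P(max <= sqrt(0.5 log n)) = {p_max_below(n, 0.5):.3e}")
```

Already at $n=1000$ this probability is astronomically small, which is what drives the union bound below.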

So imagine, for example, that $m\le e^{n^{0.99}}$ and take a small enough constant $\varepsilon>0$, so that $n^{1-\varepsilon/2}\gg n^{0.99}$. By a union bound, with probability $\approx 1-m\,e^{-n^{1-\varepsilon/2}}=1-o(1)$, it holds that $\max_{i\le n} X_{ij}>\sqrt{\varepsilon\log n}$ for all $j\in[m]$. This is more than enough to ensure that $\mathsf E \min_{j\le m}\max_{i\le n}X_{ij}\gtrsim\sqrt{\log n}$.
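Since the $m$ columns are independent, the probability that every column maximum clears the threshold even has a closed form, $(1-\Phi(t)^n)^m$, so the union bound can be checked exactly. A sketch (again with the illustrative choice $\varepsilon=0.5$; the helper name is mine):

```python
import math

def p_all_columns_clear(n, m, eps):
    """Exact P(max_i X_ij > sqrt(eps * log n) for every column j),
    using independence of the m columns: (1 - Phi(t)^n)^m."""
    t = math.sqrt(eps * math.log(n))
    phi_t = 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    return (1.0 - phi_t ** n) ** m

for n in (100, 1000):
    p = p_all_columns_clear(n, n, 0.5)
    print(f"n = m = {n:4d}: P(all column maxima > sqrt(0.5 log n)) = {p:.6f}")
```

With $m=n$ this probability tends to $1$ as $n$ grows, matching the high-probability statement above.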


Of course, we should not expect this upper bound to stay tight when $m$ is crazily large compared to $n$, but that already seems to cover the regime you are interested in. Note that I was not very rigorous with the asymptotics here, but I'm sure everything can be made formal.