[Math] Tail bounds for maximum of sub-Gaussian random variables

concentration-of-measure, moment-generating-functions, normal-distribution, probability-theory, statistics

I have a question similar to this one, but am considering sub-Gaussian random variables instead of Gaussian. Let $X_1,\ldots,X_n$ be centered $1$-sub-Gaussian random variables (i.e. $\mathbb{E} e^{\lambda X_i} \le e^{\lambda^2 /2}$ for all $\lambda \in \mathbb{R}$), not necessarily independent. I am familiar with the bound $\mathbb{E} \max_i |X_i| \le \sqrt{2 \log (2n)}$, but am looking for an outline of a tail bound for the maximum.
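(For context, the expectation bound can be recovered from the MGF condition — a standard sketch: by Jensen's inequality, for any $\lambda > 0$,
$$\mathbb{E} \max_i |X_i| \le \frac{1}{\lambda} \log \mathbb{E}\, e^{\lambda \max_i |X_i|} \le \frac{1}{\lambda} \log \sum_i \left( \mathbb{E}\, e^{\lambda X_i} + \mathbb{E}\, e^{-\lambda X_i} \right) \le \frac{\log(2n)}{\lambda} + \frac{\lambda}{2},$$
and taking $\lambda = \sqrt{2\log(2n)}$ gives the stated bound.)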

A union bound would give
$$\mathbb{P}(\max_i |X_i| > t) \le \sum_i \mathbb{P}(|X_i| > t) \le 2n e^{-t^2/2},$$
but I am looking for a proof of something of the form
$$\mathbb{P}(\max_i |X_i| > \sqrt{2 \log (2n)} + t)
\le \mathbb{P}(\max_i |X_i| > \mathbb{E} \max_i |X_i| + t)
\le 2e^{-t^2/2}.$$
Does anyone have any hints?
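(For completeness: the per-variable tail $\mathbb{P}(|X_i| > t) \le 2e^{-t^2/2}$ used in the union bound above is the usual Chernoff step — for $t \ge 0$,
$$\mathbb{P}(X_i > t) \le e^{-\lambda t}\, \mathbb{E}\, e^{\lambda X_i} \le e^{-\lambda t + \lambda^2/2} = e^{-t^2/2} \quad \text{with } \lambda = t,$$
together with the same bound applied to $-X_i$.)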

Best Answer

I'm not sure, but I think this works. Applying the union bound directly gives, for $t \ge 0$,
\begin{align}
\mathbb{P}(\max_i |X_i| > \sqrt{2\log(2n)} + t)
&\le 2n \exp\left( -\left(\sqrt{2\log(2n)}+t\right)^2/2 \right)\\
&= 2n \exp\left(-\log (2n) - t\sqrt{2 \log(2n)} - t^2/2\right)\\
&= e^{-t\sqrt{2 \log(2n)}}\, e^{-t^2/2}\\
&\le e^{-t^2/2},
\end{align}
where the last step uses $t \ge 0$. This is tighter than the bound given in the post I linked in my question, so perhaps I have made a mistake somewhere...
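As a quick Monte Carlo sanity check (not part of the proof), the sketch below uses i.i.d. standard normals, which satisfy the MGF condition in the question, and compares the empirical tail of $\max_i |X_i|$ against $e^{-t^2/2}$; the sample sizes `n` and `trials` are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo check of P(max_i |X_i| > sqrt(2*log(2n)) + t) vs exp(-t^2/2),
# using i.i.d. standard normals as one test case (the bound itself does not
# require independence).
rng = np.random.default_rng(0)
n, trials = 100, 100_000
X = rng.standard_normal((trials, n))
M = np.abs(X).max(axis=1)              # max_i |X_i| in each trial

for t in [0.0, 0.5, 1.0, 1.5, 2.0]:
    threshold = np.sqrt(2 * np.log(2 * n)) + t
    empirical = np.mean(M > threshold)
    bound = np.exp(-t**2 / 2)
    print(f"t = {t:.1f}: empirical tail {empirical:.5f}, bound {bound:.5f}")
```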
