[Math] Expected maximum of sub-Gaussian

Tags: gauss-sums, probability, probability-theory, random-variables

I'm trying to answer the following question from the book High-Dimensional Probability:

Let $X_1,X_2,\dots$ be a sequence of sub-Gaussian random variables, which are not necessarily independent. Show that

$E\bigg[ \max_i \frac{|X_i|}{\sqrt{1 + \log i}} \bigg] \le CK$,

where $K = \max_i \|X_i\|_{\psi_2}$. Deduce that for every $N \ge 2$ we have

$E\bigg[ \max_{i \le N} |X_i| \bigg] \le CK \sqrt{\log N}$.
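To get some intuition for the claim, I ran a quick Monte Carlo check (a sketch assuming i.i.d. standard Gaussians, for which $K$ is an absolute constant; `expected_weighted_max` is just a helper name I made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_weighted_max(n, trials=500):
    # Monte Carlo estimate of E[ max_{i<=n} |X_i| / sqrt(1 + log i) ]
    # for i.i.d. standard Gaussians.
    i = np.arange(1, n + 1)
    weights = np.sqrt(1.0 + np.log(i))
    samples = np.abs(rng.standard_normal((trials, n)))
    return (samples / weights).max(axis=1).mean()

# The weighted maximum stays bounded as n grows, while the
# unweighted maximum grows like sqrt(2 log n).
for n in (10, 100, 1000, 10000):
    print(n, expected_weighted_max(n))
```

The printed estimates stay in a narrow band as $n$ increases by three orders of magnitude, which is what the claimed bound predicts.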

I've tried to work out the distribution of the maximum of Gaussians, but I'm only reaching inequalities that don't help me answer the question.

I've also seen a similar question here.

Does anyone have a clue or something to start with in order to answer this question?

Thanks!

Best Answer

You can use this idea as a start (it is actually more than a start!). Without loss of generality, assume $K = 1$: replace each $X_i$ by $X_i/K$ and multiply the final bound by $K$ at the end. With this normalization, the sub-Gaussian tail bound reads $\mathbb{P}(|X_i| > u) \le 2\exp(-cu^2)$ for an absolute constant $c > 0$. Set $A := \sqrt{2/c}$, so that $ct^2 \ge 2$ whenever $t \ge A$. Then

\begin{eqnarray} \mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} &=& \int_0^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt\\ &\leq& \int_0^A 1 \, dt + \int_A^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& A + \int_A^\infty \sum_{i=1}^\infty \mathbb{P}\left( |X_i| > t\sqrt{1+\log i} \right) dt \\ &\leq& A + \int_A^\infty \sum_{i=1}^\infty 2 \exp\big(-ct^2(1+\log i)\big)\, dt\\ &=& A + 2\int_A^\infty \exp(-ct^2) \sum_{i=1}^\infty i^{-ct^2}\, dt \\ &\leq& A + 2\left(\sum_{i=1}^\infty i^{-2}\right) \int_A^\infty \exp(-ct^2)\, dt \;<\; \infty, \end{eqnarray}

where we bounded the probability by $1$ on $[0,A]$, used a union bound, then the sub-Gaussian tail, and finally $i^{-ct^2} \le i^{-2}$ for $t \ge A$. The sum $\sum_{i=1}^\infty i^{-2} = \pi^2/6$ is convergent and the Gaussian integral is finite, so the right-hand side is an absolute constant $C$. Undoing the normalization gives $\mathbb{E}\max_i |X_i|/\sqrt{1+\log i} \le CK$.

I chose $A = \sqrt{2/c}$ as the splitting point to make the sum convergent (any larger point works as well).

For the second claim, note that for every $N \ge 2$ and $i \le N$ we have $\sqrt{1+\log i} \le \sqrt{1+\log N} \le 2\sqrt{\log N}$ (since $\log N \ge \log 2 \ge 1/3$), hence

$\mathbb{E}\max_{i \le N} |X_i| \le \sqrt{1+\log N}\; \mathbb{E}\max_{i \le N} \frac{|X_i|}{\sqrt{1+\log i}} \le 2CK\sqrt{\log N}.$
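As a sanity check on the deduced bound, here is a rough Monte Carlo sketch (again assuming i.i.d. standard Gaussians; the helper name `max_over_sqrt_log` is mine): the ratio $\mathbb{E}\max_{i\le N}|X_i| / \sqrt{\log N}$ should remain bounded as $N$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def max_over_sqrt_log(n, trials=500):
    # Monte Carlo estimate of E[ max_{i<=n} |X_i| ] / sqrt(log n)
    # for i.i.d. standard Gaussians (n >= 2).
    samples = np.abs(rng.standard_normal((trials, n)))
    return samples.max(axis=1).mean() / np.sqrt(np.log(n))

# The ratio stays bounded as n grows, matching the deduced
# bound E max_{i<=N} |X_i| <= C K sqrt(log N).
for n in (10, 100, 1000, 10000):
    print(n, max_over_sqrt_log(n))
```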