[Math] How to obtain tail bounds for the square of a sub-Gaussian random variable

probability theory

A zero-mean sub-Gaussian random variable $Z$ satisfies $\mathbb{E} \exp(tZ) \leq \exp(t^2\sigma^2/2)$ for all $t \in \mathbb{R}$ and some constant $\sigma > 0$. This bound can be combined with the Chernoff bound to obtain a two-sided tail bound.
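Spelling out the standard Chernoff argument: for any $t > 0$, Markov's inequality applied to $\exp(tZ)$ gives

$$\mathbb{P}[Z > z] \leq e^{-tz}\,\mathbb{E}\exp(tZ) \leq \exp(-tz + t^2\sigma^2/2),$$

and minimizing over $t$ (the minimum is at $t = z/\sigma^2$) yields $\mathbb{P}[Z > z] \leq \exp(-z^2/2\sigma^2)$. The same argument applied to $-Z$ handles the other tail, giving the two-sided bound $\mathbb{P}[|Z| > z] \leq 2\exp(-z^2/2\sigma^2)$.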

I am interested in obtaining exponential tail bounds on $\mathbb{P}[Z^2 - \mathbb{E}Z^2 > z]$ and $\mathbb{P}[\mathbb{E}Z^2 - Z^2 > z]$. One difficulty is in obtaining an upper bound on $\mathbb{E}\exp[t(Z^2 - \mathbb{E}Z^2)]$. Any pointers to relevant literature? This seems like a result that should be known.
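To see why this is delicate, note that even for a standard Gaussian $Z$, the moment generating function of $Z^2$,

$$\mathbb{E}\exp(tZ^2) = (1 - 2t)^{-1/2},$$

is finite only for $t < 1/2$. So any bound on $\mathbb{E}\exp[t(Z^2 - \mathbb{E}Z^2)]$ can hold only on a bounded interval of $t$, and the resulting Chernoff bound must decay exponentially, rather than at a Gaussian rate, for large $z$.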

If $Z$ is a standard Gaussian random variable, then $Z^2$ is distributed according to the central chi-squared distribution, and the above probabilities can be bounded as in Laurent and Massart (2000), "Adaptive estimation of a quadratic functional by model selection."
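For reference, the key bounds from that paper (Lemma 1 there, if I recall the numbering correctly) state that if $U$ is chi-squared with $D$ degrees of freedom, then for all $x > 0$,

$$\mathbb{P}[U - D \geq 2\sqrt{Dx} + 2x] \leq e^{-x} \quad \text{and} \quad \mathbb{P}[D - U \geq 2\sqrt{Dx}] \leq e^{-x}.$$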

Best Answer

If $Z$ is sub-Gaussian, then $Z^2$ is known (somewhat confusingly) as a sub-exponential random variable. (I say "somewhat confusingly" because in other subfields of probability theory that term means something else entirely.)
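In that terminology, the bound you are after is a Bernstein-type inequality. One standard formulation (the constants vary between references, so take these as indicative rather than canonical): if $\mathbb{E}\exp[\lambda(X - \mathbb{E}X)] \leq \exp(\lambda^2\nu^2/2)$ for all $|\lambda| \leq 1/b$, then for all $z > 0$,

$$\mathbb{P}[|X - \mathbb{E}X| > z] \leq 2\exp\left(-\tfrac{1}{2}\min\left(\frac{z^2}{\nu^2}, \frac{z}{b}\right)\right).$$

Note the two regimes: Gaussian-type decay $\exp(-z^2/2\nu^2)$ for small $z$ and exponential decay $\exp(-z/2b)$ for large $z$, which is exactly the behavior one should expect for $Z^2 - \mathbb{E}Z^2$ when $Z$ is sub-Gaussian.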

Have you looked through Roman Vershynin's notes on non-asymptotic random matrix theory? Here's the paper:

R. Vershynin, Introduction to the non-asymptotic analysis of random matrices, Aug. 2010.

There are also some older lecture notes on which I believe this paper was based. One version of them can be found here.

These may not contain exactly what you're looking for, but I suspect they'll be helpful.

You might also look through the publications of Mark Rudelson, who also works on concentration inequalities of the type you are interested in.