Sub-gaussian norm vs. variance proxy

distribution-tails, moment-generating-functions, probability, probability-theory

When studying sub-gaussian random variables, I have come across several definitions. One of the most common uses the concept of a "variance proxy": a mean-zero random variable $X$ is sub-gaussian with variance proxy $\sigma^2$ if
$$ \mathbb{E}\exp(sX) \leq \exp \left(\frac{\sigma^2s^2}{2}\right)$$
for all $s \in \mathbb{R}$. Now, in other texts, e.g. Roman Vershynin's "High-Dimensional Probability," sub-gaussian random variables are introduced alongside a kind of Orlicz norm, namely
$$ \|X\|_{\psi_2} = \inf\left\{k>0 : \mathbb{E}\exp\left(\frac{X^2}{k^2} \right) \leq 2 \right\}$$
which then satisfies a set of equivalent properties characterizing sub-gaussianity, including
$$ \mathbb{E}\exp(\lambda X) \leq \exp(C \lambda^2 \|X\|_{\psi_2}^2)$$
for all $\lambda \in \mathbb{R}$. Is there some explicit relation between the sub-gaussian norm and the variance proxy? Or are they perhaps even the same thing? If not, in which scenarios may they be equal?
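
For concreteness, here is a minimal numerical sketch (the standard Gaussian case is my own choice, not part of either definition): for $X \sim N(0,1)$ the optimal variance proxy is $\sigma^2 = 1$, since $\mathbb{E}\exp(sX) = \exp(s^2/2)$ exactly, while the $\psi_2$ norm solves $\mathbb{E}\exp(X^2/k^2) = 2$, which has the closed form $k = \sqrt{8/3} \approx 1.63$. So at least in this case the two quantities differ.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

# Sketch for X ~ N(0, 1): the optimal variance proxy is sigma^2 = 1,
# while the psi_2 norm is the k solving E exp(X^2 / k^2) = 2.

def psi2_moment(k):
    """E[exp(X^2 / k^2)] for X ~ N(0, 1), by numerical integration (needs k^2 > 2)."""
    integrand = lambda x: np.exp(x**2 / k**2 - x**2 / 2) / np.sqrt(2 * np.pi)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

# Bracket the root inside (sqrt(2), infinity), where the integral converges:
# psi2_moment(1.5) = 3 > 2 and psi2_moment(5) ~ 1.04 < 2.
k_star = brentq(lambda k: psi2_moment(k) - 2.0, 1.5, 5.0)

print("variance proxy sigma    :", 1.0)
print("psi_2 norm (numerical)  :", k_star)            # ~1.633
print("psi_2 norm (closed form):", np.sqrt(8.0 / 3))  # sqrt(8/3)
```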

Best Answer

One can start with the integral representation
$$e^{\frac{X^2}{2r^2}}=\int_{-\infty}^{\infty}e^{sX-\frac{r^2s^2}{2}}\,\frac{r}{\sqrt{2\pi}}\,ds,$$
which is just a Gaussian integral after completing the square in $s$. Taking expectations, swapping $E$ with the integral (the integrand is nonnegative, so Tonelli applies), and using the assumption $E(e^{sX})\leq e^{\frac{\sigma^2 s^2}{2}}$ gives, for every $r>\sigma$,
$$E\left(e^{\frac{X^2}{2r^2}}\right)\leq \int_{-\infty}^{\infty}e^{-\frac{(r^2-\sigma^2)s^2}{2}}\,\frac{r}{\sqrt{2\pi}}\,ds=\frac{r}{\sqrt{r^2-\sigma^2}}.$$
In particular, choosing $r=\frac{2\sigma}{\sqrt{3}}$ makes the right-hand side equal to $2$, i.e. $E\exp\!\left(\frac{X^2}{(8/3)\sigma^2}\right)\leq 2$, and hence $\|X\|_{\psi_2}\leq\sqrt{8/3}\,\sigma$. Conversely, the MGF bound quoted in the question gives a variance proxy that is a constant multiple of $\|X\|_{\psi_2}$, so the two quantities agree up to absolute constants but are not equal in general.
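
A quick numerical sanity check of the two displays above (a sketch only, taking $X$ standard Gaussian so that $\sigma=1$, with arbitrary illustrative values of $r>\sigma$ and $x$; for a Gaussian the MGF bound holds with equality, so the final bound is attained exactly):

```python
import numpy as np
from scipy.integrate import quad

# Sketch checking the two displays above for X ~ N(0, 1), i.e. sigma = 1;
# r and x are arbitrary illustrative choices (with r > sigma required).
sigma, r, x = 1.0, 1.7, 0.8

# Integral representation: exp(x^2/(2 r^2)) = int exp(s x - r^2 s^2/2) * r/sqrt(2 pi) ds.
lhs = np.exp(x**2 / (2 * r**2))
rhs, _ = quad(lambda s: np.exp(s * x - r**2 * s**2 / 2) * r / np.sqrt(2 * np.pi),
              -np.inf, np.inf)
print("representation:", lhs, "=", rhs)

# Consequence: E exp(X^2/(2 r^2)) <= r / sqrt(r^2 - sigma^2) for r > sigma.
# For a Gaussian both sides coincide, since its MGF bound is an equality.
expectation, _ = quad(lambda t: np.exp(t**2 / (2 * r**2) - t**2 / 2) / np.sqrt(2 * np.pi),
                      -np.inf, np.inf)
print("E exp(X^2/(2 r^2)) =", expectation,
      " bound =", r / np.sqrt(r**2 - sigma**2))
```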