Bounded Random Variables are Subgaussian

probability, probability-theory

I am looking at a proof that states bounded random variables are subgaussian. I have been able to follow most of it but am missing the following line.

We let $X$ be a bounded random variable and $X^*$ an independent copy of $X$ (same distribution, independent of $X$).

Then

$$E_X[\exp(tX)]=E_X[\exp(t(X-E[X^*]))]$$

This I can follow: since $X^*$ has the same distribution as $X$ and $X$ is mean zero, $E[X^*]=0$, so I am only subtracting a constant and no joint expectation is required. The next step is where I am confused:

$$E_X[\exp(t(X-E[X^*]))]\le E_{X,X^*}[\exp(t(X-X^{*}))]$$ by Jensen's inequality and the convexity of $\exp$.

Best Answer

  • Jensen's inequality applied to the convex function $x \mapsto e^{x}$ gives $e^{-t E[X^*]} = e^{E[-tX^*]} \le E_{X^*}[e^{-t X^*}]$.
  • Therefore $E_X[e^{t(X-E[X^*])}] = E_X[e^{tX}]\, e^{-tE[X^*]} \le E_X[e^{tX}]\, E_{X^*}[e^{-tX^*}]$.
  • Since $X$ and $X^*$ are independent, $E_X[e^{tX}]\, E_{X^*}[e^{-tX^*}] = E_{X,X^*}[e^{t(X-X^*)}]$, which is the claimed bound.
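As a quick sanity check of the inequality $E_X[e^{t(X-E[X^*])}] \le E_{X,X^*}[e^{t(X-X^*)}]$, here is a small sketch for one assumed example distribution (not from the original post): $X$ uniform on $\{-1,+1\}$, so $E[X]=0$ and both sides can be computed in closed form.

```python
import math

# For X uniform on {-1, +1} (mean zero) with X* an independent copy:
#   E[exp(t(X - E[X*]))] = E[exp(tX)]               = cosh(t)
#   E[exp(t(X - X*))]    = E[exp(tX)] E[exp(-tX*)]  = cosh(t)**2
# Since cosh(t) >= 1, we always have cosh(t) <= cosh(t)**2,
# matching the Jensen/symmetrization bound.

def mgf_centered(t):
    # Left-hand side: E[exp(t(X - E[X*]))] = cosh(t), as E[X*] = 0
    return math.cosh(t)

def mgf_symmetrized(t):
    # Right-hand side: E[exp(t(X - X*))] = cosh(t)^2 by independence
    return math.cosh(t) ** 2

for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    lhs, rhs = mgf_centered(t), mgf_symmetrized(t)
    assert lhs <= rhs + 1e-12, (t, lhs, rhs)
    print(f"t={t:+.1f}: {lhs:.4f} <= {rhs:.4f}")
```

The closed forms follow from the answer's bullets: the $e^{-tE[X^*]}$ factor collapses to $1$ because $E[X^*]=0$, and the symmetrized side factors by independence.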