Proof for Gaussian random variables

gaussian-measure, normal-distribution, probability, probability-theory, random-variables

I am trying to prove the Khintchine-Kahane (KK) inequality for Gaussian random variables. I tried to do it on my own but could not work out the full details, and unfortunately I have not been able to find a proof of the theorem on the web.

The inequality is stated below.

Theorem. Consider a family $\left(X_i \right)_{i=1}^n$ of fixed vectors and let $g_1, \ldots, g_n$ be i.i.d. standard Gaussian random variables. Then ($\| \cdot \|$ denotes the vector norm):

$\operatorname{Var} \left(\| \sum_{i=1}^n g_i X_i \| \right) \leq \left(\mathbb{E}\| \sum_{i=1}^n g_i X_i \|\right)^2$

The idea of the proof, I believe, should be to apply Gaussian concentration and then show that the resulting Lipschitz constant is comparable (up to constants) to the right-hand side.

Do you know how one could prove this result?

Observe that this result actually holds for more general powers: instead of the variance, i.e. $p=2$ (the random variables are centered), one could take higher exponents and still obtain the same kind of bound. With the variance, however, the statement has the clearest meaning and interpretation, and it is already an extremely useful result!
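As a quick sanity check (not a proof), here is a small Monte Carlo experiment; the vectors $X_i$, the dimensions, and the sample size below are arbitrary choices, and only numpy is assumed:

```python
import numpy as np

# Monte Carlo check of Var(||sum_i g_i X_i||) <= (E ||sum_i g_i X_i||)^2
# for an arbitrary family of fixed vectors X_i (chosen here only for illustration).
rng = np.random.default_rng(0)
n, d, trials = 5, 3, 200_000

X = rng.normal(size=(n, d))            # fixed vectors X_1, ..., X_n in R^d
g = rng.standard_normal((trials, n))   # i.i.d. standard Gaussian coefficients g_i
norms = np.linalg.norm(g @ X, axis=1)  # ||g_1 X_1 + ... + g_n X_n|| for each trial

print(norms.var(), "<=", norms.mean() ** 2)
```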

Best Answer

Theorem. Let $V$ be a centered Gaussian random vector in $\mathbb{R}^n$ with covariance matrix $\Sigma\neq 0.$ Let \begin{equation}J_n=\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\left(1-\left(1+\frac{2s}{n}\right)^{-n/2}\right)\frac{ds}{s^{3/2}}.\end{equation} Then for all $\Sigma$ \begin{equation}J_1=\sqrt{\frac{2}{\pi}}\;\leq\;\frac{E(\|V\|)}{\sqrt{E(\|V\|^2)}}\;\leq\; J_n, \ \ (A)\end{equation} where the upper bound is attained if and only if $\Sigma$ is proportional to the identity matrix $I_n,$ and the lower bound is attained if and only if $\Sigma$ has rank one. Furthermore $J_n$ increases to $1.$ The lower bound in (A) implies the Khintchine-Kahane inequality $$\mathrm{Var}(\|V\|)\leq \left(\frac{\pi}{2}-1\right)[E(\|V\|)]^2\leq [E(\|V\|)]^2.$$
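Before the proof, here is a small numerical illustration of $J_n$ (a sketch only, assuming scipy is available; the dimensions and sample sizes are arbitrary): the quadrature value of $J_1$ should match $\sqrt{2/\pi},$ and for $\Sigma=I_n$ the ratio should match $J_n.$

```python
import numpy as np
from scipy.integrate import quad

def J(n):
    # J_n = (1/(2 sqrt(pi))) * int_0^inf (1 - (1 + 2s/n)^(-n/2)) s^(-3/2) ds
    f = lambda s: (1.0 - (1.0 + 2.0 * s / n) ** (-n / 2.0)) / s ** 1.5
    return quad(f, 0.0, np.inf, limit=200)[0] / (2.0 * np.sqrt(np.pi))

print(J(1), np.sqrt(2.0 / np.pi))        # J_1 should equal sqrt(2/pi)

# For Sigma = I_n the ratio E||V|| / sqrt(E||V||^2) should equal J_n.
rng = np.random.default_rng(1)
for n in (2, 5, 20):
    Z = rng.standard_normal((500_000, n))
    r = np.linalg.norm(Z, axis=1)
    print(n, J(n), r.mean() / np.sqrt((r ** 2).mean()))
```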

Proof. Write $\Sigma=U^TDU$ where $U$ is orthogonal and $D=\mathrm{diag}(a_1, \ldots ,a_n).$ Therefore if $Z\sim N(0,I_n)$ we have $V\sim \Sigma^{1/2}Z\sim U^TD^{1/2}Z$ since $UZ\sim Z,$ and since $U^T$ preserves the norm, \begin{equation}\|V\|^2\sim a_1Z_1^2+\cdots+a_nZ_n^2\ \ (B)\end{equation} where $Z_1,\ldots,Z_n$ are independent $N(0,1)$ variables. This implies that $E(\|V\|^2)=a_1+\cdots+a_n=\mathrm{trace}\, \Sigma.$
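A minimal numerical sketch of (B) and of $E(\|V\|^2)=\mathrm{trace}\,\Sigma$ (the covariance matrix below is an arbitrary example, and numpy is assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 4, 400_000

A = rng.normal(size=(n, n))
Sigma = A @ A.T                        # an arbitrary covariance matrix
a = np.linalg.eigvalsh(Sigma)          # its eigenvalues a_1, ..., a_n

# Sample V ~ N(0, Sigma) and compare E||V||^2 with trace(Sigma) = a_1 + ... + a_n.
V = rng.multivariate_normal(np.zeros(n), Sigma, size=trials)
print((np.linalg.norm(V, axis=1) ** 2).mean(), np.trace(Sigma), a.sum())
```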

Now we derive bounds for $E(\|V\|)=E\big(( a_1Z_1^2+\cdots+a_nZ_n^2)^{1/2}\big).$ For this we use the following integral representation of the square root of a positive number $x$:

\begin{equation}\sqrt{x}=\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\frac{1-e^{-sx}}{s^{3/2}}ds \ \ (C)\end{equation} To prove (C), observe that it holds for $x=0$ and that the derivatives of both sides with respect to $x$ coincide, since $\frac{1}{2\sqrt{\pi}}\int_0^{\infty}e^{-sx}\frac{ds}{s^{1/2}}=\frac{1}{2\sqrt{x}}.$ As a consequence of (C) we have \begin{eqnarray*}E(\|V\|)&=&\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\left(1-E\left(e^{-s(a_1Z_1^2+\cdots+a_nZ_n^2)}\right)\right)\frac{ds}{s^{3/2}}\\&=&\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\left(1-\prod_{i=1}^nE\left(e^{-sa_iZ_i^2}\right)\right)\frac{ds}{s^{3/2}}=\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\left(1-\prod_{i=1}^n(1+2s a_i)^{-1/2}\right)\frac{ds}{s^{3/2}}.\end{eqnarray*} Since the quantity to be bounded is $$\frac{E(\|V\|)}{\sqrt{E(\|V\|^2)}}=E\left(\frac{\|V\|}{(a_1+\cdots+a_n)^{1/2}}\right),$$ by the homogeneity of this ratio and by (B) we may assume from now on that $a_1+\cdots+a_n=1.$ With this normalization, the arithmetic-geometric mean inequality $b_1\cdots b_n\leq (\frac{1}{n}(b_1+\cdots+b_n))^n$ for $b_i>0,$ applied to $b_i=1+2sa_i,$ gives $\prod_{i=1}^n(1+2s a_i)\leq \left(1+\frac{2s}{n}\right)^n,$ hence $\prod_{i=1}^n(1+2s a_i)^{-1/2}\geq \left(1+\frac{2s}{n}\right)^{-n/2}$ and therefore $E(\|V\|)\leq J_n$: this is the upper bound in (A). Since $b_1\cdots b_n= (\frac{1}{n}(b_1+\cdots+b_n))^n$ if and only if $b_1=\ldots=b_n,$ equality holds in the upper bound if and only if $a_1=\ldots=a_n=1/n,$ that is, if and only if $\Sigma$ is proportional to $I_n.$ For the lower bound, expand the product and drop the nonnegative cross terms: $\prod_{i=1}^n(1+2s a_i)\geq 1+2s(a_1+\cdots+a_n)=1+2s,$ hence $\prod_{i=1}^n(1+2s a_i)^{-1/2}\leq (1+2s)^{-1/2}$ and therefore $E(\|V\|)\geq J_1$: this is the lower bound in (A). Equality for all $s>0$ forces all cross terms $a_ia_j$ ($i\neq j$) to vanish, that is, only one $a_i$ is nonzero and $\Sigma$ has rank one.
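Here is a hedged numerical sketch of (C) and of the two bounds just proved; the value of $x$ and the weights $a_i$ are arbitrary, and scipy's quadrature is assumed:

```python
import numpy as np
from scipy.integrate import quad

# Integral representation (C): sqrt(x) = (1/(2 sqrt(pi))) int_0^inf (1 - e^{-s x}) s^{-3/2} ds.
x = 2.7
lhs = quad(lambda s: (1.0 - np.exp(-s * x)) / s ** 1.5, 0.0, np.inf, limit=200)[0]
print(lhs / (2.0 * np.sqrt(np.pi)), np.sqrt(x))

def J(n):
    f = lambda s: (1.0 - (1.0 + 2.0 * s / n) ** (-n / 2.0)) / s ** 1.5
    return quad(f, 0.0, np.inf, limit=200)[0] / (2.0 * np.sqrt(np.pi))

# E||V|| via the product formula, for random weights with a_1 + ... + a_n = 1,
# together with the bounds J_1 <= E||V|| <= J_n of (A).
rng = np.random.default_rng(3)
n = 6
a = rng.random(n)
a /= a.sum()
g = lambda s: (1.0 - np.prod((1.0 + 2.0 * s * a) ** -0.5)) / s ** 1.5
EnormV = quad(g, 0.0, np.inf, limit=200)[0] / (2.0 * np.sqrt(np.pi))
print(J(1), EnormV, J(n))                # expect J_1 <= E||V|| <= J_n
```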

To prove that $J_n$ is an increasing sequence, consider the function $$f(x)=\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\left(1-\left(1+\frac{s}{x}\right)^{-x}\right)\frac{ds}{s^{3/2}}$$ which satisfies $f(n/2)=J_n.$ Observe that, by (C) applied with $x=1$ (the interchange of limit and integral being justified by monotone convergence, using the monotonicity proved below), $$\lim_{x\to \infty}f(x)=\frac{1}{2\sqrt{\pi}}\int_0^{\infty}\left(1-e^{-s}\right)\frac{ds}{s^{3/2}}=1.$$

To prove that $f$ is increasing on $(0,\infty)$ it is enough to show that for fixed $s$ the function $x\mapsto \left(1+\frac{s}{x}\right)^{-x}$ is decreasing, or equivalently that the function $x\mapsto g(x)=x\log \left(1+\frac{s}{x}\right)$ is increasing. This is easy to see since $$g'(x)=\log \left(1+\frac{s}{x}\right)-\frac{s}{s+x},$$ whose sign is the same as the sign of $h(\tfrac{s}{x})=\left(1+\frac{s}{x}\right)\log \left(1+\frac{s}{x}\right)-\frac{s}{x}.$ Now $h(t)=(1+t)\log(1+t)-t$ is positive for $t>0$ since $h(0)=0$ and $h'(t)=\log(1+t)>0.$
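A quick numerical look at this monotonicity (the grid of values of $x$ is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import quad

# f(x) = (1/(2 sqrt(pi))) int_0^inf (1 - (1 + s/x)^(-x)) s^(-3/2) ds, with f(n/2) = J_n.
def f(x):
    g = lambda s: (1.0 - (1.0 + s / x) ** (-x)) / s ** 1.5
    return quad(g, 0.0, np.inf, limit=200)[0] / (2.0 * np.sqrt(np.pi))

vals = [f(x) for x in (0.5, 1.0, 2.5, 10.0, 100.0)]
print(vals)                              # should increase towards 1
print(all(u < v for u, v in zip(vals, vals[1:])))
```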

To conclude the proof, the lower bound in (A) gives $E(\|V\|^2)\leq \frac{1}{J_1^2}\,[E(\|V\|)]^2,$ and therefore $$\mathrm{Var}(\|V\|)=E(\|V\|^2)-[E(\|V\|)]^2\leq \left(\frac{1}{J^2_1}-1\right)[E(\|V\|)]^2=\left(\frac{\pi}{2}-1\right)[E(\|V\|)]^2,$$ since $J_1=E(|Z|)=\sqrt{\frac{2}{\pi}}$ when $Z\sim N(0,1).$ Since $\frac{\pi}{2}-1<1$ this is an improvement of the Khintchine-Kahane inequality, and the constant $\frac{\pi}{2}-1$ is in fact optimal, being attained when $\Sigma$ has rank one.

Of course, all of this applies to Angelo's case, which corresponds to $V=\sum_i g_iX_i$ and $\Sigma=\sum_i X_iX_i^T.$
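Finally, a hedged Monte Carlo illustration of the constant $\frac{\pi}{2}-1$ in this setting; the vectors $X_i$ are arbitrary, and in the second experiment they are taken parallel so that $\Sigma$ has rank one and the bound is (nearly) attained:

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, trials = 6, 4, 400_000
c = np.pi / 2.0 - 1.0

def check(X):
    # V = sum_i g_i X_i, so Sigma = sum_i X_i X_i^T; compare Var(||V||) with (pi/2 - 1)(E||V||)^2.
    g = rng.standard_normal((trials, n))
    norms = np.linalg.norm(g @ X, axis=1)
    return norms.var(), c * norms.mean() ** 2

print(check(rng.normal(size=(n, d))))                            # generic X_i: typically below the bound
print(check(np.outer(rng.normal(size=n), rng.normal(size=d))))   # parallel X_i (rank-one Sigma): near equality
```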
