Probability – Finding an Upper Bound on the CDF of Sum of Squared Gaussian Random Variables

normal distribution, probability, probability distributions, upper-lower-bounds

Let $x_1, \dots, x_N$ be i.i.d. standard normal random variables. Define $y_n = \sum_{i=1}^n x_i$ for $n = 1, \dots, N$, so that $y_n$ has a Gaussian distribution with mean zero and variance $n$. Note that $y_1, \dots, y_N$ are not independent. Consider the random variable $Z = \sum_{n=1}^N y_n^2$. Is it possible to find an upper bound on the CDF of $Z$ near zero that tends to zero as $N \rightarrow \infty$? Specifically, for $\epsilon_N > 0$ with $\lim_{N \rightarrow \infty} \epsilon_N = 0$, find $U_N$ such that

$$p(Z \leq \epsilon_N) \leq U_N,$$

where $\lim_{N \rightarrow \infty} U_N = 0$.

My intuition is that as $N$ increases, the variance of $Z$ increases, so the probability that $Z$ is close to zero should decrease. Also, if we were looking at $V = \sum_{n=1}^N x_n^2$ instead, $V$ would have a chi-squared distribution with $N$ degrees of freedom, and we could obtain a bound from its known CDF.
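A quick Monte Carlo sketch supports this intuition (the function name and the threshold $\epsilon = 0.1$ are my own illustrative choices): simulating $Z$ for increasing $N$ shows the empirical probability of $Z$ falling below a fixed small threshold shrinking toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_Z(N, trials=100_000):
    # x: trials-by-N i.i.d. standard normals; y_n = x_1 + ... + x_n via cumsum
    x = rng.standard_normal((trials, N))
    y = np.cumsum(x, axis=1)
    return np.sum(y ** 2, axis=1)  # samples of Z = sum_n y_n^2

eps = 0.1  # an arbitrary small threshold, for illustration only
for N in (5, 20, 80):
    Z = simulate_Z(N)
    print(f"N={N:3d}  P(Z <= {eps}) ~ {np.mean(Z <= eps):.5f}")
```

Even for moderate $N$ the empirical probability is already tiny, since $Z \le \epsilon$ forces every partial sum $y_n$ to be small simultaneously.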

Best Answer

Very rough estimate: $Z = \sum_{n=1}^N y_n^2 \geq y_N^2$, therefore $$ \mathbb P(Z \leq \epsilon_N) \leq \mathbb P(y_N^2 \leq \epsilon_N) = \mathbb P(Nx_1^2 \leq \epsilon_N). $$ The last equality follows from $y_N \sim \mathcal N(0,N)$: indeed, $y_N/\sqrt{N} \sim \mathcal N(0,1)$, so $y_N^2$ is distributed as $Nx_1^2$. Hence $$ \mathbb P(Z \leq \epsilon_N) \leq \mathbb P\left(x_1^2 \leq \frac{\epsilon_N}{N}\right) = 1-2\Phi\left(-\sqrt{\frac{\epsilon_N}{N}}\right) = U_N, $$ which tends to zero as soon as $\frac{\epsilon_N}{N}$ tends to zero.
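As a sanity check, the bound can be verified numerically. Here is a minimal sketch (assuming, purely for illustration, $\epsilon_N = 1/\log(N+1)$, which satisfies $\epsilon_N \to 0$ and $\epsilon_N/N \to 0$) comparing a Monte Carlo estimate of $\mathbb P(Z \leq \epsilon_N)$ against $U_N$; the function names are mine.

```python
import numpy as np
from math import erf, sqrt

def Phi(x):
    # standard normal CDF, expressed via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def U(N, eps):
    # the bound from the answer: U_N = 1 - 2*Phi(-sqrt(eps_N / N))
    return 1.0 - 2.0 * Phi(-sqrt(eps / N))

rng = np.random.default_rng(1)

def p_Z_leq(N, eps, trials=200_000):
    # Monte Carlo estimate of P(Z <= eps), with y_n built as cumulative sums
    x = rng.standard_normal((trials, N))
    Z = np.sum(np.cumsum(x, axis=1) ** 2, axis=1)
    return float(np.mean(Z <= eps))

for N in (4, 16, 64):
    eps = 1.0 / np.log(N + 1)  # illustrative choice with eps_N -> 0
    print(f"N={N:3d}  P(Z<=eps) ~ {p_Z_leq(N, eps):.5f}  U_N = {U(N, eps):.5f}")
```

The Monte Carlo estimates sit well below $U_N$, as expected: the bound discards every term of $Z$ except $y_N^2$, so it is loose, but it suffices to force the limit to zero.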
