[Math] Gamma distribution to normal approximation

approximation, gamma distribution, normal distribution, statistics

I know that if $X_i \sim N(0,\sigma^2)$, $i = 1, 2, \ldots, N$, are i.i.d. random variables, then $Y = X_1^2 + X_2^2 + \cdots + X_N^2$ follows $\Gamma(N/2, 2\sigma^2)$ (shape and scale parameters).
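
As a quick numerical check of this fact, here is a minimal simulation sketch, assuming NumPy and SciPy are available; the values of $N$, $\sigma$, and the trial count are illustrative, not from the question:

```python
import numpy as np
from scipy import stats

# Illustrative parameters (not from the question)
N, sigma = 8, 1.5
trials = 100_000

rng = np.random.default_rng(0)
# Y = X_1^2 + ... + X_N^2 with X_i ~ N(0, sigma^2)
Y = (rng.normal(0.0, sigma, size=(trials, N)) ** 2).sum(axis=1)

# Compare against Gamma(N/2, 2*sigma^2) (shape-scale parameterization)
gamma_dist = stats.gamma(a=N / 2, scale=2 * sigma**2)
ks = stats.kstest(Y, gamma_dist.cdf)
print(f"sample mean {Y.mean():.3f} vs gamma mean {gamma_dist.mean():.3f}")
print(f"sample var  {Y.var():.3f} vs gamma var  {gamma_dist.var():.3f}")
print(f"KS statistic: {ks.statistic:.4f}")
```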

When $N$ is sufficiently large, this gamma distribution can be approximated by a normal distribution $N(\mu_n, \sigma_n^2)$.
I can find $\mu_n$ and $\sigma_n^2$ experimentally, but I cannot derive their closed forms mathematically, nor can I determine how large $N$ needs to be.

My question is: how can I find $\mu_n$ and $\sigma_n^2$ mathematically, and how can I determine a lower bound on $N$ for approximating the gamma distribution by a normal one? (Many references say that 30 degrees of freedom is sufficient.)

Best Answer

By the central limit theorem, $\sqrt{N}\left(\frac{1}{N}(X_1^2 + \cdots + X_N^2) - \sigma^2\right)$ converges in distribution to $N(0, 2\sigma^4)$, since $X_1^2$ has mean $\sigma^2$ and variance $2\sigma^4$. Rearranging, for large $N$ you get $Y \approx N(\mu_N, \sigma_N^2)$ with $\mu_N = N\sigma^2$ and $\sigma_N^2 = 2N\sigma^4$, which are exactly the mean and variance of $\Gamma(N/2, 2\sigma^2)$. Regarding how large $N$ should be for a "good" approximation, you would need something like the Berry-Esseen theorem to give a quantitative statement.
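
To make the last point concrete, here is a minimal sketch, assuming NumPy and SciPy; the helper name `max_cdf_gap` and the parameter values are illustrative. It measures the maximum CDF gap (Kolmogorov distance) between $\Gamma(N/2, 2\sigma^2)$ and the moment-matched normal $N(N\sigma^2, 2N\sigma^4)$ for several $N$:

```python
import numpy as np
from scipy import stats

def max_cdf_gap(N, sigma=1.0, grid_points=2000):
    """Kolmogorov distance between Gamma(N/2, 2*sigma^2) and its
    moment-matched normal N(N*sigma^2, 2*N*sigma^4)."""
    g = stats.gamma(a=N / 2, scale=2 * sigma**2)
    n = stats.norm(loc=N * sigma**2, scale=np.sqrt(2 * N) * sigma**2)
    # Evaluate both CDFs on a grid covering essentially all the gamma mass.
    x = np.linspace(g.ppf(1e-6), g.ppf(1 - 1e-6), grid_points)
    return np.max(np.abs(g.cdf(x) - n.cdf(x)))

for N in (5, 10, 30, 100, 1000):
    print(f"N = {N:5d}: max |F_gamma - F_normal| = {max_cdf_gap(N):.4f}")
```

By Berry-Esseen, this gap shrinks on the order of $1/\sqrt{N}$; the printed values let you judge where your own "sufficiently large" threshold lies. Note that $Y/\sigma^2 \sim \chi^2_N$, so the rule of thumb of 30 degrees of freedom corresponds to $N = 30$ here, and whether the gap at that point is acceptable depends on how far into the tails you need accuracy.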
