[Math] Chi-square with $n$ degrees of freedom, Normal distribution

chi-squared, moment-generating-functions, probability, statistical-inference, statistics

How can I see that the $\chi^2(n)$ random variable has the moment generating function $$(1-2t)^{-n/2}$$?

I would also like to know precisely why, if $Z \sim N(0,1)$, then $Z^2 \sim \chi^2(1)$.

And why, if $X_i \sim \chi^2(1)$ and the $X_i$ are independent, then $\sum_{i=1}^n X_i \sim \chi^2(n)$.

Are these last two facts just the definition of $\chi^2$?

Best Answer

Starting with your last questions first: yes, the $\chi^2$ distribution with $k$ degrees of freedom is normally defined as the distribution of the sum of the squares of $k$ independent $N(0,1)$ random variables.
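
As a quick sanity check of this definition, here is a minimal simulation sketch (assuming `numpy` and `scipy` are available, which the question does not mention) comparing the sum of $k$ squared standard normal draws against the $\chi^2(k)$ distribution with a Kolmogorov-Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, n_samples = 5, 100_000

# Sum of squares of k independent standard normals, repeated n_samples times.
z = rng.standard_normal((n_samples, k))
x = (z ** 2).sum(axis=1)

# Compare the empirical sample against the chi2(k) CDF; a large p-value is
# consistent with the two distributions agreeing.
ks_stat, p_value = stats.kstest(x, stats.chi2(df=k).cdf)
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")
```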

For the moment generating function, note that since the MGF of a sum of independent variables is the product of the individual MGFs, if $M_k$ denotes the moment generating function of $\chi^2(k)$ then

$$M_k(s) = M_1(s)^k.$$

Then, to derive $M_1(s)$, let $X \sim N(0,1)$ and write $f$ for its pdf, so that $M_1$ is the MGF of $X^2$:

\begin{align*}
M_1(s) & = \mathbf E \left[ e^{sX^2} \right] \\
& = \int_{-\infty}^\infty \exp(sx^2) f(x)\, dx \\
& = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty \exp\left(sx^2\right) \exp\left(-x^2/2\right) dx\\
& = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty \exp\left((s-1/2)x^2\right) dx\\
& = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty \exp\left(-\frac{x^2}{2 (1-2s)^{-1}}\right) dx\\
& = \frac{\sqrt{2 \pi (1-2s)^{-1}}}{\sqrt{2 \pi} }\\
& = \left(1 - 2s\right)^{-\frac12},
\end{align*}
valid for $s < \tfrac12$ (otherwise the integral diverges), where in the second-to-last line we used the fact that the integrand is the un-normalized pdf of a $N\left(0, (1-2s)^{-1} \right)$ distribution, so the integral equals that distribution's normalizing constant.
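
The integral above can also be checked numerically. The following sketch (assuming `scipy`; the function name `m1_numeric` is just an illustrative choice) computes $\mathbf E[e^{sX^2}]$ by quadrature and compares it with $(1-2s)^{-1/2}$ for a few values of $s < \tfrac12$:

```python
import numpy as np
from scipy.integrate import quad

def m1_numeric(s):
    # E[exp(s X^2)] for X ~ N(0,1), computed by numerical integration.
    integrand = lambda x: np.exp(s * x**2) * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

for s in (-1.0, 0.1, 0.4):
    print(s, m1_numeric(s), (1 - 2 * s) ** -0.5)  # the two columns should agree
```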

From the above we then get $$ M_k(s) = M_1(s)^k = (1 - 2s)^{-k/2}.$$
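
For completeness, here is a Monte Carlo sketch of the final formula (again assuming `numpy`): we estimate $\mathbf E[e^{sX}]$ for $X \sim \chi^2(k)$ by averaging over simulated draws and compare with $(1-2s)^{-k/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)
k, n_samples = 3, 1_000_000
x = rng.chisquare(df=k, size=n_samples)

for s in (-0.5, 0.1, 0.2):
    mgf_mc = np.exp(s * x).mean()            # Monte Carlo estimate of E[exp(sX)]
    mgf_closed = (1 - 2 * s) ** (-k / 2)     # closed-form MGF derived above
    print(f"s = {s:+.2f}: Monte Carlo {mgf_mc:.4f}  vs  closed form {mgf_closed:.4f}")
```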


Why is the MGF of a sum of independent variables the product of their MGFs?

This follows from the fact that for independent random variables $X,Y$ and functions $U,V$, $$ \mathbf E [ U(X) V(Y)] = \mathbf E[U(X)] \, \mathbf E[V(Y)].$$ In the special case $U_s(x) = V_s(x) = \exp(sx)$ we have $$U_s(X)V_s(Y) = \exp(sX)\exp(sY) = \exp(s(X+Y)),$$ so the result about the MGFs follows.
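
A quick simulation sketch of this product rule (assuming `numpy`): for two independent $\chi^2(1)$ samples, the sample average of $\exp(s(X+Y))$ should match the product of the separate averages of $\exp(sX)$ and $\exp(sY)$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, s = 1_000_000, 0.2
x = rng.chisquare(df=1, size=n)
y = rng.chisquare(df=1, size=n)

lhs = np.exp(s * (x + y)).mean()                    # estimate of E[exp(s(X+Y))]
rhs = np.exp(s * x).mean() * np.exp(s * y).mean()   # estimate of E[exp(sX)] E[exp(sY)]
print(lhs, rhs, (1 - 2 * s) ** -1.0)                # both near the chi2(2) MGF value
```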

To justify the general claim, we note that the joint density of two independent variables factors as $f_{X,Y}(x,y) = f_X(x)f_Y(y)$, and then

\begin{align*}
\mathbf E [ U(X) V(Y)] &= \int_{-\infty}^\infty \int_{-\infty}^\infty U(x) V(y) f_{X,Y}(x,y)\, dx\, dy \\
& = \int_{-\infty}^\infty \int_{-\infty}^\infty U(x) f_X(x) V(y)f_Y(y)\, dx\, dy \\
& = \int_{-\infty}^\infty U(x)f_X(x) \left(\int_{-\infty}^\infty V(y)f_Y(y)\,dy \right) dx \\
& = \int_{-\infty}^\infty U(x)f_X(x)\, \mathbf E[V(Y)]\, dx\\
& = \mathbf{E}[V(Y)] \int_{-\infty}^\infty U(x)f_X(x)\, dx\\
& = \mathbf E[U(X)]\, \mathbf E[V(Y)].
\end{align*}
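
This factorisation can also be illustrated numerically. The sketch below (assuming `scipy`; the choices $U(x)=x^2$, $V(y)=\cos y$ and independent standard normals are arbitrary illustrations) compares the double integral of $U(x)V(y)f_X(x)f_Y(y)$ with the product of the two single integrals:

```python
import numpy as np
from scipy.integrate import quad, dblquad

phi = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)  # N(0,1) density
U = lambda x: x**2
V = lambda y: np.cos(y)

# E[U(X) V(Y)] as a double integral over (effectively) the whole plane.
joint, _ = dblquad(lambda y, x: U(x) * V(y) * phi(x) * phi(y),
                   -10, 10, lambda x: -10, lambda x: 10)

# E[U(X)] and E[V(Y)] as single integrals.
eu, _ = quad(lambda x: U(x) * phi(x), -10, 10)
ev, _ = quad(lambda y: V(y) * phi(y), -10, 10)

print(joint, eu * ev)  # the two numbers should agree
```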