[Math] Proof of the Central Limit Theorem using moment generating functions

central-limit-theorem, limits, moment-generating-functions, statistics

Below is a method of proving the Central Limit Theorem using moment generating functions.

Let $$X_{1},X_{2},\ldots,X_{n}$$ be a sequence of i.i.d. random variables with finite mean and variance, $$E(X_{i}) = \mu < \infty, \quad \operatorname{Var}(X_{i})=\sigma ^{2}< \infty.$$

Now let

$$Z_{n}=\frac{\overline{X}-\mu }{\frac{\sigma }{\sqrt{n}}} = \frac{X_{1}+X_{2}+\cdots+X_{n}-n\mu }{\sigma \sqrt{n}}.$$

We want to show that

$$\lim_{n \to \infty} M_{Z_{n}}(t)=e^{\frac{t^{2}}{2}}$$

where $M_{Z_{n}}(t)$ denotes the moment generating function of $Z_{n}$, assumed to exist for $t$ in some open interval containing $0$. In order to prove this, we can define a new random variable, $Y_{i}$, which is the standardized version of $X_{i}$. Thus,

$$Y_{i}=\frac{X_{i}-\mu }{\sigma }.$$
Then, we can say that $Y_{i}$ is i.i.d. with expected value and variance

$$E(Y_{i}) =0, \quad \operatorname{Var}(Y_{i})=1.$$

Using this information, and noting that $\overline{Y}=\frac{\overline{X}-\mu}{\sigma}$, we have $$Z_{n}=\sqrt{n}\,\overline{Y} = \frac{Y_{1}+Y_{2}+\cdots+Y_{n} }{\sqrt{n}}.$$

Since the $Y_{i}$ are independent, the expectation of the product factors, and since they are identically distributed, the factors are all equal. Finding the moment generating function therefore gives

$$M_{Z_{n}}(t)=E\!\left[e^{t\frac{Y_{1}+Y_{2}+\cdots+Y_{n} }{\sqrt{n}}}\right] =E\!\left[e^{t\frac{Y_{1}}{\sqrt{n}}}\right]\cdot E\!\left[e^{t\frac{Y_{2}}{\sqrt{n}}}\right]\cdots E\!\left[e^{t\frac{Y_{n}}{\sqrt{n}}}\right]= M_{Y_{1}}\!\left(\frac{t}{\sqrt{n}}\right)^{n}.$$
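(As a concrete sanity check, not part of the proof: the factorization can be verified exactly for a small example. Here $Y_{1}$ is taken to be Rademacher, i.e. $\pm 1$ with probability $\tfrac12$ each, which has mean $0$, variance $1$, and MGF $M_{Y_{1}}(s)=\cosh(s)$.)

```python
import itertools
import math

# Sanity check of M_{Z_n}(t) = M_{Y_1}(t/sqrt(n))^n for a small concrete case:
# Y_i Rademacher (+1 or -1, each with probability 1/2), so M_{Y_1}(s) = cosh(s).
n, t = 6, 0.8

# Left side: E[exp(t * (Y_1 + ... + Y_n)/sqrt(n))] by exact enumeration
# of all 2^n equally likely sign patterns.
lhs = sum(math.exp(t * sum(signs) / math.sqrt(n))
          for signs in itertools.product([-1, 1], repeat=n)) / 2 ** n

# Right side: the n-th power of the single-variable MGF at t/sqrt(n).
rhs = math.cosh(t / math.sqrt(n)) ** n

print(lhs, rhs)  # the two sides agree up to floating-point rounding
```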

Lastly,

$$\lim_{n \to \infty} M_{Z_{n}}(t)=\lim_{n \to \infty} M_{Y_{1}}(\frac{t}{\sqrt{n}})^{n} = e^{\frac{t^{2}}{2}}.$$

This concludes the proof. However, how does one show analytically that this final limit does indeed equal

$$e^{\frac{t^{2}}{2}}?$$

Best Answer

By Taylor's theorem, $$M_{Y_1}(s) = E[\exp(sY_1)] = 1 + s E[Y_1] + \frac{s^2}{2} E[Y_1^2] + s^2 h(s) = 1 + \frac{s^2}{2} + s^2 h(s), \qquad \text{where $h(s) \to 0$ as $s \to 0$},$$ where the last step uses $E[Y_1]=0$ and $\text{Var}(Y_1) = 1$.

Thus $$M_{Y_1}(t/\sqrt{n})^n = \left(1 + \frac{t^2/2}{n} + \frac{t^2}{n} h(t^2/n)\right)^n \to e^{t^2/2}.$$ [The expression in parentheses is asymptotically equivalent to $1+\frac{t^2/2}{n}$, so the last step follows by recalling $(1+\frac{x}{n})^n \to e^x$.]
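One can also watch this convergence numerically. As an illustration (again choosing $Y_{1}$ Rademacher, so $M_{Y_{1}}(s)=\cosh(s)$), the quantity $M_{Y_{1}}(t/\sqrt{n})^{n}$ approaches $e^{t^{2}/2}$ as $n$ grows:

```python
import math

# Numerical illustration of M_{Y_1}(t/sqrt(n))^n -> e^{t^2/2}, using a
# Rademacher Y_1 (mean 0, variance 1) whose MGF is M(s) = cosh(s).
t = 1.5
target = math.exp(t ** 2 / 2)

for n in (10, 1000, 100000):
    approx = math.cosh(t / math.sqrt(n)) ** n
    print(n, approx, target)  # approx moves toward target as n grows
```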


Response to comment:

For $x$ near $0$, we have $\log(1+x) = x(1+g(x))$ where $g(x) \to 0$ as $x \to 0$; we apply this with $x = M_{Y_1}(t/\sqrt{n})-1$, which tends to $0$ as $n \to \infty$.

\begin{align} &\lim_{n \to \infty} n \log M_{Y_1}(t/\sqrt{n}) \\ &= \lim_{n \to \infty} n(M_{Y_1}(t/\sqrt{n})-1)[1+g(M_{Y_1}(t/\sqrt{n})-1)] \\ &= \lim_{n \to \infty} \left\{n\left(\frac{t^2/2}{n} + \frac{t^2}{n}h(t^2/n)\right) \left[1+g\left(\frac{t^2/2}{n} + \frac{t^2}{n}h(t^2/n)\right)\right]\right\} \\ &= \lim_{n \to \infty} n\left(\frac{t^2/2}{n} + \frac{t^2}{n}h(t^2/n)\right) \cdot \lim_{n \to \infty} \left[1+g\left(\frac{t^2/2}{n} + \frac{t^2}{n}h(t^2/n)\right)\right] \\ &= \lim_{n \to \infty} n\left(\frac{t^2/2}{n} + \frac{t^2}{n}h(t^2/n)\right) \\ &= \frac{t^2}{2} + t^2 \lim_{n \to \infty}h(t^2/n) \\ &= \frac{t^2}{2}. \end{align}
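(A quick numerical check of this logarithmic route, once more taking $Y_{1}$ Rademacher so that $M_{Y_{1}}(s)=\cosh(s)$: the quantity $n\log M_{Y_1}(t/\sqrt{n})$ should approach $t^{2}/2$.)

```python
import math

# Check that n * log M_{Y_1}(t/sqrt(n)) -> t^2/2 for a Rademacher Y_1,
# whose MGF is cosh(s).
t = 2.0
for n in (10, 1000, 100000):
    val = n * math.log(math.cosh(t / math.sqrt(n)))
    print(n, val)  # approaches t^2/2 = 2.0
```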