[Math] Central Limit Theorem Proof using Characteristic Functions

Tags: central-limit-theorem, probability, probability-theory, proof-verification, statistics

I have a proof of the Central Limit Theorem using characteristic functions, in which the main steps are:

Setup: Let $X_1, X_2, \dots$ be a sequence of iid random variables with mean $\mu$ and variance $\sigma^2$. Let $Y_i=\frac{X_i-\mu}{\sigma}$ and $S_n=\frac{1}{\sqrt{n}}\sum_{i=1}^{n}Y_i$.
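(As a quick numerical sanity check of this setup, not part of the proof, here is a small simulation; the choice $X_i\sim\text{Exponential}(1)$, so that $\mu=\sigma=1$, is just an illustrative assumption.)

```python
# Sanity check: S_n should look like N(0, 1) for large n.
# Illustrative assumption: X_i ~ Exponential(1), so mu = sigma = 1.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, trials = 1_000, 5_000
X = rng.exponential(scale=1.0, size=(trials, n))
Y = (X - 1.0) / 1.0                    # Y_i = (X_i - mu) / sigma
S_n = Y.sum(axis=1) / np.sqrt(n)       # S_n = (1/sqrt(n)) sum_i Y_i

print(S_n.mean(), S_n.var())           # close to 0 and 1
print(stats.kstest(S_n, "norm"))       # small statistic: close to N(0, 1)
```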

1) Factor the characteristic function using independence: $\psi_{S_n}(t)=\psi_{Y}\left(\frac{t}{\sqrt{n}}\right)^n$, where $\psi_Y$ is the common characteristic function of the $Y_i$.

2) Taylor-expand the characteristic function:
$$
\psi_{S_n}(t)=\left(1+\frac{it}{\sqrt{n}}\mathbb{E}(Y)-\frac{t^2}{2n}\mathbb{E}(Y^2)+\dots\right)^n
$$
which we can then truncate, absorbing the remainder into a bounded function $H(t)$:
$$
\psi_{S_n}(t)=\left(1+\frac{it}{\sqrt{n}}\mathbb{E}(Y)-\frac{t^2}{2n}\mathbb{E}(Y^2)+n^{-\frac{3}{2}}H(t)\right)^n
$$
3) Using $\mathbb{E}(Y)=0$ and $\mathbb{E}(Y^2)=\mathbb{V}(Y)=1$, we get:
$$
\psi_{S_n}(t)=\left(1-\frac{t^2}{2n}+n^{-\frac{3}{2}}H(t)\right)^n
$$
Taking logs and letting $n\to\infty$, we can then use
$$
\frac{\log(1+x)}{x}\to1\quad\text{as }x\to 0
$$
to show that $\log(\psi_{S_n}(t))\to-\frac{t^2}{2}$; exponentiating, we recover that $\psi_{S_n}(t)$ tends to $e^{-t^2/2}$, the characteristic function of a $N(0,1)$ distribution. (A numerical sketch of this convergence is given below.)
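As a hedged numerical sketch of steps 1)-3) (not a proof): take $Y=X-1$ with $X\sim\text{Exponential}(1)$, an illustrative assumption, so that $\mathbb{E}(Y)=0$, $\mathbb{V}(Y)=1$, and $\psi_Y(t)=e^{-it}/(1-it)$ in closed form. We can then watch $\psi_Y(t/\sqrt{n})^n$ approach $e^{-t^2/2}$:

```python
# Numerical sketch (illustrative assumption: X ~ Exponential(1), Y = X - 1,
# whose characteristic function is psi_Y(t) = exp(-it) / (1 - it)).
import numpy as np

def psi_Y(t):
    return np.exp(-1j * t) / (1 - 1j * t)

t = 2.0
for n in [10, 100, 1_000, 10_000]:
    approx = psi_Y(t / np.sqrt(n)) ** n
    # converges to exp(-t^2/2) = exp(-2) ~ 0.1353
    print(n, approx, np.exp(-t**2 / 2))
```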

My question is: how do we know that $H(t)$ is bounded, so that the $n^{-\frac{3}{2}}H(t)$ term tends to zero fast enough as $n\to\infty$? I thought it had something to do with the moments of a $N(0,1)$ distribution being finite, since we have $\mathbb{E}(Y)=0$ and $\mathbb{V}(Y)=1$, but I am not sure this is correct. I have tried to read papers online, but they usually just use little-$o$ notation for this part of the proof and don't explain how they know that we can essentially disregard this term. Thanks for any help.

Best Answer

I'm not sure that such an $H(t)$ exists in general (perhaps if $E(|X|^3)<\infty$). What is true is that, as long as $X$ has a finite variance, $$\phi_X(t)=1+i\mu t-\frac{\sigma^2}{2}t^2+h(t)$$ where $h(0)=0$ (obviously) and $h(t)/t^2\to0$ as $t\to0$.
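A hedged numerical illustration of $h(t)/t^2\to0$ (again assuming, for concreteness only, $Y=X-1$ with $X\sim\text{Exponential}(1)$, so $\mu=0$, $\sigma=1$, and $h(t)=\phi_Y(t)-1+\frac{t^2}{2}$):

```python
# Check that h(t)/t^2 -> 0 as t -> 0 for the assumed example
# Y = X - 1, X ~ Exponential(1), where phi_Y(t) = exp(-it) / (1 - it).
import numpy as np

def h(t):
    phi = np.exp(-1j * t) / (1 - 1j * t)
    return phi - 1 + t**2 / 2

for t in [1.0, 0.1, 0.01, 0.001]:
    print(t, abs(h(t)) / t**2)         # the ratio shrinks as t -> 0
```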

From now on, I'll take $\mu=0$ and $\sigma=1$ without any real loss of generality.

Fix $t\ne0$. Then $$\phi_{S_n}(t)=\left(1-\frac{t^2}{2n}+h(n^{-1/2}t)\right)^n.$$ For large enough $n$, $$\ln\phi_{S_n}(t)=n\ln\left(1-\frac{t^2}{2n}+h(n^{-1/2}t)\right) =n\left(-\frac{t^2}{2n}+h(n^{-1/2}t)\right) +n\Psi\left(-\frac{t^2}{2n}+h(n^{-1/2}t)\right)$$ where we define $$\Psi(x)=\ln(1+x)-x.$$ Now $$nh(n^{-1/2}t)=t^2\,\frac{h(n^{-1/2}t)}{(n^{-1/2}t)^2}\to0$$ as $n\to\infty$. Also $\Psi(x)/x^2\to-1/2$ as $x\to0$. If we take $x_n=-t^2/(2n)+h(n^{-1/2}t)$, then $x_n=O(1/n)$, so $$n\Psi(x_n)=nx_n^2\cdot\frac{\Psi(x_n)}{x_n^2}\to0$$ as $n\to\infty$, since $nx_n^2=O(1/n)$. Hence $\ln\phi_{S_n}(t)\to-\frac{t^2}{2}$.
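A numerical sketch of the two remainder estimates, under the same illustrative assumption $Y=X-1$, $X\sim\text{Exponential}(1)$: both $n\,h(n^{-1/2}t)$ and $n\Psi(x_n)$ vanish, so $\ln\phi_{S_n}(t)\to-t^2/2$.

```python
# Illustrative check that n*h(t/sqrt(n)) -> 0 and n*Psi(x_n) -> 0,
# assuming Y = X - 1 with X ~ Exponential(1).
import numpy as np

def h(t):
    return np.exp(-1j * t) / (1 - 1j * t) - 1 + t**2 / 2

def Psi(x):
    return np.log(1 + x) - x           # x is small and complex here

t = 2.0
for n in [10, 100, 1_000, 10_000]:
    x = -t**2 / (2 * n) + h(t / np.sqrt(n))
    print(n, abs(n * h(t / np.sqrt(n))), abs(n * Psi(x)))  # both -> 0
```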

Now where does $h$ come from? Well, $$h(t)=E\left(e^{itX}-1-itX+\frac{t^2X^2}2\right)=E(\eta(tX))$$ where $$\eta(u)=e^{iu}-1-iu+\frac{u^2}2=\sum_{k=3}^\infty\frac{(iu)^k}{k!}.$$ The standard Taylor-remainder estimate gives $|\eta(u)|\le\min\left(u^2,\frac{|u|^3}{6}\right)$, so $\eta(u)/u^2$ is bounded and tends to $0$ as $u\to0$. Then $$E(\eta(tX))=t^2E\left(\frac{\eta(tX)}{t^2}\right),$$ and $$E\left(\frac{\eta(tX)}{t^2}\right)\to0$$ as $t\to0$ by dominated convergence: writing $\frac{\eta(tX)}{t^2}=X^2\cdot\frac{\eta(tX)}{(tX)^2}$, we have $\left|\frac{\eta(tX)}{t^2}\right|\le X^2$, which is integrable because $X$ has finite variance, and $\frac{\eta(tX)}{(tX)^2}\to0$ pointwise (on $\{X\ne0\}$; on $\{X=0\}$ the quantity is $0$ anyway).
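The Taylor-remainder bound $|\eta(u)|\le\min\left(u^2,\frac{|u|^3}{6}\right)$ quoted above can also be checked numerically (a sketch, not a proof):

```python
# Check the bound |eta(u)| <= min(u^2, |u|^3 / 6); this is what makes
# eta(u)/u^2 bounded, so that dominated convergence applies with
# dominating function X^2.
import numpy as np

def eta(u):
    return np.exp(1j * u) - 1 - 1j * u + u**2 / 2

u = np.linspace(-50, 50, 100_001)
u = u[u != 0]                          # avoid dividing by zero at u = 0
bound = np.minimum(u**2, np.abs(u)**3 / 6)
print(np.all(np.abs(eta(u)) <= bound + 1e-12))  # True
print(np.max(np.abs(eta(u)) / u**2))            # bounded by 1
```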
