Stuck on proving a variation of the Central Limit Theorem.

Tags: central-limit-theorem, probability, probability-theory, probability-limit-theorems

I'm trying to prove the following version of the central limit theorem.

Let $$L^n = (L_1^n, \dots, L_n^n)$$

be such that, for each $n$, the $L_i^n$ are i.i.d.,

there exists a sequence of constants $(K_n)$ such that $|L_i^n|\leq K_n$ for all $i$ and $K_n \to 0$,

and it holds that for $Z_n= \sum_{i=1}^nL_i^n$ we have $\mathbb{E}[Z_n] \to \mu$ and $\operatorname{Var}(Z_n) \to \sigma^2$.

Then we have that $Z_n$ converges in distribution to $Z$ where $Z$ is normal with mean $\mu$ and variance $\sigma^2$.

I think it should be done with characteristic functions plus Lévy's continuity theorem. The characteristic function will factor by the i.i.d. hypothesis, but I don't see how to incorporate the other two assumptions to arrive at the desired result.
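As a sanity check on the statement itself, here is a quick simulation with one concrete triangular array satisfying the hypotheses (my own choice, not part of the problem): $L_i^n = \mu/n + \sigma U_i/\sqrt{n}$ with $U_i$ i.i.d. uniform on $[-\sqrt{3},\sqrt{3}]$, so that $|L_i^n| \le \mu/n + \sigma\sqrt{3/n} \to 0$, $\mathbb{E}[Z_n] = \mu$ and $\operatorname{Var}(Z_n) = \sigma^2$ for every $n$:

```python
import numpy as np

# Hypothetical triangular array satisfying the hypotheses:
#   L_i^n = mu/n + sigma * U_i / sqrt(n),  U_i ~ Uniform[-sqrt(3), sqrt(3)],
# so Var(U_i) = 1, |L_i^n| <= mu/n + sigma*sqrt(3/n) =: K_n -> 0,
# E[Z_n] = mu and Var(Z_n) = sigma^2 for every n.
rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
n, trials = 2000, 50_000

U = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(trials, n))
Z = (mu / n + sigma * U / np.sqrt(n)).sum(axis=1)  # samples of Z_n

print(Z.mean())  # should be close to mu = 1
print(Z.var())   # should be close to sigma^2 = 4
# Sample skewness; a normal distribution has skewness 0:
print(((Z - Z.mean())**3).mean() / Z.std()**3)
```

The empirical mean, variance, and skewness are consistent with $Z_n$ approaching $\mathcal{N}(\mu, \sigma^2)$, but of course this is only evidence, not a proof.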

Best Answer

Yes, you are right: the argument uses Lévy's continuity theorem and runs like the proof of the CLT in the i.i.d. case.

Let $\mu_n = \mathbb{E}\left[L^n_1\right]$ and $\sigma_n^2 = \operatorname{Var}\left(L^n_1\right)$. Then by the assumptions on $Z_n$, $$\mathbb{E}\left[Z_n\right] = n \mu_n \to \mu \quad \text{ and } \quad \operatorname{Var}\left(Z_n\right) = n \sigma_n^2 \to \sigma^2, \qquad (*)$$ which implies that $\mu_n = O\left(\frac{1}{n}\right)$ and $\sigma^2_n = O\left(\frac{1}{n}\right)$.

In a similar spirit to the estimates derived in the proof of the standard CLT (see for example Section 18.3 of D. Williams's Probability with Martingales), let $R_2(x) = e^{ix} - \left(1 + ix - \frac{x^2}{2}\right)$ be the remainder of the second-order Taylor approximation of $e^{ix}$; it satisfies $\left|R_2(x)\right| \leq \frac{|x|^3}{6}$ (see Williams's book if it isn't clear to you why this is true). Since $\mathbb{E}\left[L_1^n - \mu_n\right] = 0$, we have for the characteristic function of $L_1^n - \mu_n$ the following estimate, $$\varphi_{L_1^n - \mu_n}\left(\theta\right) = \mathbb{E}\left[\exp{\left(i\theta \left(L_1^n - \mu_n\right)\right)}\right] = 1 - \frac{\theta^2\sigma_n^2}{2} + \mathbb{E}\left[R_2\left(\theta \left(L_1^n - \mu_n\right)\right)\right] = 1 - \frac{\theta^2\sigma_n^2}{2} + R^n\left(\theta\right),$$ where $\left|R^n\left(\theta\right)\right| \leq \frac{\left|\theta\right|^3}{6}\mathbb{E}\left[\left|L_1^n - \mu_n\right|^3\right]$. Applying Hölder and the bound on $L_1^n$, we have $$\left|R^n\left(\theta\right)\right| \leq \frac{\left|\theta\right|^3}{6}\left\|L_1^n - \mu_n\right\|_\infty\mathbb{E}\left[\left(L_1^n - \mu_n\right)^2\right]\leq \frac{\left|\theta\right|^3}{6}\left(K_n + \left|\mu_n\right|\right)\mathbb{E}\left[\left(L_1^n - \mu_n\right)^2\right] = \frac{\left|\theta\right|^3}{6}\left(K_n + \left|\mu_n\right|\right) \sigma_n^2,$$ where $\|\cdot\|_\infty$ is the sup-norm. Since $K_n, \left|\mu_n\right| \to 0$ as $n\to \infty$ and $\sigma^2_n = O\left(\frac{1}{n}\right)$, the above implies that $R^n\left(\theta\right) = o\left(\frac{1}{n}\right)$.
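The Taylor-remainder bound $\left|R_2(x)\right| \leq \frac{|x|^3}{6}$ used above can be sanity-checked numerically on a grid (a check, not a proof):

```python
import numpy as np

# Check |R_2(x)| <= |x|^3 / 6 for the second-order Taylor remainder
# R_2(x) = e^{ix} - (1 + ix - x^2/2) of e^{ix}, on a grid of real x.
x = np.linspace(-10, 10, 100_001)
R2 = np.exp(1j * x) - (1 + 1j * x - x**2 / 2)

# Largest violation of the bound; should be <= 0 up to rounding error.
max_gap = np.max(np.abs(R2) - np.abs(x)**3 / 6)
print(max_gap)
```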

Denote by $\varphi_{Z_n}$ the characteristic function of $Z_n$, which is given by $$\varphi_{Z_n}\left(\theta\right) = \mathbb{E}\left[\exp{\left(i\theta Z_n\right)}\right] = \prod_{i = 1}^n\varphi_{L_i^n}\left(\theta\right) = e^{i\theta \mu_n n}\left(\varphi_{L_1^n - \mu_n}\left(\theta\right)\right)^n. $$ Taking logarithms, expanding $\log\left(1 + x\right) = x + O\left(x^2\right)$ and applying the asymptotics we derived together with $(*)$, we get \begin{gather} \log{\varphi_{Z_n}\left(\theta\right)} = i\theta \mu_n n + n\log{\left(1 - \frac{\theta^2 \sigma_n^2}{2} + R^n\left(\theta\right)\right)} \\ = i\theta \mu_n n - \frac{\theta^2\sigma_n^2 n}{2} + n R^n\left(\theta\right) + O\left(\frac{1}{n}\right) \xrightarrow{n \to \infty} i\theta \mu - \frac{\theta^2\sigma^2}{2}, \end{gather} since $nR^n\left(\theta\right) \to 0$. Finally, Lévy's continuity theorem lets us conclude that $Z_n$ converges weakly to $Z \sim \mathcal{N}\left(\mu, \sigma^2\right)$.
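The pointwise convergence $\varphi_{Z_n}(\theta) \to e^{i\theta\mu - \theta^2\sigma^2/2}$ can be seen explicitly in one concrete instance (my own example, not from the question): take $L_i^n = \pm 1/\sqrt{n}$ with probability $\tfrac12$ each, so $\mu = 0$, $\sigma^2 = 1$, $K_n = 1/\sqrt{n}$, and $\varphi_{Z_n}(\theta) = \cos\left(\theta/\sqrt{n}\right)^n$, which should approach $e^{-\theta^2/2}$:

```python
import numpy as np

# For the symmetric Bernoulli array L_i^n = +/- 1/sqrt(n) (mu = 0,
# sigma^2 = 1, K_n = 1/sqrt(n)), the characteristic function of Z_n is
# exactly cos(theta/sqrt(n))^n.  Compare it with exp(-theta^2/2), the
# characteristic function of N(0, 1), for growing n.
theta = np.linspace(-3, 3, 301)
errs = {}
for n in (10, 100, 10_000):
    phi_n = np.cos(theta / np.sqrt(n))**n
    errs[n] = np.max(np.abs(phi_n - np.exp(-theta**2 / 2)))
    print(n, errs[n])  # maximum error shrinks as n grows
```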