Why does the Central Limit Theorem break down for the Poisson Process

probability-theory, stochastic-processes

I am considering this in the sense that I know, according to the central limit theorem, that for an i.i.d. process $X_n$ (with mean $m$ and variance $\sigma^2$), the corresponding normalized sum process is
$$
Z_n = \frac{S_n-nm}{\sigma\sqrt{n}},
$$
with $S_n = X_1+X_2+\dots+X_n$, and that this does indeed converge in distribution to a zero-mean, unit-variance Gaussian. My question is why this does not happen for the Poisson process. I am speaking of the Poisson process derived as a limit of the binomial counting process, where $n$, the number of infinitesimal intervals, goes to $\infty$, and $p$, the success probability, goes to $0$, while their product $np$ stays constant at $\lambda t$. I believe that if the CLT had worked here, we would have obtained a Gaussian $N(t)$ instead of a discrete $N(t)$.
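To see numerically what this limit looks like, here is a minimal sketch (my own illustration, not part of the original question; the value $\lambda t = 4$ and the use of NumPy/SciPy are assumptions). It compares the $\mathrm{Binomial}(n,\lambda t/n)$ law of $N(t)$ with the $\mathrm{Poisson}(\lambda t)$ pmf and with the Gaussian density of matching mean and variance that the CLT would suggest:

```python
# Rough numerical sketch: Binomial(n, lambda*t/n) versus Poisson(lambda*t)
# versus a Gaussian with the same mean and variance, evaluated at the integers.
# The parameter value lambda*t = 4 is an arbitrary choice for illustration.
import numpy as np
from scipy.stats import binom, norm, poisson

lam_t = 4.0            # assumed value of lambda * t
ks = np.arange(0, 15)  # counts at which the three distributions are compared

for n in (10, 100, 10_000):
    p = lam_t / n
    pmf_binom = binom.pmf(ks, n, p)
    pmf_poisson = poisson.pmf(ks, lam_t)
    gauss = norm.pdf(ks, loc=n * p, scale=np.sqrt(n * p * (1 - p)))
    print(f"n={n:6d}"
          f"  max|Binomial-Poisson|={np.max(np.abs(pmf_binom - pmf_poisson)):.4f}"
          f"  max|Binomial-Gaussian|={np.max(np.abs(pmf_binom - gauss)):.4f}")
```

As $n$ grows, the gap to the Poisson pmf shrinks toward zero, while the gap to the Gaussian density settles at a nonzero value (it would only become small if $\lambda t$ itself were large).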

For the purposes of exploring this problem, consider the Taylor expansion of the characteristic function:
$$E\left[\exp\left(-\frac{j \omega}{\sigma \sqrt{n}}(X_1-m)\right)\right]=\sum\limits_{k=0}^{\infty} \frac{1}{k!}\left(-\frac{j \omega}{\sigma \sqrt{n}}\right)^k E \left[(X_1-m)^k\right]$$

I would like to examine the higher-order terms when $X_1$ is
$$\mathrm{Bernoulli}\left(\frac{\lambda t}{n}\right),$$
as in the Poisson process. Can we say that these terms are really negligible compared to the terms for $k=0,1,2$?
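A rough numerical check of this point (a sketch added for illustration, not part of the original question; the value $\lambda t = 4$ is an assumption): the $k$-th term contributes roughly $n\cdot\frac{1}{k!}\,\frac{E[(X_1-m)^k]}{(\sigma\sqrt{n})^k}$ to the exponent of the characteristic function of the sum, which vanishes for $k\ge 3$ in the classical CLT but not here, because $\sigma\sqrt{n}=\sqrt{np(1-p)}$ stays close to $\sqrt{\lambda t}$ instead of growing.

```python
# Sketch: size of the k-th term's contribution to n * log E[exp(...)], dropping
# the (j*omega)^k factor, when X_1 ~ Bernoulli(lambda*t / n).
# In the classical CLT this quantity is O(n^(1 - k/2)) and vanishes for k >= 3;
# here it settles near (lambda*t)^(1 - k/2) / k! because sigma*sqrt(n) ~ sqrt(lambda*t).
import math

lam_t = 4.0  # assumed value of lambda * t

def centered_moment(p, k):
    """E[(X - p)^k] for X ~ Bernoulli(p)."""
    return (1 - p) * (-p) ** k + p * (1 - p) ** k

for n in (10, 100, 10_000, 1_000_000):
    p = lam_t / n
    sigma = math.sqrt(p * (1 - p))
    for k in (3, 4):
        contrib = n * centered_moment(p, k) / (math.factorial(k) * (sigma * math.sqrt(n)) ** k)
        print(f"n={n:9d}  k={k}  contribution={contrib:.4f}")
```

The $k=3$ and $k=4$ values do not decay with $n$; they approach $1/(6\sqrt{\lambda t})$ and $1/(24\,\lambda t)$ respectively, so the higher-order terms cannot be neglected.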

Any help would be greatly appreciated.
Thanks!

Best Answer

The frame of the CLT is that one considers sums of i.i.d. random variables with a fixed distribution. When the number $n$ of summands becomes large, after centering (by the mean) and scaling (by $1/\sqrt{n}$), the limit in distribution is normal. The frame described in the revised version of the question is quite different, since one considers binomial distributions $B(n,p_n)$ where $n$ becomes large (just like before) and $p_n$ becomes small (quite unlike before), with, moreover, $np_n\to\lambda$ for some positive $\lambda$.

In other words, the distribution of the individual summands now changes with $n$. No wonder the results are different! In the latter case, the limit in distribution (without centering or scaling) is Poisson (with parameter $\lambda$), a result which is sometimes called the law of rare events.
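A quick way to see the law of rare events at work (a sketch added for illustration, not part of the original answer; the value $\lambda = 4$ is an assumption) is to compute the total variation distance between $B(n,\lambda/n)$ and $\mathrm{Poisson}(\lambda)$ as $n$ grows:

```python
# Sketch: total variation distance between Binomial(n, lambda/n) and Poisson(lambda).
# Le Cam's inequality bounds it by n * (lambda/n)^2 = lambda^2 / n, so it shrinks with n.
import numpy as np
from scipy.stats import binom, poisson

lam = 4.0                # assumed fixed value of lambda
ks = np.arange(0, 200)   # support large enough to carry essentially all the mass

for n in (10, 100, 1_000, 10_000):
    tv = 0.5 * np.sum(np.abs(binom.pmf(ks, n, lam / n) - poisson.pmf(ks, lam)))
    print(f"n={n:6d}  total variation distance = {tv:.5f}")
```

No centering or scaling is applied here: the binomial law itself converges to the Poisson law.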

Edit: The newest version of the question concerns the characteristic function of the random variable $\xi_n=(S_n-m_n)/\sigma_n$, where $S_n$ is $B(n,p_n)$, $m_n=\mathbb E(S_n)$ and $\sigma_n^2=\mathrm{var}(S_n)$. Since the presentation of this context in the question seems to be rather confused, it may be useful to recall that $m_n=np_n$ and $\sigma_n^2=np_n(1-p_n)$, hence, in the asymptotics considered here, $m_n\to\lambda$ and $\sigma_n^2\to\lambda$. In particular, neither $m_n$ nor $\sigma_n^2$ is linear in $n$. Keeping this in mind, you might try to expand the characteristic function of $\xi_n$ anew.
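Carrying out that expansion (a sketch added here, not part of the original answer; it uses the question's convention $\mathbb E[e^{-j\omega(\cdot)}]$ and writes $\lambda$ for what the question calls $\lambda t$) gives
$$
\log \mathbb E\left[e^{-j\omega\,\xi_n}\right]
= \frac{j\omega\, m_n}{\sigma_n} + n\log\left(1-p_n+p_n\,e^{-j\omega/\sigma_n}\right)
\;\longrightarrow\;
j\omega\sqrt{\lambda} + \lambda\left(e^{-j\omega/\sqrt{\lambda}}-1\right)
= -\frac{\omega^2}{2} + \frac{j\omega^3}{6\sqrt{\lambda}} + \cdots
$$
The limit is the log-characteristic function of a centered, scaled Poisson variable, not $-\omega^2/2$ alone: the third-order term survives because $\sigma_n\to\sqrt{\lambda}$ stays bounded instead of growing like $\sqrt{n}$, and it would only disappear if $\lambda$ itself were sent to infinity.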
