[Math] How Binomial and Normal distributions approximate Poisson distribution respectively

probability, probability-distributions, real-analysis

From Wikipedia:

  1. In some cases, the cdf of the Poisson distribution is the limit of
    the cdf of the binomial distribution:

    The Poisson distribution can be derived as a limiting case to the
    binomial distribution $\text{bin}(n,p)$ as the number $n$ of trials
    goes to infinity and the expected number $np$ of successes remains
    fixed — see law of rare events below. Therefore it can be used as an
    approximation of the binomial distribution if $n$ is sufficiently
    large and $p$ is sufficiently small.

  2. In some cases, the cdf of the Poisson distribution is the limit of
    the cdf of the normal distribution:

    For sufficiently large values of $λ$ (say $λ>1000$), the normal
    distribution with mean $λ$ and variance $λ$ (standard deviation
    $\sqrt{\lambda}$), is an excellent approximation to the Poisson
    distribution. If $λ$ is greater than about $10$, then the normal
    distribution is a good approximation if an appropriate continuity
    correction is performed, i.e., $P(X ≤ x)$, where (lower-case) $x$ is a
    non-negative integer, is replaced by $P(X ≤ x + 0.5)$. $$
    F_\mathrm{Poisson}(x;\lambda) \approx F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda)\, $$
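Both quoted approximations can be checked numerically. A small sketch using `scipy.stats` (the parameter choices below are illustrative, not prescribed by the quoted text) measures the sup-norm distance between the cdfs in each case:

```python
# Numerical look at both quoted approximations, via scipy.stats.
import numpy as np
from scipy.stats import binom, poisson, norm

lam = 5.0
xs = np.arange(0, 30)

# 1. Binomial(n, lam/n) -> Poisson(lam) as n grows with np = lam fixed:
#    the sup-norm distance between the cdfs shrinks as n increases.
for n in (10, 100, 1000):
    sup_diff = np.max(np.abs(binom.cdf(xs, n, lam / n) - poisson.cdf(xs, lam)))
    print(f"n={n:5d}: sup |F_bin - F_Poisson| = {sup_diff:.5f}")

# 2. Normal(lam, lam) approximates Poisson(lam) for large lam, using the
#    continuity correction P(X <= x) ~ Phi((x + 0.5 - lam)/sqrt(lam)).
for lam2 in (10.0, 100.0, 1000.0):
    xs2 = np.arange(0, int(lam2 + 10 * np.sqrt(lam2)))
    approx = norm.cdf(xs2 + 0.5, loc=lam2, scale=np.sqrt(lam2))
    sup_diff = np.max(np.abs(approx - poisson.cdf(xs2, lam2)))
    print(f"lambda={lam2:6.0f}: sup |F_normal - F_Poisson| = {sup_diff:.5f}")
```

In both cases the printed sup-norm gaps shrink as $n$ (respectively $\lambda$) grows, which already hints that the convergence is uniform over $x$, not merely pointwise.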

I wonder in what sense these functional approximations hold: pointwise, uniformly, in $L_2$, …?
Thanks and regards!

Best Answer

In comments above, Tim asked why it must be that if $X\sim\mathrm{Poisson}(\lambda)$ and $Y\sim\mathrm{Poisson}(\mu)$ and $X$ and $Y$ are independent, then we must have $X+Y\sim\mathrm{Poisson}(\lambda+\mu)$.

Here's one way to show that.
\begin{align}
& \Pr(X+Y= w) \\[8pt]
= {} & \Pr\Big( (X=0\ \&\ Y=w)\text{ or }(X=1\ \&\ Y=w-1) \\
& \qquad\qquad\text{ or } (X=2\ \&\ Y=w-2)\text{ or } \cdots \text{ or }(X=w\ \&\ Y=0)\Big) \\[8pt]
= {} & \sum_{u=0}^w \Pr(X=u)\Pr(Y=w-u)\qquad(\text{independence was used here}) \\[8pt]
= {} & \sum_{u=0}^w \frac{\lambda^u e^{-\lambda}}{u!} \cdot \frac{\mu^{w-u} e^{-\mu}}{(w-u)!} \\[8pt]
= {} & e^{-(\lambda+\mu)} \sum_{u=0}^w \frac{1}{u!(w-u)!} \lambda^u\mu^{w-u} \\[8pt]
= {} & \frac{e^{-(\lambda+\mu)}}{w!} \sum_{u=0}^w \frac{w!}{u!(w-u)!} \lambda^u\mu^{w-u} \\[8pt]
= {} & \frac{e^{-(\lambda+\mu)}}{w!} (\lambda+\mu)^w
\end{align}
where the last step is the binomial theorem applied to $(\lambda+\mu)^w$, and that is what was to be shown.
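The convolution identity derived above can also be verified numerically. A minimal stdlib-only sketch (`poisson_pmf` is a helper defined here, and the rates $\lambda = 2$, $\mu = 3.5$ are arbitrary choices for illustration):

```python
# Sanity check of the derivation: the convolution sum for the pmf of
# X + Y, with X ~ Poisson(lam) and Y ~ Poisson(mu) independent, matches
# the Poisson(lam + mu) pmf term by term.
import math

def poisson_pmf(k, rate):
    # P(X = k) for X ~ Poisson(rate)
    return rate**k * math.exp(-rate) / math.factorial(k)

lam, mu = 2.0, 3.5
for w in range(10):
    conv = sum(poisson_pmf(u, lam) * poisson_pmf(w - u, mu)
               for u in range(w + 1))
    direct = poisson_pmf(w, lam + mu)
    assert abs(conv - direct) < 1e-12
    print(f"w={w}: convolution={conv:.6f}, Poisson(lam+mu)={direct:.6f}")
```

The two columns agree to floating-point precision, mirroring the algebraic identity line for line.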