Poisson Distribution – How Negative Binomial MGF Converges to Poisson MGF

Tags: convergence, moment-generating-function, negative-binomial-distribution, poisson-distribution, self-study

This question is Exercise 3.15 in Statistical Inference by Casella and Berger. It asks us to show that the MGF of a Negative Binomial $\mathscr{N}eg(r,p)$ distribution converges to the MGF of a Poisson $\mathscr{P}(\lambda)$ distribution when
$$r\to\infty\,, \quad p\to 1\,, \quad r(1-p)\to\lambda$$

The formula I have for the MGF of $X\sim \mathscr{N}eg(r,p)$ is:

$$M_X(t) = \frac{p^r}{[1-e^t(1-p)]^r}$$

Considering just the denominator, we have
$$[1-e^t(1-p)]^r = \left[1+\tfrac{1}{r}e^t\,r(p-1)\right]^r \approx \left[1+\tfrac{1}{r}(-\lambda e^t)\right]^r\,.$$
As $r\to\infty$, this converges to $e^{-\lambda e^t}$. Now considering the entire formula again and letting $r\to\infty$ and $p\to 1$, I get $e^{\lambda e^t}$, which is incorrect since the MGF of a Poisson($\lambda$) is $e^{\lambda(e^t-1)}$. I seem to be on the right track but have made a misstep somewhere. Can anyone spot my mistake?

Best Answer

Your mistake is ignoring the factor $p^r$, which does not converge to $1$. Starting from your MGF
$$M_X(t) = \frac{p^r}{[1-e^t(1-p)]^r}\,,$$
take logarithms:
$$\log\{M_X(t)\} = r\log(p)-r\log\{1-e^t(1-p)\}\,.$$
Using the first-order expansion $\log(1-x)\approx -x$ as $x\to 0$, together with $r(1-p)\to\lambda$,
\begin{align*} r\log(p)-r\log\{1-e^t(1-p)\} &= r\log(1-[1-p])-r\log\{1-e^t(1-p)\}\\ &\approx -r[1-p]+re^t(1-p)\\ &\to \lambda[e^t-1]\,, \end{align*}
which shows that the limiting value of the MGF is
$$\exp\{\lambda[e^t-1]\}\,,$$
as requested in the exercise. In particular, $p^r = \exp\{r\log p\} \to e^{-\lambda}$, not $1$: this is exactly the term your argument dropped.
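To see the role of the $p^r$ factor numerically, here is a small sketch (again with $\lambda = 2$ and $t = 0.5$ as arbitrary choices) that tracks the two terms of $\log\{M_X(t)\}$ separately: $r\log(p)\to-\lambda$ while $-r\log\{1-e^t(1-p)\}\to\lambda e^t$, and only their sum gives the Poisson exponent.

```python
import math

# Sketch tracking the two terms of log M_X(t) separately (lambda = 2, t = 0.5 arbitrary).
lam, t = 2.0, 0.5

for r in [10, 100, 1_000, 10_000]:
    p = 1 - lam / r                                     # so that r * (1 - p) = lambda
    term1 = r * math.log(p)                             # -> -lambda  (the ignored p^r factor)
    term2 = -r * math.log(1 - math.exp(t) * (1 - p))    # -> lambda * e^t
    print(f"r = {r:6d}:  r*log(p) = {term1:+.4f},  "
          f"-r*log(1 - e^t(1-p)) = {term2:+.4f},  sum = {term1 + term2:+.4f}")

print(f"limits: -lam = {-lam:+.4f},  lam*e^t = {lam * math.exp(t):+.4f},  "
      f"lam*(e^t - 1) = {lam * (math.exp(t) - 1):+.4f}")
```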

Note: There are two versions of the MGF for a $\mathscr{N}eg(r,p)$ distribution, one for the number of trials and one for the number of failures. The version used above is the MGF for the number of failures, which starts at zero like the Poisson distribution.
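For concreteness, here is a short sketch (with $r=5$, $p=0.6$, $t=0.3$ chosen arbitrarily, subject to $(1-p)e^t<1$) that checks the failure-count MGF above against a truncated expectation computed from scipy.stats.nbinom, which also parameterizes the distribution by the number of failures:

```python
import numpy as np
from scipy import stats

# Sketch checking the failure-count MGF against a truncated sum of e^{tk} * P(X = k)
# (r = 5, p = 0.6, t = 0.3 are arbitrary; the MGF exists because (1 - p) * e^t < 1).
r, p, t = 5, 0.6, 0.3

k = np.arange(0, 500)                                   # generous truncation point
mgf_by_sum = np.sum(np.exp(t * k) * stats.nbinom.pmf(k, r, p))
mgf_closed_form = (p / (1 - (1 - p) * np.exp(t))) ** r

print(mgf_by_sum, mgf_closed_form)    # the two values should agree to many decimal places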
