Distributions – Is the Poisson Distribution Stable and Are There Inversion Formulas for the Moment Generating Function?

distributions, moment-generating-function, poisson-distribution

First, I have a question about whether the Poisson distribution is "stable" or not. Very naively (and I'm not too sure about "stable" distributions), I worked out the distribution of a linear combination of Poisson-distributed R.V.'s, using the product of the MGFs. It looks like I get another Poisson, with parameter equal to the linear combination of the parameters of the individual R.V.'s. So I conclude that Poisson is "stable". What am I missing?

Second, are there inversion formulas for the MGF like there are for the characteristic function?

Best Answer

Linear combinations of Poisson random variables

As you've calculated, the moment-generating function of the Poisson distribution with rate $\lambda$ is $$ m_X(t) = \mathbb E e^{t X} = e^{\lambda (e^t - 1)} \>. $$
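As a quick sanity check on this closed form (not part of the original answer; the rate $\lambda = 2$ and argument $t = 0.5$ are just illustrative choices), one can compare it against a Monte Carlo estimate of $\mathbb E e^{tX}$:

```python
import numpy as np

# Compare the closed-form Poisson MGF e^{lambda (e^t - 1)}
# with a Monte Carlo estimate of E[e^{tX}].
rng = np.random.default_rng(0)
lam, t = 2.0, 0.5                          # illustrative parameter choices
x = rng.poisson(lam, size=1_000_000)
mc = np.exp(t * x).mean()                  # Monte Carlo estimate of E[e^{tX}]
closed_form = np.exp(lam * (np.exp(t) - 1))
print(mc, closed_form)                     # the two agree to a few decimals
```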

Now, let's focus on a linear combination of independent Poisson random variables $X$ and $Y$. Let $Z = a X + b Y$. Then, $$ m_Z(t) = \mathbb Ee^{tZ} = \mathbb E e^{t (a X + b Y)} = \mathbb E e^{t(aX)} \mathbb E e^{t (bY)} = m_X(at) m_Y(bt) \>. $$

So, if $X$ has rate $\lambda_x$ and $Y$ has rate $\lambda_y$, we get $$ m_Z(t) = \exp({\lambda_x (e^{at} - 1)}) \exp({\lambda_y (e^{bt} - 1)}) = \exp(\lambda_x e^{at} + \lambda_y e^{bt} - (\lambda_x + \lambda_y))\>, $$ and this cannot, in general, be written in the form $\exp(\lambda(e^t - 1))$ for some $\lambda$ unless $a = b = 1$.
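A simple way to see this numerically (a sketch with illustrative values $a = 2$, $b = 3$, $\lambda_x = 1.5$, $\lambda_y = 0.7$): a Poisson variable must have mean equal to variance, but for $Z = aX + bY$ the mean is $a\lambda_x + b\lambda_y$ while the variance is $a^2\lambda_x + b^2\lambda_y$, and these differ whenever $a, b \neq 1$.

```python
import numpy as np

# Z = a*X + b*Y with a, b != 1 has mean != variance,
# so Z cannot be Poisson (a Poisson variable has mean == variance).
rng = np.random.default_rng(1)
a, b, lam_x, lam_y = 2.0, 3.0, 1.5, 0.7    # illustrative choices
x = rng.poisson(lam_x, size=1_000_000)
y = rng.poisson(lam_y, size=1_000_000)
z = a * x + b * y
# mean = a*lam_x + b*lam_y = 5.1; variance = a^2*lam_x + b^2*lam_y = 12.3
print(z.mean(), z.var())
```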

Inversion of moment-generating functions

If the moment generating function exists in a neighborhood of zero, then it also exists as a complex-valued function in an infinite strip around zero. This makes inversion by contour integration available in many cases. Indeed, the Laplace transform $\mathcal L(s) = \mathbb E e^{-s T}$ of a nonnegative random variable $T$ is a common tool in stochastic-process theory, particularly for analyzing stopping times. Note that $\mathcal L(s) = m_T(-s)$ for real-valued $s$. You should prove as an exercise that the Laplace transform always exists for $s \geq 0$ for nonnegative random variables.
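To make the relation $\mathcal L(s) = m_T(-s)$ concrete, here is a small check (not from the answer) using a case with a known closed form: for $T \sim \text{Exponential}(1)$, $\mathcal L(s) = 1/(1+s)$.

```python
import numpy as np

# Check L(s) = E[e^{-sT}] against the closed form 1/(1+s)
# for T ~ Exponential(rate = 1).
rng = np.random.default_rng(2)
t_samples = rng.exponential(scale=1.0, size=1_000_000)
for s in (0.5, 1.0, 2.0):
    mc = np.exp(-s * t_samples).mean()   # Monte Carlo estimate of L(s)
    exact = 1 / (1 + s)
    print(s, mc, exact)
```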

Inversion can then be accomplished either via the Bromwich integral or the Post inversion formula. A probabilistic interpretation of the latter can be found as an exercise in several classical probability texts.
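As a sketch of the Post inversion formula in action, the approximation of order $k$ is $f(t) \approx \frac{(-1)^k}{k!} \left(\frac{k}{t}\right)^{k+1} \mathcal L^{(k)}(k/t)$, which converges to the density as $k \to \infty$. Below it is applied to the known pair $T \sim \text{Exponential}(1)$, $\mathcal L(s) = 1/(1+s)$, $f(t) = e^{-t}$; the function name and parameter values are illustrative.

```python
import sympy as sp

s = sp.symbols('s', positive=True)
L = 1 / (1 + s)   # Laplace transform of the Exponential(1) density e^{-t}

def post_inversion(L_expr, t, k):
    """Order-k Post inversion approximation of the density f(t)."""
    deriv = L_expr.diff(s, k)                 # k-th derivative of L
    val = deriv.subs(s, sp.Rational(k) / t)   # evaluated at s = k/t
    return float((-1) ** k / sp.factorial(k)
                 * (sp.Rational(k) / t) ** (k + 1) * val)

approx = post_inversion(L, sp.Rational(1), 50)   # approximate f(1)
exact = float(sp.exp(-1))
print(approx, exact)   # close, and closer still as k grows
```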

Though not directly related, you may be interested in the following note as well.

J. H. Curtiss (1942), A note on the theory of moment generating functions, Ann. Math. Stat., vol. 13, no. 4, pp. 430–433.

The associated theory is more commonly developed for characteristic functions, since these are fully general: they exist for all distributions, with no support or moment restrictions.
