MGF of a Binomial-Exponential Compound Distribution

moment-generating-functions, probability, random-walk, stochastic-processes

I have the following compound binomial-exponential distribution:

$Z \sim \text{Binom}(N,p)$

$p = (1 - e^{-\lambda_{1}Y})$

$Y \sim \text{Exp}(\lambda_0)$

$N=1$

I am attempting to follow the derivation here to obtain the Moment Generating Function (MGF) of this rather nasty distribution, in order to then recover the PDF. However, I am confused as to exactly how to determine the MGF with my additional parameter $p$.

Attempt at a solution:

I know that, for this distribution, the expectation value should be:

$E(E(Z|Y)) = \frac{N \lambda_1}{\lambda_{0} + \lambda_{1}}$

which I can use to check the solution.
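As a quick sanity check (not part of the original derivation), this expectation can be verified by Monte Carlo; the parameter values $\lambda_0 = 2$, $\lambda_1 = 3$ are arbitrary illustrative choices:

```python
import numpy as np

# Monte Carlo check of E[E[Z|Y]] = N*lam1/(lam0 + lam1) for N = 1.
# lam0, lam1 are illustrative values, not from the original post.
rng = np.random.default_rng(0)
lam0, lam1, n_samples = 2.0, 3.0, 1_000_000

y = rng.exponential(scale=1.0 / lam0, size=n_samples)  # Y ~ Exp(lam0)
p = 1.0 - np.exp(-lam1 * y)                            # random success probability
z = rng.random(n_samples) < p                          # Z | Y ~ Bernoulli(p)

print(z.mean())               # Monte Carlo estimate of E[Z]
print(lam1 / (lam0 + lam1))   # theoretical value, here 3/5 = 0.6
```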

From the same solution as linked, it is stated (paraphrasing):

'the Moment Generating Function (MGF) for the compound Probability Density Function (PDF), Z, is the Probability Generating Function (PGF) of the Binomial distribution evaluated at the MGF for Exponential Distribution'

The PGF of the binomial distribution is given by (see link):

$G(z)=\sum_{i=0}^{N}\binom{N}{i}(z p)^{i} q^{N-i}=(z p+q)^{N}$

The moment generating function of an exponential distribution is (see link):

$M_{Y}(t)=\int_{0}^{\infty} \lambda_{0}\, e^{(t-\lambda_{0}) x}\, dx=\frac{\lambda_{0}}{\lambda_{0}-t}, \quad t<\lambda_{0}$
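This formula is easy to confirm numerically (a sketch with illustrative values of $\lambda_0$ and $t$, not from the original post):

```python
import numpy as np
from scipy.integrate import quad

# Numerical check of the exponential MGF: the integral of
# lam0 * exp((t - lam0) * x) over (0, inf) should equal
# lam0 / (lam0 - t), valid for t < lam0.
lam0, t = 2.0, 0.5
val, _ = quad(lambda x: lam0 * np.exp((t - lam0) * x), 0, np.inf)
print(val, lam0 / (lam0 - t))  # the two values should agree
```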

Now, I am unsure how I bring $p$ into the picture, in particular whether I substitute in this case for $p$ and $q$. I am also not sure in this case what $z$ should be.

Can anyone offer a hint/guidance as to how to proceed?

Attempt at a Solution:

The moment generating function of this distribution is obtained, as explained in the answer below, by conditioning on $Y=y$. Let's rename the constants $\lambda_{0}=a$ and $\lambda_{1}=b$ for ease of reading:

$M(t)=\int_{0}^{\infty} a e^{-a y}\left(\left(1-e^{-b y}\right) e^{t}+e^{-b y}\right) dy$

The integrand has antiderivative

$F(y)=a\left(\frac{\left(e^{t}-1\right) e^{-(a+b) y}}{a+b}-\frac{e^{t-a y}}{a}\right)$

which we now evaluate at $\infty$ and at zero.

First at infinity:

$I_{\infty} = \lim_{y\to\infty} a\left(\frac{\left(e^{t}-1\right) e^{-(a+b) y}}{a+b}-\frac{e^{t-a y}}{a}\right) = 0 \quad (t<a)$

Then at zero:

$I_{0} = a\left(\frac{\left(e^{t}-1\right)}{a+b}-\frac{e^{t}}{a}\right)$

So:

$M(t) = I_{\infty} - I_{0} = a\left(\frac{e^{t}}{a} - \frac{\left(e^{t}-1\right)}{a+b}\right) = \frac{a+b e^{t}}{a+b}$

However, now I attempt to verify with what I know:

$\frac{\partial M(t)}{\partial t}\rvert_{t=0} = \mu$

$\frac{\partial M(t)}{\partial t} = \frac{be^{t}}{b+a}$

which recovers the expected mean:

$\mu = \frac{b}{b+a}$
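Both the closed form and the mean can be checked numerically against the defining integral (a sketch; $a=2$, $b=3$ are illustrative stand-ins for $\lambda_0$, $\lambda_1$):

```python
import numpy as np
from scipy.integrate import quad

# Check M(t) = a*(e^t/a - (e^t - 1)/(a + b)) against the defining
# integral, plus a finite-difference check of the mean b/(a + b).
a, b = 2.0, 3.0

def M_integral(t):
    def integrand(y):
        return a * np.exp(-a * y) * ((1 - np.exp(-b * y)) * np.exp(t)
                                     + np.exp(-b * y))
    return quad(integrand, 0, np.inf)[0]

def M_closed(t):
    return a * (np.exp(t) / a - (np.exp(t) - 1) / (a + b))

print(M_integral(0.3), M_closed(0.3))          # the two should agree
h = 1e-6
mean = (M_closed(h) - M_closed(-h)) / (2 * h)  # M'(0)
print(mean)                                    # ~ b/(a + b) = 0.6
```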

Best Answer

Maybe we should start by untangling some confusion.

The probability generating function is your function $G(z)$, and it is defined via an expectation: $\mathbb E[z^X]$. However, what you are looking for is the moment generating function, which is similarly defined as follows: $$M_X(t):=\mathbb E[\exp(Xt)]=\int_{\mathbb R}\exp(xt)f_X(x)\,dx$$

We can indeed use the tower property that you mentioned.

We first note that $$M_Z(t)=(1-p+p\, e^{t})^N$$ would hold if $p$ were not random. How do we get rid of the randomness? That is doable via the tower property:

$$M_Z(t)=\mathbb E[exp(Zt)]=\mathbb E[\mathbb E[exp(Zt)|Y]]$$

To compute that, look at the quantity $\mathbb E[\exp(Zt)|Y=y]$. This has the form mentioned above, but with $p$ defined as given. It is a function $g(y)$ of $y$: a number for each fixed $y$. To compute the outer expectation with respect to the given exponential distribution, just integrate $g$ against the exponential density: $$M_Z(t)=\int_{\mathbb R^+}g(y)\,\lambda_0 \exp(-\lambda_0 y)\,dy.$$
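The recipe above can be sketched numerically (a sketch only; $g$ is the Bernoulli conditional MGF since $N=1$, and the parameter values are illustrative, not from the original post):

```python
import numpy as np
from scipy.integrate import quad

# g(y) = E[exp(Zt) | Y = y] is the Bernoulli MGF with p = 1 - exp(-lam1*y);
# integrating g against the Exp(lam0) density gives M_Z(t).  N = 1 here;
# lam0, lam1, t are illustrative values.
lam0, lam1, t = 2.0, 3.0, 0.5

def g(y):
    p = 1.0 - np.exp(-lam1 * y)
    return (1 - p) + p * np.exp(t)   # conditional MGF of Bernoulli(p)

M_Z = quad(lambda y: g(y) * lam0 * np.exp(-lam0 * y), 0, np.inf)[0]
print(M_Z)
```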

If you need more explicit help, feel free to ask. Hope this helps without spoiling the fun ;)

EDIT: I just add my computation of the moment generating function for $N=1$. We keep in mind how the exponential density is defined, and we use that to avoid computing any integrals explicitly.

$$M_Z(t)=\mathbb E_Y[\left(1+(1-e^{-\lambda_1 Y})(e^t-1)\right)]$$ $$=\int_0^\infty\left(1+(1-e^{-\lambda_1 y})(e^t-1)\right)\lambda_0e^{-\lambda_0 y}dy$$ $$=\int_0^\infty\lambda_0e^{-\lambda_0y}dy+(e^t-1)\int_0^\infty(1-e^{-\lambda_1y})\lambda_0e^{-\lambda_0 y}dy$$ $$=1+(e^t-1)\left(1-\frac{\lambda_0}{\lambda_0+\lambda_1}\cdot\int_0^\infty(\lambda_0+\lambda_1)e^{-(\lambda_0+\lambda_1) y}dy\right)$$ $$=1+(e^t-1)\left(1-\frac{\lambda_0}{\lambda_0+\lambda_1}\right) =1+(e^t-1)\cdot\frac{\lambda_1}{\lambda_0+\lambda_1}$$ $$=\frac{\lambda_0}{\lambda_0+\lambda_1}+e^t\frac{\lambda_1}{\lambda_0+\lambda_1}=q+e^t(1-q)$$
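The closed form $q+e^t(1-q)$ can also be checked by simulating $(Y,Z)$ directly (a sketch; parameter values are again illustrative):

```python
import numpy as np

# Monte Carlo check of M_Z(t) = q + e^t * (1 - q), q = lam0/(lam0 + lam1),
# for N = 1.  lam0, lam1, t are illustrative values.
rng = np.random.default_rng(1)
lam0, lam1, t, n = 2.0, 3.0, 0.5, 1_000_000

y = rng.exponential(scale=1.0 / lam0, size=n)                # Y ~ Exp(lam0)
z = (rng.random(n) < 1.0 - np.exp(-lam1 * y)).astype(float)  # Z | Y ~ Bernoulli(p)

q = lam0 / (lam0 + lam1)
print(np.exp(z * t).mean())     # empirical E[exp(Zt)]
print(q + np.exp(t) * (1 - q))  # closed form
```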

For other values of $N$ one is tempted to apply the binomial theorem and conclude $$M_Z(t)=(q+(1-q)e^t)^N,$$ i.e. that $Z$ is binomial again, $Z\sim \text{Bin}(N,1-q)$. A word of caution: this holds if each of the $N$ trials comes with its own independent copy of $Y$. With a single shared $Y$, as in the setup above, $\mathbb E[(1-p+pe^t)^N]\neq \left(\mathbb E[1-p+pe^t]\right)^N$ for $N>1$, and $Z$ is a binomial mixture rather than a binomial. In the $N=1$ case the two readings coincide: randomizing the success parameter of a Bernoulli trial with an exponential variable does not change the type of distribution.

This also explains why the inverse Fourier transform yields something constant: the random variable $Z$ is discrete and thus has no (absolutely continuous) density with respect to Lebesgue measure.
