[Math] Joint Moment Generating Function from Conditional and Marginal Distribution

conditional-expectation, moment-generating-functions, probability, probability-distributions, random-variables

Suppose that the random variable $N$ follows a Poisson distribution with mean $\lambda = 6$, and that the conditional distribution of the random variable $X$, given $N = n$, is $\operatorname{Binomial}(n, 0.6)$.

Find the Joint Moment Generating Function of $(N, X)$.

Initially I just tried to use the definition. I found the joint PMF from the conditional and marginal distributions, but then I have to sum over both variables to get the joint MGF, and that's where I got stuck: the double sum over the product of the binomial and Poisson PMFs doesn't exactly go over very nicely.

Since this problem does NOT assume independence ($X$ and $N$ are clearly dependent), I can't use that to my advantage either… so now I'm stuck. How can I calculate this joint MGF?

Best Answer

The summations are doable. The joint pmf is $$f_{X, N}(k, n) = \operatorname P(X = k \mid N = n) \operatorname P(N = n) = \binom n k p^k (1 - p)^{n - k} e^{-\lambda} \frac {\lambda^n} {n!}.$$ Sum over $k$ first, using the binomial theorem: $$\sum_{k = 0}^n e^{s k + t n} f_{X, N}(k, n) = e^{-\lambda} \frac {\lambda^n} {n!} e^{t n} \sum_{k = 0}^n \binom n k (p e^s)^k (1 - p)^{n - k} = \\ e^{-\lambda} \frac {\lambda^n} {n!} e^{t n} (p e^s + 1 - p)^n.$$ Then sum over $n$, recognizing the series for the exponential function: $$\operatorname E\left[ e^{s X + t N} \right] = \sum_{n=0}^\infty \sum_{k = 0}^n e^{s k + t n} f_{X, N}(k, n) = \\ e^{-\lambda} \sum_{n=0}^\infty \frac {(\lambda e^t (p e^s + 1 - p))^n} {n!} = \\ \exp( -\lambda + \lambda e^t (p e^s + 1 - p)).$$
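As a sanity check, the closed form can be compared against a Monte Carlo estimate of $\operatorname E[e^{sX + tN}]$. This is just a sketch: $\lambda = 6$ and $p = 0.6$ come from the problem, while the evaluation point $(s, t)$ and the sample size are arbitrary choices (kept small so that $e^{sX + tN}$ has modest variance).

```python
import numpy as np

# Parameters from the problem; (s, t) is an arbitrary test point.
lam, p = 6.0, 0.6
s, t = 0.1, 0.05

rng = np.random.default_rng(0)

# Draw N ~ Poisson(lam), then X | N = n ~ Binomial(n, p).
N = rng.poisson(lam, size=1_000_000)
X = rng.binomial(N, p)

# Monte Carlo estimate of the joint MGF at (s, t).
mc = np.exp(s * X + t * N).mean()

# Closed form derived above: exp(-lam + lam * e^t * (p e^s + 1 - p)).
closed_form = np.exp(-lam + lam * np.exp(t) * (p * np.exp(s) + 1 - p))

print(mc, closed_form)  # the two values should agree to a few decimals
```

Note that marginalizing $X$ (set $s = 0$) recovers the Poisson MGF $\exp(\lambda(e^t - 1))$, and setting $t = 0$ gives the MGF of a Poisson$(\lambda p)$ variable, consistent with Poisson thinning.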
