Computing the joint moment generating function of two random variables

Tags: moment-generating-functions, probability, probability-distributions, probability-theory

Let $P(N = k) = (1 - p)^{k - 1}p$, where $k = 1, 2, 3, \ldots$ and $0 < p < 1$. Let $X_{1}, X_{2}, X_{3}, \ldots$ be a sequence of i.i.d. random variables with common pdf

$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x > 0 \\ 0, & \text{otherwise.} \end{cases}$$

Assume that $N, X_{1}, X_{2}, \ldots$ are independent. Define $Y_{N} = X_{1} + X_{2} + \cdots + X_{N}$. Compute the joint moment generating function of $(Y_{N}, N)$.

So I have

$$M(t, s) = \mathbb{E}[e^{tY_{N} + sN}].$$

By conditioning,

$$\mathbb{E}[e^{tY_{N} + sN}] = \mathbb{E}\bigl[\mathbb{E}[e^{tY_{N}+sN} \mid N]\bigr]$$

$$= \mathbb{E}[e^{sN} \cdot \mathbb{E}[e^{tY_{N}} \mid N]]. $$

Now compute $\mathbb{E}[e^{tY_{N}} \mid N]$ separately. This is the conditional moment generating function of $Y_{N}$ given $N$, where $Y_{N}$ is a sum of $N$ i.i.d. exponential random variables. The moment generating function of a single exponential random variable with parameter $\lambda$ is given by

$$M_{X_{1}}(t) = \frac{\lambda}{\lambda - t}$$

for $t < \lambda$. Then by independence,

$$\mathbb{E}[e^{tY_{N}} \mid N] = \left(\frac{\lambda}{\lambda - t}\right)^{N}.$$
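As a quick sanity check on the single-variable MGF, here is a minimal SymPy sketch (my own, not part of the question) that evaluates $\int_{0}^{\infty} e^{tx}\,\lambda e^{-\lambda x}\,dx$ symbolically; the substitution $t = \lambda - a$ with $a > 0$ is just a device to encode the convergence condition $t < \lambda$.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
lam, a = sp.symbols('lam a', positive=True)
t = lam - a  # enforce t < lam so the integral converges

# MGF of a single Exp(lam) variable:
# E[e^{tX}] = integral of e^{t x} * lam * e^{-lam x} over (0, oo)
mgf = sp.integrate(sp.exp(t * x) * lam * sp.exp(-lam * x), (x, 0, sp.oo))

print(sp.simplify(mgf))                    # lam/a, i.e. lam/(lam - t)
print(sp.simplify(mgf - lam / (lam - t)))  # 0, matching the formula above
```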

So,

$$\mathbb{E}[e^{sN} \cdot \mathbb{E}[e^{tY_{N}} \mid N]] = \mathbb{E}\left[e^{sN} \cdot \left(\frac{\lambda}{\lambda - t}\right)^{N}\right].$$

Now, how can I compute this value? Or am I approaching this incorrectly?

Best Answer

You're basically there. For notational convenience, define $F(s,\lambda,t) = \lambda e^{s}/(\lambda - t)$. Then the quantity to compute is

$$\mathbb{E}[F^{N}] = \sum_{k \geq 1} F^{k} P(N = k) = \frac{p}{1-p} \sum_{k \geq 1} \bigl((1-p)F\bigr)^{k},$$

which is a geometric series. Provided $(1-p)F < 1$, it sums to

$$M(t, s) = \frac{pF}{1 - (1-p)F} = \frac{p \lambda e^{s}}{\lambda - t - (1-p)\lambda e^{s}}.$$
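As a numerical cross-check (my own sketch, not part of the original answer), the closed form can be compared against a Monte Carlo estimate of $\mathbb{E}[e^{tY_{N} + sN}]$. The parameter values below are arbitrary test choices satisfying $(1-p)F < 1$; the sampler uses the fact that, given $N = n$, $Y_{N}$ is a sum of $n$ i.i.d. $\text{Exp}(\lambda)$ variables, i.e. $\text{Gamma}(n, 1/\lambda)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary test point; must satisfy (1 - p) * lam * exp(s) / (lam - t) < 1.
p, lam, t, s = 0.5, 2.0, 0.3, -0.2

# Monte Carlo estimate of E[exp(t*Y_N + s*N)].
n_sims = 10**6
N = rng.geometric(p, size=n_sims)        # geometric on {1, 2, 3, ...}
Y = rng.gamma(shape=N, scale=1.0 / lam)  # Y_N | N=n ~ Gamma(n, 1/lam)
mc = np.mean(np.exp(t * Y + s * N))

# Closed form from the geometric-series evaluation above.
F = lam * np.exp(s) / (lam - t)
closed = p * F / (1 - (1 - p) * F)

print(mc, closed)  # the two values should agree to about 3 decimal places
```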
