Use of a Joint Moment Generating Function: Do I have this right?

probability distributions

Below is a problem I did. I am confident that I got part a right. I am not
confident that I got part b right. In particular, I am thinking that my use of
the partial derivative symbol may not be right. I am hoping that somebody can
check my work.
Thanks
Bob
Problem:
Let $(X,Y)$ be a continuous bivariate r.v. with joint pdf
\begin{eqnarray*}
f_{XY}(x,y) &=& \begin{cases}
e^{-(x+y)} & x > 0 , y > 0 \\
0 & \text{otherwise} \\
\end{cases} \\
\end{eqnarray*}
(a) Find the joint moment generating function of $X$ and $Y$.
(b) Find the joint moments $m_{10}$, $m_{01}$ and $m_{11}$.
Answer: (a)
\begin{eqnarray*}
M_{XY} &=& E(e^{t_1 X + t_2 Y}) \\
M_{XY} &=&
\int_{0}^{\infty} \int_{0}^{\infty} (e^{t_1 x + t_2 y})e^{-(x+y)}
\, dy \, dx \\
M_{XY} &=&
\int_{0}^{\infty} \int_{0}^{\infty} e^{t_1 x - x + t_2 y - y} \, dy \, dx \\
M_{XY} &=& \int_{0}^{\infty} \frac{e^{t_1 x - x + t_2 y - y}}{t_2 - 1} \,
\Big|_{y = 0}^{y = \infty} dx \\
M_{XY} &=& \int_{0}^{\infty} 0 - \frac{e^{t_1 x - x}}{t_2 - 1} \, dx \\
M_{XY} &=& \int_{0}^{\infty} \frac{e^{t_1 x - x}}{1 - t_2} \, dx \\
M_{XY} &=& \frac{e^{t_1 x - x}}{(t_1 - 1)(1 - t_2)} \Big|_{0}^{\infty} \\
M_{XY} &=& 0 - \frac{1}{(t_1 - 1)(1 - t_2)} \\
M_{XY} &=& \frac{1}{(t_1 - 1)(t_2 - 1)} \\
M_{XY} &=& \frac{1}{(1 - t_1)(1 - t_2)} \\
\end{eqnarray*}
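As a sanity check on the double integral, here is a short SymPy sketch (SymPy is assumed to be available; `conds='none'` suppresses the convergence conditions on $t_1$ and $t_2$, which hold wherever the integral exists):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
t1, t2 = sp.symbols('t1 t2')

# M_XY(t1, t2) = E[e^{t1 X + t2 Y}] as a double integral over the first quadrant.
integrand = sp.exp(t1 * x + t2 * y) * sp.exp(-(x + y))
# conds='none' drops the convergence conditions and returns only the value.
M = sp.integrate(integrand, (y, 0, sp.oo), (x, 0, sp.oo), conds='none')

# The difference from 1 / ((1 - t1)(1 - t2)) should simplify to 0.
print(sp.simplify(M - 1 / ((1 - t1) * (1 - t2))))
```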
Part (b)
\begin{eqnarray*}
M_{XY} &=& (t_1 - 1)^{-1} (t_2 - 1)^{-1}\\
m_{10} &=& \frac{\partial}{\partial t_1} M_{XY}(0,0) \\
\frac{\partial}{\partial t_1} M_{XY} &=& -(t_1-1)^{-2}(t_2-1)^{-1} \\
\frac{\partial}{\partial t_1} M_{XY}(0,0) &=& -(0-1)^{-2}(0-1)^{-1} = -(1)(-1) \\
m_{10} &=& 1 \\
m_{01} &=& \frac{\partial}{\partial t_2} M_{XY}(0,0) \\
\frac{\partial}{\partial t_2} M_{XY} &=& -(t_1-1)^{-1}(t_2-1)^{-2} \\
\frac{\partial}{\partial t_2} M_{XY}(0,0) &=& -(0-1)^{-1}(0-1)^{-2} = -(-1)(1) \\
\frac{\partial}{\partial t_2} M_{XY}(0,0) &=& 1 \\
m_{01} &=& 1 \\
m_{11} &=& \frac{\partial^2}{\partial t_1 \partial t_2} M_{XY}(0,0) \\
\frac{\partial^2}{\partial t_1 \partial t_2} M_{XY} &=&
(t_1-1)^{-2} (t_2-1)^{-2} \\
\frac{\partial^2}{\partial t_1 \partial t_2} M_{XY}(0,0) &=&
(0-1)^{-2} (0-1)^{-2} = 1(1) = 1 \\
m_{11} &=& 1 \\
\end{eqnarray*}
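The moment calculations can also be double-checked with SymPy: differentiate the MGF first, then substitute $(t_1, t_2) = (0, 0)$, exactly as in the work above (SymPy is assumed to be available):

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
M = 1 / ((1 - t1) * (1 - t2))  # joint MGF from part (a)

# Differentiate first, then plug in (t1, t2) = (0, 0).
m10 = sp.diff(M, t1).subs({t1: 0, t2: 0})
m01 = sp.diff(M, t2).subs({t1: 0, t2: 0})
m11 = sp.diff(M, t1, t2).subs({t1: 0, t2: 0})
print(m10, m01, m11)  # 1 1 1
```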

Best Answer

Some feedback:

I haven't looked at your work in extreme detail, but it looks mostly good.

I would prefer that you use $f_{X, Y}$ rather than $f_{XY}$ to make it clear that it is the joint density, rather than the density of the product $XY$.

I know that some probability textbooks use the notation $$\dfrac{\partial}{\partial t_1}M_{X,Y}(0, 0)$$ but I prefer $$\left.\dfrac{\partial}{\partial t_1}M_{X,Y}(t_1, t_2)\right|_{(t_1, t_2) = (0, 0)}$$ because this notation makes it more clear that the $(0, 0)$ needs to be plugged in after the partial derivative is calculated.


Besides the notational stuff, your work does look good. But there's a much shorter way to do this.

Notice that $$f_{X, Y}(x, y) = e^{-x}e^{-y} = f_{X}(x)f_{Y}(y)$$ for $x, y > 0$, so by this factorization, it follows that $X$ and $Y$ must be independent exponential random variables with mean $1$.

Thus, it follows that $e^{t_1X}$ and $e^{t_2Y}$ are independent, and you may use the fact that expectation is multiplicative when random variables are independent: $$M_{X, Y}(t_1, t_2) = \mathbb{E}[e^{t_1X+t_2Y}]=\mathbb{E}[e^{t_1X}e^{t_2Y}]=\mathbb{E}[e^{t_1X}]\mathbb{E}[e^{t_2Y}]=\dfrac{1}{(1-t_1)(1-t_2)}$$ (assuming you already know the MGF of an exponential distribution) for $t_1, t_2$ chosen appropriately (which I will leave for you to find).

It follows easily that $m_{10} = m_{01} = 1$, and since we've established that $X$ and $Y$ are independent, $$m_{11} = \mathbb{E}[XY]=\mathbb{E}[X]\mathbb{E}[Y]=1 \cdot 1 = 1\text{.}$$
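If you want an empirical sanity check of those moments, a quick Monte Carlo sketch works too (NumPy is assumed to be available; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# X and Y are independent Exponential(1) draws, matching the factored joint density.
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=1.0, size=n)

# Sample estimates of m10 = E[X], m01 = E[Y], and m11 = E[XY]; all should be close to 1.
print(x.mean(), y.mean(), (x * y).mean())
```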
