[Math] Finding a joint probability density function given marginal probability density functions


X and Y are independent random variables with the following PDFs:

$f_{X}(x) = \begin{cases}
(1/3)e^{-x/3}, & x \ge 0 \\
0, & \text{otherwise}
\end{cases}$

$f_{Y}(y) = \begin{cases}
(1/2)e^{-y/2}, & y \ge 0 \\
0, & \text{otherwise}
\end{cases}$

a) What is $P[X > Y]$?
b) What is $E[XY]$?
c) What is $\text{Cov}[X,Y]$?

Attempt at Solution:
I know the formulas for the expected value and the covariance of $X$ and $Y$, but the problem lies in finding their joint PDF. I understand that the marginal PDFs, which are given here, are obtained by integrating the joint PDF from $-\infty$ to $\infty$ with respect to the other variable. This problem, however, goes the other way. My initial instinct was to differentiate each marginal PDF with respect to the other variable, but for that to work both derivatives would have to agree, which is not the case.

Beyond this issue, I am also not sure how to solve for $P[X > Y]$.

Best Answer

Hints:

$P\left\{ X>Y\right\} =\int_{0}^{\infty} f_{Y}\left(y\right)P\left\{ X>Y\mid Y=y\right\} \,dy=\int_{0}^{\infty} f_{Y}\left(y\right)P\left\{ X>y\right\} \,dy$

Alternative:

$\int_{0}^{\infty}\int_{y}^{\infty}f_{X,Y}\left(x,y\right)\,dx\,dy=\int_{0}^{\infty}\int_{y}^{\infty}f_{X}\left(x\right)f_{Y}\left(y\right)\,dx\,dy=\int_{0}^{\infty}f_{Y}\left(y\right)\left(\int_{y}^{\infty}f_{X}\left(x\right)\,dx\right)dy$

Note that the inner integral $\int_{y}^{\infty}f_{X}\left(x\right)\,dx$ is precisely $P\left\{ X>y\right\}$.
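Carrying this through with the given exponential densities: for $y \ge 0$, $P\left\{ X>y\right\} = \int_{y}^{\infty}\tfrac{1}{3}e^{-x/3}\,dx = e^{-y/3}$, so

$P\left\{ X>Y\right\} = \int_{0}^{\infty}\tfrac{1}{2}e^{-y/2}\,e^{-y/3}\,dy = \tfrac{1}{2}\int_{0}^{\infty}e^{-5y/6}\,dy = \tfrac{1}{2}\cdot\tfrac{6}{5} = \tfrac{3}{5}$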

Since $X$ and $Y$ are independent, the joint PDF of $(X,Y)$ is simply the product of the marginal PDFs of $X$ and $Y$: $f_{X,Y}(x,y) = f_X(x)\,f_Y(y)$.
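For parts (b) and (c), independence also gives $E[XY] = E[X]E[Y] = 3 \cdot 2 = 6$ and hence $\text{Cov}[X,Y] = E[XY] - E[X]E[Y] = 0$. If you want a sanity check on all three answers, here is a small Monte Carlo sketch (my own addition, using NumPy with an arbitrarily chosen sample size and seed):

```python
import numpy as np

# X ~ Exponential(mean 3), Y ~ Exponential(mean 2), independent
rng = np.random.default_rng(0)
n = 1_000_000  # arbitrary simulation size
x = rng.exponential(scale=3.0, size=n)
y = rng.exponential(scale=2.0, size=n)

p_x_gt_y = np.mean(x > y)            # expect ~ 3/5 = 0.6
e_xy = np.mean(x * y)                # expect ~ E[X]E[Y] = 6
cov_xy = e_xy - x.mean() * y.mean()  # expect ~ 0

print(p_x_gt_y, e_xy, cov_xy)
```

The estimates fluctuate with the sample size, but at $n = 10^6$ they land close to the exact values above.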
