Finding Joint PDF of Two Non-Independent Continuous Random Variables

probability-distributions · probability-theory · statistics

I'm in the process of reviewing some stats using A First Course in Probability by Sheldon Ross. For the chapter on Joint Distributions, it shows how to obtain the Joint PDF given two independent continuous random variables. However, if the variables weren't independent, how would I go about obtaining the joint PDF of the two variables? Is there a systematic way of going about it similar to when the variables are independent?

So for example, if $f_X(x)$ and $f_Y(y)$ are the PDFs of two independent continuous random variables, I can find their joint PDF $f_{X,Y}(x,y)$ by simply multiplying $f_X(x)$ and $f_Y(y)$. However, how would I find $f_{X,Y}(x,y)$ if $X$ and $Y$ were not independent?

Thanks!

Best Answer

You wouldn't be able to find their joint pdf $f_{X,Y}$ from just their individual pdfs if they are not independent: the marginals alone don't pin down the dependence structure. You would need at least a conditional pdf (or the joint pdf itself) to know more about the relationship between their distributions. The joint pdf is related to the conditional pdfs by $$f_{X\mid Y}(x\mid y)=\frac{f_{X,Y}(x,y)}{f_Y(y)}\qquad\text{or}\qquad f_{Y\mid X}(y\mid x)=\frac{f_{X,Y}(x,y)}{f_X(x)}$$
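As a concrete (hypothetical, not from Ross) illustration of building a joint pdf from a conditional and a marginal: take $Y\sim\mathrm{Uniform}(0,1)$ and, given $Y=y$, let $X\mid Y=y\sim\mathrm{Uniform}(0,y)$. Then $f_{X,Y}(x,y)=f_{X\mid Y}(x\mid y)\,f_Y(y)=\tfrac{1}{y}$ on $0<x<y<1$. A quick numerical sanity check with SciPy:

```python
from scipy import integrate

# Hypothetical example: Y ~ Uniform(0,1), and X | Y=y ~ Uniform(0,y).
# Then f_{X,Y}(x,y) = f_{X|Y}(x|y) * f_Y(y) = (1/y) * 1 on 0 < x < y < 1.
def joint_pdf(x, y):
    return 1.0 / y if 0.0 < x < y < 1.0 else 0.0

# A valid joint pdf must integrate to 1 over its support.
# Outer variable y runs over (0,1); inner variable x over (0, y).
total, _ = integrate.dblquad(joint_pdf, 0, 1, lambda y: 0, lambda y: y)

# The marginal of X is f_X(x) = ∫_x^1 (1/y) dy = -ln(x); check it at x = 0.5.
import math
fx_half, _ = integrate.quad(lambda y: joint_pdf(0.5, y), 0, 1)
```

Here `total` comes out as $1$ and `fx_half` matches $-\ln(0.5)$, confirming the construction.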

If the variables are independent, then $$\frac{f_{X,Y}(x,y)}{f_Y(y)}=f_{X\mid Y}(x\mid y)=f_X(x),$$

which is why, in the independent case, you can directly multiply the marginals together.
