[Math] Find the conditional probability density function $f_{Y|X}(y,x)$

probability, probability-distributions, probability-theory

Assume $Z$ is a random variable with $Z=X+Y$, where $Z$ depends on $X$ but is independent of $Y$.
We know the value of $X$.
Also assume we know the joint pdf $f_{X,Z}(x,z)$. Find $f_{Y|X}(y,x)$.

Can you give me a hint or solution?
I think we should apply Bayes' rule,
$$f_{Y|X}(y,x)=\frac{f_{Y,X}(y,x)}{f_X(x)},$$
together with $$f_{X,Z}(x,z)=f_{X,X+Y}(x,x+y).$$

Best Answer

The conditional distribution of $Y$ given $X$ is $$f_{Y|X}(x,y)=\frac{f_{X,Y}(x,y)}{f_X(x)}.$$

You should be able to find the marginal distribution of $X$ by integrating the joint distribution of $X$ and $Z$: $$f_X(x)=\int_{-\infty}^\infty f_{X,Z}(x,z)dz.$$
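As a quick numerical sanity check of the marginalization step, here is a sketch that assumes a concrete joint not given in the question: $X$ and $Y$ independent standard normals, so $f_{X,Z}(x,z)=\varphi(x)\varphi(z-x)$. Integrating out $z$ should recover $f_X(x)=\varphi(x)$:

```python
import math

def phi(t):
    # standard normal pdf
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

# Hypothetical joint: X, Y iid N(0,1) and Z = X + Y,
# so f_{X,Z}(x,z) = phi(x) * phi(z - x).
def f_XZ(x, z):
    return phi(x) * phi(z - x)

def f_X(x, lo=-10.0, hi=10.0, n=20000):
    # trapezoid rule approximating the integral of f_{X,Z}(x, z) over z
    h = (hi - lo) / n
    s = 0.5 * (f_XZ(x, lo) + f_XZ(x, hi))
    for i in range(1, n):
        s += f_XZ(x, lo + i * h)
    return s * h

print(f_X(0.7), phi(0.7))  # nearly equal
```

The integration limits and grid size are arbitrary; the point is only that the marginal obtained from $f_{X,Z}$ matches the known $\varphi(x)$.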

Also, as noted in the comment by Graham, $f_{X,Y}(x,y)=f_{X,Z}(x,x+y)$. So $$f_{Y|X}(x,y)=\frac{f_{X,Y}(x,y)}{f_X(x)}=\frac{f_{X,Z}(x,x+y)}{f_X(x)}.$$
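To check this formula end to end, we can reuse the same toy assumption ($X$, $Y$ iid standard normal, not given in the question): then $f_{X,Z}(x,x+y)/f_X(x)$ should reduce to $\varphi(y)$, since $Y$ is independent of $X$:

```python
import math

def phi(t):
    # standard normal pdf
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

# Hypothetical joint: X, Y iid N(0,1), Z = X + Y
def f_XZ(x, z):
    return phi(x) * phi(z - x)

def f_X(x):
    # marginal of X, known in closed form for this toy example
    return phi(x)

def f_Y_given_X(y, x):
    # the answer's formula: f_{Y|X}(y|x) = f_{X,Z}(x, x+y) / f_X(x)
    return f_XZ(x, x + y) / f_X(x)

print(f_Y_given_X(1.2, -0.3), phi(1.2))  # agree: here Y is independent of X
```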

In general, consider bivariate functions of random variables: $Z_1=h_1(X,Y)$ and $Z_2=h_2(X,Y)$, and assume we know the joint p.d.f. of $X$ and $Y$, $f_{X,Y}(x,y)$. As long as we can find an inverse transform $X=u(Z_1,Z_2)$ and $Y=v(Z_1,Z_2)$, the joint p.d.f. of $Z_1$ and $Z_2$ is $$f_{Z_1,Z_2}(z_1,z_2)=f_{X,Y}\bigl(u(z_1,z_2),v(z_1,z_2)\bigr) \left|\det \begin{pmatrix} \frac{\partial u}{\partial z_1} & \frac{\partial u}{\partial z_2}\\ \frac{\partial v}{\partial z_1} & \frac{\partial v}{\partial z_2} \end{pmatrix} \right|,$$ where the factor is the absolute value of the Jacobian determinant.

For your example: $Z_1=X$ and $Z_2=X+Y$, thus $u(z_1,z_2)=z_1$ and $v(z_1,z_2)=z_2-z_1$. The Jacobian determinant needed is $$ \left|\det \begin{pmatrix} 1 & 0\\ -1 & 1 \end{pmatrix} \right|=1. $$ So $f_{X,Z}(x,z)=f_{X,Y}(x,z-x)$, or equivalently $f_{X,Z}(x,x+y)=f_{X,Y}(x,y)$.
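The Jacobian computation above can be verified numerically with central finite differences on the inverse transform $u(z_1,z_2)=z_1$, $v(z_1,z_2)=z_2-z_1$; the determinant should come out as $1$ at any point (the evaluation point below is arbitrary):

```python
# Inverse transform for Z1 = X, Z2 = X + Y:
def u(z1, z2):
    return z1

def v(z1, z2):
    return z2 - z1

h = 1e-6          # finite-difference step
z1, z2 = 0.4, 1.1  # arbitrary evaluation point

# central differences for the four partial derivatives
du_dz1 = (u(z1 + h, z2) - u(z1 - h, z2)) / (2 * h)
du_dz2 = (u(z1, z2 + h) - u(z1, z2 - h)) / (2 * h)
dv_dz1 = (v(z1 + h, z2) - v(z1 - h, z2)) / (2 * h)
dv_dz2 = (v(z1, z2 + h) - v(z1, z2 - h)) / (2 * h)

det = du_dz1 * dv_dz2 - du_dz2 * dv_dz1
print(det)  # 1, matching the hand computation
```

Since the transform is linear, the finite-difference estimate is exact up to rounding, which is why the determinant is constant in $(z_1, z_2)$.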