Bayes’ theorem and law of total probability with CDFs

Tags: conditional-probability, cumulative-distribution-functions, gamma-distribution, probability

Suppose $X$ has a $\operatorname{Gamma}(2, \lambda)$ distribution, and the conditional distribution of $Y$ given $X = x$ is uniform on $(0, x).$
Find the joint density function of $X$ and $Y,$ the marginal density function of $Y,$ and the conditional density function of $X$ given $Y = y.$ How would you describe the distribution of $X$ given $Y = y$? Use this to describe the joint distribution of $Y$ and $X - Y.$

So far I am stuck trying to use Bayes' theorem and the law of total probability with CDFs. Would it be right to write something like

$$
P(X\leq x, Y \leq y) = \int_{-\infty}^x P(Y \leq y \mid X \leq k) P(X \leq k) \, dk \text{?}
$$

Best Answer

It is density functions, not c.d.f.s, that it makes sense to integrate. The law of total probability weights by the density of $X$ at a point, $$ P(Y \leq y) = \int_{-\infty}^\infty P(Y \leq y \mid X = k)\, f_X(k) \, dk, $$ not by $P(X \leq k)\,dk.$

You have $$ f_{Y\,\mid \, X=x}(y) = \begin{cases} 1/x & \text{if } 0<y<x, \\ 0 & \text{otherwise.} \end{cases} $$ As a function of $y,$ this is a probability density function, but as a function of $x,$ it is the likelihood function. Bayes's theorem says that if you multiply the likelihood function by the prior probability density function and then normalize, you get the posterior probability density function.
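
In symbols, this is the density form of Bayes's theorem: $$ f_{X\,\mid\,Y=y}(x) = \frac{f_{Y\,\mid\,X=x}(y)\, f_X(x)}{f_Y(y)} \propto f_{Y\,\mid\,X=x}(y)\, f_X(x), $$ where the denominator $f_Y(y) = \int f_{Y\,\mid\,X=x}(y)\, f_X(x)\,dx$ supplies the normalization.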

The prior density is $$ f_X(x) \propto x^{2-1} e^{-\lambda x}. $$ Multiplying, we get \begin{align} f_{X\,\mid\,Y=y} (x) & \propto \begin{cases} \frac 1 x \cdot x^{2-1} e^{-\lambda x} & \text{if } x>y, \\ 0 & \text{if } x<y, \end{cases} \\[10pt] & = \begin{cases} e^{-\lambda x} & \text{if } x>y, \\ 0 & \text{if }x<y. \end{cases} \end{align} (Here, by $\text{“}{\propto}\text{''}$ I mean proportional as a function of $x,$ not as a function of anything else.)
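
For completeness, since the question also asks for the joint density: the fully normalized prior is $f_X(x) = \lambda^2 x e^{-\lambda x}$ for $x>0$ (recall $\Gamma(2) = 1$), so $$ f_{X,Y}(x,y) = f_X(x)\, f_{Y\,\mid\,X=x}(y) = \lambda^2 x e^{-\lambda x} \cdot \frac 1 x = \lambda^2 e^{-\lambda x} \text{ for } 0<y<x, $$ and $0$ otherwise.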

So we have a shifted exponential distribution, defined on the interval $x\in(y,+\infty).$ To get the normalizing constant we integrate: $$ \int_y^\infty e^{-\lambda x} \, dx = \frac{e^{-\lambda y}} \lambda, $$ and so $$ f_{X\,\mid\,Y=y} (x) = \lambda e^{\lambda(y-x)} \text{ for } x>y. $$ From this we can see that the conditional distribution of $X-Y$ given $Y=y$ is just an exponential distribution: $$ \lambda e^{-\lambda t} \, dt \text{ for } t>0. $$ Since the conditional distribution of $X-Y$ given $Y=y$ does not depend on $y,$ we conclude first that $Y$ and $X-Y$ are independent, and second that the marginal distribution of $X-Y$ is the same as its conditional distribution given $Y=y,$ which we just found. Finally, integrating the joint density over $x$ gives the marginal density of $Y$: $$ f_Y(y) = \int_y^\infty \lambda^2 e^{-\lambda x} \, dx = \lambda e^{-\lambda y} \text{ for } y>0, $$ so $Y$ and $X-Y$ are i.i.d. exponential with rate $\lambda.$ This is consistent with $X = Y + (X-Y)$ having a $\operatorname{Gamma}(2,\lambda)$ distribution, since the sum of two independent $\operatorname{Exp}(\lambda)$ random variables is $\operatorname{Gamma}(2,\lambda).$
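
As a quick numerical sanity check, here is a minimal simulation sketch in Python (the value $\lambda = 2$ is purely illustrative; note that numpy's gamma sampler is parameterized by shape and scale, so scale $= 1/\lambda$):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0        # rate lambda; illustrative value, not from the problem
n = 1_000_000

# X ~ Gamma(shape=2, rate=lam); numpy uses scale = 1/rate
x = rng.gamma(shape=2.0, scale=1.0 / lam, size=n)
# Y | X = x ~ Uniform(0, x)
y = rng.uniform(0.0, x)
d = x - y

# If Y and X - Y are i.i.d. Exp(lam), both means should be near 1/lam = 0.5
print("mean Y      :", y.mean())
print("mean X - Y  :", d.mean())
# Independence implies (near-)zero correlation
print("corr(Y, X-Y):", np.corrcoef(y, d)[0, 1])
```

Both printed means should come out near $1/\lambda = 0.5,$ and the correlation near $0.$ The correlation check only rules out linear dependence, of course, but together with the matching exponential means it is a reasonable spot check of the derivation.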
