[Math] how to apply change of variable formula to find joint distribution of 2 random variables given the joint distribution of 3 random variables

inequality, integration, probability

[Image: problem statement]

I have trouble finding the joint distribution of $X_1,X_2$.

I know I need to use the change of variables formula, and the first step is to express $Y_1,Y_2,Y_3$ in terms of $X_1,X_2,X_3$, where $X_3$ is an extra random variable I needed to add, defined by $X_3=Y_1$.

So, solving simultaneously, I get:
$$ y_1=x_3$$
$$ y_2=\frac{x_3x_2-x_3x_1}{x_1}$$
$$ y_3 = \frac{x_3-x_3x_2}{x_1}$$
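As a sanity check (not part of the original workings), these inverse formulas can be verified numerically: pick any positive $y_1,y_2,y_3$, form $x_1=y_1/s$, $x_2=(y_1+y_2)/s$, $x_3=y_1$ with $s=y_1+y_2+y_3$, and confirm the formulas recover $y_1,y_2,y_3$. A quick sketch in Python:

```python
import math

# Arbitrary positive sample values for the original variables.
y1, y2, y3 = 0.7, 1.3, 2.1
s = y1 + y2 + y3

# Forward transformation: x1 = y1/s, x2 = (y1+y2)/s, x3 = y1.
x1, x2, x3 = y1 / s, (y1 + y2) / s, y1

# The inverse formulas derived above should recover y1, y2, y3.
assert math.isclose(x3, y1)
assert math.isclose((x3 * x2 - x3 * x1) / x1, y2)
assert math.isclose((x3 - x3 * x2) / x1, y3)
print("inverse transformation verified")
```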

But when I compare this with the given solution:
[Image: provided solution]

It looks like I am close to the provided solution, but I don't know what else I can do to get the exact solution; I doubt that I made any mistakes in my workings. Please help.

Also, what argument(s) were used to determine the support of $f_{X_1,X_2,X_3}$? I know from my workings, where $x_3=y_1$, that I can let $x_3>0$ because $Y_1$ follows an exponential distribution. But I am not sure how to argue for $x_1,x_2$, because they are functions of 3 variables and this is the first time I am seeing this kind of question.

EDIT:

The rest of the solution is as follows:
[Image: rest of the solution]

EDIT 2:
If I start by assuming that $\lambda>0$:

The integral I need to evaluate will be:

$\int\limits_0^\infty \lambda^3 e^{-\lambda s}s^2 \,ds = \lambda^3 \int\limits_0^\infty e^{-\lambda s}s^2 \,ds \tag{1}$

So,

$$
\begin{align}
\int e^{-\lambda s}s^2 \,ds & = \frac{-s^2e^{-\lambda s}}{\lambda}+\frac{2}{\lambda}\int s\, e^{-\lambda s} \,ds \text{ (integration by parts)} \\
& = \frac{-s^2e^{-\lambda s}}{\lambda} -\frac{2se^{-\lambda s}}{\lambda^2} - \frac{2e^{-\lambda s}}{\lambda^3} \text{ (integration by parts again, then simplify)} \tag{2}
\end{align}
$$

Evaluating (2) from $0$ to $\infty$ gives:
$$ \begin{align}
\int\limits_0^\infty e^{-\lambda s}s^2 \,ds & = [0-0] - [0-0] - \left[0-\frac{2}{\lambda^3} \right] \\
&= \frac{2}{\lambda^3}, \tag{3}
\end{align} $$

Putting (3) and (1) together gives:
$$ \begin{align}
\int\limits_0^\infty \lambda^3 e^{-\lambda s}s^2 \,ds & = \lambda^3 \int\limits_0^\infty e^{-\lambda s}s^2 \,ds \\
& = \lambda^3 \frac{2}{\lambda^3} \\
& = 2
\end{align}$$

Which gives the desired result, independent of the value of $\lambda$.
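The claim that $\int_0^\infty \lambda^3 e^{-\lambda s}s^2\,ds = 2$ for every $\lambda>0$ is easy to confirm numerically. A quick sketch (not part of the original post) using a composite trapezoid rule from the standard library:

```python
import math

def integral(lam, upper=60.0, n=200_000):
    """Trapezoid approximation of ∫_0^upper lam^3 * exp(-lam*s) * s^2 ds.

    The integrand decays like exp(-lam*s), so truncating at `upper`
    introduces only a negligible tail error for the rates tested below.
    """
    h = upper / n
    f = lambda s: lam ** 3 * math.exp(-lam * s) * s * s
    total = 0.5 * (f(0.0) + f(upper))
    total += sum(f(i * h) for i in range(1, n))
    return total * h

# The value should be 2 regardless of the rate lambda.
for lam in (0.5, 1.0, 2.0):
    val = integral(lam)
    assert abs(val - 2.0) < 1e-3, (lam, val)
print("integral equals 2 for every tested lambda")
```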

Best Answer

Suppose that $Y_1, Y_2, Y_3$ are independent with $Y_i\sim \mathrm{Exp}(\lambda=1)$.

We denote by $S$ the sum $Y_1+Y_2+Y_3$.

The joint pdf of the random variables $Y_1, Y_2, S$ is: $$\begin{array}{l}f_{Y_1, Y_2, S}(y_1, y_2, s)=f_{Y_1, Y_2}(y_1,y_2)f_{S|Y_1=y_1, Y_2=y_2}(s)=\\ =e^{-y_1}e^{-y_2}f_{Y_3}(s-y_1-y_2)=e^{-y_1-y_2}e^{-s+y_1+y_2}=e^{-s}\end{array}$$ for $y_1\geq 0, y_2\geq 0, s\geq y_1+y_2$. Here $f_{S|Y_1=y_1, Y_2=y_2}$ stands for the conditional pdf of $S$ given $Y_1=y_1$, $Y_2=y_2$, which amounts to the pdf of $Y_3$ evaluated at $s-y_1-y_2$.
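One can check that $e^{-s}$ really is a density on this support: integrating out $s$ from $y_1+y_2$ to $\infty$ gives $e^{-(y_1+y_2)}$, and the remaining double integral over $y_1,y_2\geq 0$ should equal $1$. A numerical sketch (my addition, using a midpoint rule with the tail beyond $25$ truncated, which costs only about $e^{-25}$):

```python
import math

def double_integral(upper=25.0, n=500):
    """Midpoint-rule approximation of ∫∫ exp(-(y1+y2)) dy1 dy2
    over [0, upper]^2 — the result of integrating exp(-s) over
    s in [y1+y2, ∞) first."""
    h = upper / n
    total = 0.0
    for i in range(n):
        y1 = (i + 0.5) * h
        for j in range(n):
            y2 = (j + 0.5) * h
            total += math.exp(-(y1 + y2))
    return total * h * h

mass = double_integral()
assert abs(mass - 1.0) < 1e-3, mass
print("total probability mass:", round(mass, 6))
```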

In order to get the joint distribution of the random variables $X_1=\displaystyle\frac{Y_1}{S}, X_2=\displaystyle\frac{Y_1+Y_2}{S}, S$ we consider the transformation $\psi: (y_1, y_2, s)\mapsto (x_1, x_2, s)$, with: $$ x_1=y_1/s,\quad x_2=(y_1+y_2)/s, \quad s=s$$

Hence $\psi^{-1}(x_1, x_2, s)=(y_1, y_2, s)$ is defined by:

$$y_1=sx_1, \quad y_2=-sx_1+sx_2, \quad s=s$$ and its Jacobian is:

$$\left|\begin{array}{rrr}s&0&x_1\\ -s&s &x_2-x_1\\0&0&1\end{array}\right|=s^2$$
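The cofactor expansion is short enough to check by machine; here is a small verification (my addition) that the determinant of this matrix is $s^2$ at a sample point:

```python
# Verify the Jacobian of psi^{-1}: y1 = s*x1, y2 = s*(x2 - x1), s = s.
x1, x2, s = 0.2, 0.6, 3.0

# Rows: partial derivatives of (y1, y2, s) with respect to (x1, x2, s).
J = [
    [s,   0.0, x1],
    [-s,  s,   x2 - x1],
    [0.0, 0.0, 1.0],
]

def det3(m):
    """Cofactor expansion along the first row of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

assert abs(det3(J) - s ** 2) < 1e-12
print("Jacobian determinant equals s^2 =", s ** 2)
```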

Thus the joint distribution of the variables $X_1, X_2, S$ is: $g(x_1, x_2, s)=e^{-s}s^2$, for $sx_1, sx_2-sx_1\geq 0$ and $s\geq sx_1+sx_2-sx_1$. But these conditions amount to: $x_1, x_2\geq 0$, $x_1\leq x_2\leq 1$.

To get the joint density, $h_{X_1, X_2}(x_1, x_2)$, of the random variables $X_1,X_2$ we integrate the density $g(x_1, x_2, s)$ with respect to $s$: $$ h_{X_1, X_2}(x_1, x_2)=\int_0^\infty s^2 e^{-s}\,ds=2!=2, \quad \text{for } 0\leq x_1\leq x_2\leq 1.$$
But this is just the joint distribution of the order statistics, as you stated in your question.
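A Monte Carlo sanity check of this conclusion (a sketch of my own, not part of the answer): if $(X_1,X_2)$ is uniform with density $2$ on the triangle $0\leq x_1\leq x_2\leq 1$, then for instance $P(X_2\leq \tfrac12) = 2\cdot\operatorname{area}\{0\leq x_1\leq x_2\leq \tfrac12\} = 2\cdot\tfrac18 = \tfrac14$.

```python
import random

random.seed(0)
n = 200_000
hits = 0
for _ in range(n):
    # Three i.i.d. Exp(1) draws.
    y1, y2, y3 = (random.expovariate(1.0) for _ in range(3))
    s = y1 + y2 + y3
    x1, x2 = y1 / s, (y1 + y2) / s
    # The pair must land in the triangle 0 <= x1 <= x2 <= 1.
    assert 0.0 <= x1 <= x2 <= 1.0
    if x2 <= 0.5:
        hits += 1

p_hat = hits / n
# Theoretical value is 1/4; Monte Carlo error here is ~0.001.
assert abs(p_hat - 0.25) < 0.01, p_hat
print("P(X2 <= 1/2) estimate:", p_hat)
```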

If you start with $Y_k\sim \mathrm{Exp}(\lambda)$, $k=1,2,3$, with $\lambda>0$ arbitrary, you reach the same conclusion: the joint distribution of the variables $X_1, X_2$ coincides with that of the order statistics $U_{(1)}, U_{(2)}$, where $U_1, U_2\sim \mathrm{Unif}[0,1]$, for every $\lambda>0$, since the factor $\lambda^3$ in the joint density cancels when integrating over $s$, exactly as your Edit 2 shows.
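The $\lambda$-invariance also has a direct probabilistic reason: $X_1$ and $X_2$ are ratios, hence scale-invariant functions of $(Y_1,Y_2,Y_3)$, and changing $\lambda$ only rescales the exponentials. A small empirical comparison (my addition, reusing the event $\{X_2\leq \tfrac12\}$ with probability $\tfrac14$):

```python
import random

def estimate(lam, n=200_000, seed=1):
    """Monte Carlo estimate of P(X2 <= 1/2) when Y_i ~ Exp(lam) i.i.d."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        y1, y2, y3 = (rng.expovariate(lam) for _ in range(3))
        s = y1 + y2 + y3
        if (y1 + y2) / s <= 0.5:
            hits += 1
    return hits / n

p1, p3 = estimate(1.0), estimate(3.0)
# Both rates should give the same answer, 1/4.
assert abs(p1 - 0.25) < 0.01 and abs(p3 - 0.25) < 0.01, (p1, p3)
print("lambda = 1:", p1, " lambda = 3:", p3)
```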
