[Math] Prove that random variables are independent

probability

$X_1,X_2,X_3$ are independent random variables with exponential distribution with parameter $\lambda=1$. I'd like to prove that the variables $\frac{X_1}{X_1+X_2}, \frac{X_1+X_2}{X_1+X_2+X_3}, X_1+X_2+X_3$ are independent too. I can calculate the distribution of each of these variables (for example, the density of $\frac{X_1}{X_1+X_2}$ is $1_{[0,1]}(x)$, i.e. it is uniform on $[0,1]$), but I can't really go any further. I know I have to prove that the joint density of $(A,B,C) := \left(\frac{X_1}{X_1+X_2}, \frac{X_1+X_2}{X_1+X_2+X_3}, X_1+X_2+X_3\right)$ is the product of the densities of its coordinates, and surely it is possible to do it with Fubini's theorem:
$P(A<t,\, B<s,\, C<w) = P\big((X_1,X_2,X_3) \in D\big)$ for a suitable $D \subset \mathbb{R}^3$, and then use the independence of $(X_1,X_2,X_3)$, but it gets very messy, and thus I believe there must be a nicer way to do it. Thank you for any hints.
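Before attempting a proof, a quick Monte Carlo sanity check can make the claim and the stated marginals plausible. This is only a sketch (assuming numpy is available), not part of any argument:

```python
# Monte Carlo sanity check of the claimed marginals and (lack of) correlation.
import numpy as np

rng = np.random.default_rng(0)
x1, x2, x3 = rng.exponential(scale=1.0, size=(3, 100_000))  # three iid Exp(1) samples

a = x1 / (x1 + x2)              # claimed Uniform[0, 1]
b = (x1 + x2) / (x1 + x2 + x3)  # claimed density 2v on [0, 1]
c = x1 + x2 + x3                # claimed Gamma(3, 1)

print(np.corrcoef([a, b, c]))        # off-diagonal entries should be near 0
print(a.mean(), b.mean(), c.mean())  # roughly 0.5, 2/3, 3
```

Of course vanishing correlations do not prove independence; they only fail to contradict it.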

Best Answer

Use the Jacobian (change of variables) formula. If $X, Y, Z$ are iid exponential($\lambda$), then we obtain $U,V,W$ by applying the transformation $(x,y,z)\mapsto(u,v,w)$ defined by
$$\begin{align} u&=\frac x{x+y}\\ v&=\frac{x+y}{x+y+z}\\ w&=x+y+z.\end{align}$$
Invert this mapping to obtain
$$\begin{align}x&=uvw\\y&=(1-u)vw\\ z&=(1-v)w.\end{align}$$
Next, compute the Jacobian determinant
$$ J(u,v,w):=\det\frac{\partial(x,y,z)}{\partial(u,v,w)}=\left| \begin{array}{lll} vw&-vw&0\\ uw&(1-u)w&-w\\ uv&(1-u)v&1-v\\ \end{array} \right|=vw^2, $$
so the joint density of $(U,V,W)$ is
$$ f_{U,V,W}(u,v,w)=f_{X,Y,Z}(x,y,z)\left|J(u,v,w)\right|=\lambda^3e^{-\lambda(x+y+z)}\cdot vw^2=\lambda^3 e^{-\lambda w}\cdot vw^2=1\cdot 2v\cdot\frac{\lambda^3}2w^2e^{-\lambda w}, $$
using $x+y+z=w$. This factors into a product of the marginal densities, which proves independence: $U$ is uniform on $[0,1]$ with density $1$, $V$ has density $2v$ on $[0,1]$, and $W>0$ has density $\frac{\lambda^3}2w^2e^{-\lambda w}$. [Specifically, $W$ has the Gamma($k=3,\lambda$) distribution.]
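If you want to double-check the algebra, here is a small symbolic sketch (assuming sympy is available; any computer algebra system would do) that reproduces the Jacobian determinant and the factorization above:

```python
# Symbolic verification of the Jacobian determinant and the density factorization.
import sympy as sp

u, v, w = sp.symbols('u v w', positive=True)
lam = sp.symbols('lambda', positive=True)

# Inverse transformation from the answer: (u, v, w) -> (x, y, z)
x = u * v * w
y = (1 - u) * v * w
z = (1 - v) * w

# Jacobian matrix d(x,y,z)/d(u,v,w) and its determinant
J = sp.Matrix([x, y, z]).jacobian([u, v, w])
print(sp.simplify(J.det()))  # -> v*w**2, matching the answer

# Joint density of (U, V, W) and comparison with 1 * 2v * (lambda^3/2) w^2 e^{-lambda w}
joint = lam**3 * sp.exp(-lam * (x + y + z)) * sp.Abs(J.det())
claimed = 1 * (2 * v) * (lam**3 / 2) * w**2 * sp.exp(-lam * w)
print(sp.simplify(joint - claimed))  # -> 0
```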
