Probability – Proving Independence of Two Random Variables with F-Distribution

Tags: f-distribution, independence, probability, self-study

The following is an exercise problem from Hogg, McKean and Craig's book titled "Introduction to Mathematical Statistics" that I am working on.

3.6.16. Let $X_1$, $X_2$, and $X_3$ be three independent chi-square variables with $r_1$, $r_2$, and $r_3$ degrees of freedom, respectively.

(a) Show that $Y_1 = X_1/X_2$ and $Y_2 = X_1 + X_2$ are independent and that $Y_2$ is
$\chi^2(r_1 + r_2)$.

(b) Deduce that
$$\cfrac{X_1 /r_1}{X_2/r_2} \ \ \textrm{ and } \ \ \cfrac{X_3 /r_3}{(X_1 + X_2)/(r_1 + r_2)}$$ are independent F-variables.

My attempt:

(a) The inverse of the transformations is given by $X_1 = Y_1Y_2/(1+Y_1)$ and $X_2=Y_2/(1+Y_1)$ and the Jacobian is the determinant of
$$ \begin{bmatrix} \frac{\partial x_1}{\partial y_1} & \frac{\partial x_1}{\partial y_2} \\ \frac{\partial x_2}{\partial y_1} & \frac{\partial x_2}{\partial y_2} \end{bmatrix} = \begin{bmatrix} y_2/(1+y_1)^2 & y_1/(1+y_1) \\ -y_2/(1+y_1)^2 & 1/(1+y_1) \end{bmatrix} $$ which ends up being $y_2/(1+y_1)^2$. The joint pdf of $Y_1$ and $Y_2$ is then given by

\begin{align}f_{Y_1,Y_2}(y_1,y_2) &= \cfrac{y_2}{(1+y_1)^2} \left( \cfrac{y_1y_2}{1+y_1} \right)^{\left(\frac{r_1}{2}-1\right)}\cfrac{e^{-\left(\frac{y_1y_2}{2(1+y_1)}\right)}}{2^{\frac{r_1}{2}}\Gamma(\frac{r_1}{2})}\left( \cfrac{y_2}{1+y_1} \right)^{\left(\frac{r_2}{2}-1\right)}\cfrac{e^{-\left(\frac{y_2}{2(1+y_1)}\right)}}{2^{\frac{r_2}{2}}\Gamma(\frac{r_2}{2})} \\ &= \cfrac{1}{2^{\left(\frac{r_1+r_2}{2}\right)}\Gamma(\frac{r_1}{2})\Gamma(\frac{r_2}{2})} \times
\left[ \frac{y_1^{\left( \frac{r_1}{2}-1\right)}}{(1+y_1)^{\left(\frac{r_1+r_2}{2} \right)}}\right]
\times\left[y_2^{\left(\frac{r_1+r_2}{2}-1\right)}e^{-\left(\frac{y_2}{2}\right)}\right].\end{align}

With that, the pdf nicely splits into a product of a function of $y_1$ alone and a function of $y_2$ alone, letting us conclude that $Y_1$ and $Y_2$ are independent. (We need to be careful about the support before concluding this, but since the support is the positive reals for both variables, there is no problem on that front.) Moreover, the factor in $y_2$ is, up to a normalizing constant, the $\chi^2(r_1+r_2)$ density, so $Y_2 \sim \chi^2(r_1 + r_2)$.
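(As a quick numerical sanity check of part (a), not part of the proof, here is a small Monte Carlo simulation; the degrees of freedom below are arbitrary choices on my part.)

```python
# Quick Monte Carlo sanity check of part (a) -- illustrative only, not a proof.
# r1, r2 are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r1, r2, n = 3, 5, 200_000

x1 = rng.chisquare(r1, size=n)
x2 = rng.chisquare(r2, size=n)
y1, y2 = x1 / x2, x1 + x2

# Y2 should pass a goodness-of-fit test against chi-square(r1 + r2).
print(stats.kstest(y2, stats.chi2(r1 + r2).cdf))

# Crude independence check: P(Y1 <= a, Y2 <= b) vs P(Y1 <= a) * P(Y2 <= b).
a, b = np.median(y1), np.median(y2)
print(np.mean((y1 <= a) & (y2 <= b)), np.mean(y1 <= a) * np.mean(y2 <= b))
```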

(b) Let $Y_3 = X_3/(X_1+X_2)$. Since the two ratios in part (b) are just constant multiples of $Y_1$ and $Y_3$, it suffices to show that $Y_1$ and $Y_3$ are independent. I will spare the details, but taking an approach similar to the one above, we get the Jacobian to be $y_2^2/(1+y_1)^2$ and the joint pdf to be

\begin{align}f_{Y_1,Y_2,Y_3}(y_1,y_2,y_3) &= \cfrac{1}{2^{\left(\frac{r_1+r_2+r_3}{2}\right)}\Gamma(\frac{r_1}{2})\Gamma(\frac{r_2}{2})\Gamma(\frac{r_3}{2})} \times
\left[ \frac{y_1^{\left( \frac{r_1}{2}-1\right)}y_3^{\left(\frac{r_3}{2}-1\right)}}{(1+y_1)^{\left(\frac{r_1+r_2}{2} \right)}}\right]\times
\left[y_2^{\left(\frac{r_1+r_2+r_3}{2}-1\right)}e^{-\left(\frac{y_2(1+y_3)}{2}\right)}\right].\end{align}

Clearly, $Y_1$, $Y_2$, and $Y_3$ are not jointly independent. However, when we integrate out $y_2$ over its support to obtain the marginal pdf $f_{Y_1,Y_3}(y_1,y_3)$, the expression in the second pair of square brackets integrates to a Gamma function value times a power of $2/(1+y_3)$ (written out below), so the marginal pdf factors into a function of $y_1$ alone times a function of $y_3$ alone, which lets us conclude that $Y_1$ and $Y_3$ are independent.
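Writing out that integral explicitly (a routine Gamma integral, just to make the factorization concrete): with $a = \frac{r_1+r_2+r_3}{2}$,
$$\int_0^\infty y_2^{a-1} e^{-\frac{y_2(1+y_3)}{2}}\, dy_2 = \Gamma(a)\left(\frac{2}{1+y_3}\right)^{a},$$
which depends on $y_3$ alone, so $f_{Y_1,Y_3}(y_1,y_3)$ is indeed a function of $y_1$ alone times a function of $y_3$ alone.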

This still lacks details but I think I have explained my main thought process above. Here are my questions to you.

  1. Do you find any error in this thought process?

  2. More importantly, is there a more elegant/sleek solution to part (b) that uses part (a) and some other result? Please note that $Y_1$, $Y_2$, and $Y_3$ are not mutually independent. If they were, part (b) would have been quite trivial.

I am asking this question mainly because part (b) begins with "Deduce that …", which makes me feel there might be a one-liner that shows $Y_1$ and $Y_3$ are independent. Is there a simpler solution? Please let me know.

Best Answer

Your answer to (a) is correct.

For part (b), the two random variables are $F$-distributed by construction. We could prove that they're independent by establishing joint independence of $Y_1, Y_2, X_3$: once that holds, $Y_1$ is independent of every function of $(Y_2, X_3)$, in particular of $X_3/Y_2$, which is what we need.

I believe you can do this via the joint moment-generating function. (Since $Y_1$ is a ratio of chi-squares, its MGF is finite only for non-positive arguments; if that bothers you, run exactly the same argument with characteristic functions, i.e. with $\exp(isY_1+itY_2+iuX_3)$.) We have $$ M_{Y_1,Y_2,X_3}(s,t,u) = \mathbb{E}[\exp(sY_1+tY_2+uX_3)]=\mathbb{E}\big[\exp(uX_3)\, \mathbb{E}[\exp(sY_1+tY_2)\mid X_3] \big]. $$ We can drop the conditioning on $X_3$ in the inner expectation, as $(Y_1, Y_2)$ are constructed from $(X_1,X_2)$ only, which are independent of $X_3$. This gives $$ \mathbb{E}[\exp(sY_1+tY_2)\mid X_3]=\mathbb{E}[\exp(sY_1+tY_2)]=M_{Y_1,Y_2}(s,t)=M_{Y_1}(s)M_{Y_2}(t), $$ by independence of $Y_1,Y_2$.

Putting it all together, we have $$ M_{Y_1,Y_2,X_3}(s,t,u)=\mathbb{E}[\exp(uX_3) M_{Y_1}(s)M_{Y_2}(t)] =M_{Y_1}(s)M_{Y_2}(t)\,\mathbb{E}[\exp(uX_3)]=M_{Y_1}(s)M_{Y_2}(t)M_{X_3}(u), $$ so the joint MGF factors and we have joint independence.
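(If you want an empirical sanity check of the conclusion, a short simulation along these lines works; the degrees of freedom are arbitrary choices and this is of course not a proof.)

```python
# Illustrative Monte Carlo check that the two ratios in part (b) look like
# independent F-variables -- not a proof; r1, r2, r3 are arbitrary choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
r1, r2, r3, n = 4, 6, 3, 200_000

x1 = rng.chisquare(r1, size=n)
x2 = rng.chisquare(r2, size=n)
x3 = rng.chisquare(r3, size=n)

f1 = (x1 / r1) / (x2 / r2)                 # should be F(r1, r2)
f2 = (x3 / r3) / ((x1 + x2) / (r1 + r2))   # should be F(r3, r1 + r2)

print(stats.kstest(f1, stats.f(r1, r2).cdf))
print(stats.kstest(f2, stats.f(r3, r1 + r2).cdf))

# Crude independence check: joint probability of two events vs product of marginals.
a, b = np.median(f1), np.median(f2)
print(np.mean((f1 <= a) & (f2 <= b)), np.mean(f1 <= a) * np.mean(f2 <= b))
```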
