# [Math] Box-Muller Independence Proof by Change of Variables (Help finding the Inverses)

Tags: inverse, random-variables, self-learning, statistics

Let $X_1=\cos(2 \pi U_1)\sqrt{-2 \log(U_2)}$ and $X_2=\sin(2 \pi U_1)\sqrt{-2 \log(U_2)}$, where $U_1$ and $U_2$ are iid Uniform$(0,1)$. Prove that $X_1$ and $X_2$ are independent $N(0,1)$ random variables.
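As a quick sanity check before the proof, the transformation can be simulated; a minimal sketch (the function name `box_muller` is my own) that estimates the means, variances, and covariance of $X_1$ and $X_2$ from uniform draws:

```python
import math
import random

def box_muller(u1, u2):
    """Map two Uniform(0,1) draws to (X1, X2) via the Box-Muller transform."""
    r = math.sqrt(-2.0 * math.log(u2))
    return math.cos(2.0 * math.pi * u1) * r, math.sin(2.0 * math.pi * u1) * r

random.seed(0)
samples = [box_muller(random.random(), random.random()) for _ in range(100_000)]

# Sample moments: should be near mean 0, variance 1, covariance 0.
x1s = [s[0] for s in samples]
x2s = [s[1] for s in samples]
mean1 = sum(x1s) / len(x1s)
var1 = sum(x * x for x in x1s) / len(x1s)
cov = sum(a * b for a, b in samples) / len(samples)
```

Zero empirical covariance is of course only consistent with independence, not a proof of it; the change-of-variables argument below is what actually establishes it.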

So my approach (and one of the proofs I was shown) did this through change of variables, where
$g_1(u_1,u_2)=\cos(2 \pi u_1)\sqrt{-2 \log(u_2)}$ and $g_2(u_1,u_2)=\sin(2 \pi u_1)\sqrt{-2 \log(u_2)}$, and essentially found their inverses $g_1^{-1}(x_1,x_2)$ and $g_2^{-1}(x_1,x_2)$, and used those to find the Jacobian (essentially the approach in Box-Muller Transform Normality).

I'm wondering how they found those inverses? The rest of the proof I get, but I am not as familiar with a multi-variable inverse.

Given that $X_1=\cos(2 \pi U_1)\sqrt{-2 \log(U_2)}$ and $X_2=\sin(2 \pi U_1)\sqrt{-2 \log(U_2)}$, to find their inverses it is necessary to express $U_1$ and $U_2$ in terms of $X_1$ and $X_2$.
To isolate $U_1$ by eliminating $U_2$, simply divide $X_2$ by $X_1$, resulting in $$\frac{X_2}{X_1}=\tan(2\pi U_1)\Rightarrow U_1=\frac{1}{2\pi}\arctan\left(\frac{X_2}{X_1}\right)$$ (Strictly speaking, $\arctan$ only determines $2\pi U_1$ up to the quadrant of $(X_1,X_2)$, so the inversion is done branch by branch; the Jacobian is the same on each branch.) To isolate $U_2$ by eliminating $U_1$, use $\cos^2+\sin^2=1$ and square and add: $$X_1^2+X_2^2=-2\log(U_2)\Rightarrow U_2=\exp\left(-\frac{1}{2}(X_1^2+X_2^2)\right)$$ The Jacobian is then calculated by evaluating $\frac{\partial U_1}{\partial X_1}$, $\frac{\partial U_1}{\partial X_2}$, $\frac{\partial U_2}{\partial X_1}$ and $\frac{\partial U_2}{\partial X_2}$.