Indeed, $f_{A,B}(a,b)=(2\pi)^{-1}\mathbf 1_{0\lt a\lt1}\mathbf 1_{0\lt b\lt2\pi}$ and $(X,Y)=(A\cos B,A\sin B)$ has density $f_{X,Y}$ where, for every $(x,y)$, $f_{X,Y}(x,y)=(2\pi)^{-1}(x^2+y^2)^{-1/2}\mathbf 1_{0\lt x^2+y^2\lt1}$.
Starting from some independent $A$ and $B$ with densities $f_A$ on $[0,1]$ and $f_B$ respectively, where $f_B$ is still defined by $f_B(b)=(2\pi)^{-1}\mathbf 1_{0\lt b\lt2\pi}$, a similar computation yields $$f_{X,Y}(x,y)=(2\pi)^{-1}f_A((x^2+y^2)^{1/2})(x^2+y^2)^{-1/2}\mathbf 1_{0\lt x^2+y^2\lt1}.$$ This is the uniform distribution on the unit disk if $f_A:a\mapsto2a\mathbf 1_{0\lt a\lt1}$.
To achieve this, one can use the original uniform random variables $A$ and $B$ and consider $$(X,Y)=(\sqrt{A}\cos B,\sqrt{A}\sin B).$$
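As a sanity check, here is a small Monte Carlo sketch of the construction above (the function name is my own): sampling $(\sqrt A\cos B,\sqrt A\sin B)$ and verifying that the points look uniform on the unit disk, using the fact that a uniform point satisfies $P(X^2+Y^2\le t^2)=t^2$.

```python
# Monte Carlo check that (sqrt(A) cos B, sqrt(A) sin B) is uniform on the
# unit disk, with A, B independent uniforms on [0,1] and [0, 2*pi].
import math
import random

def sample_disk(n, rng=random):
    """Draw n points on the unit disk via the sqrt trick."""
    pts = []
    for _ in range(n):
        a = rng.random()                 # A ~ Uniform(0, 1)
        b = 2 * math.pi * rng.random()   # B ~ Uniform(0, 2*pi)
        r = math.sqrt(a)                 # sqrt(A) has density 2a on (0, 1)
        pts.append((r * math.cos(b), r * math.sin(b)))
    return pts

# Under the uniform law, P(X^2 + Y^2 <= t^2) = t^2, e.g. 0.25 for t = 0.5.
random.seed(0)
pts = sample_disk(100_000)
frac = sum(x * x + y * y <= 0.25 for x, y in pts) / len(pts)
```

With $10^5$ points the empirical fraction inside radius $1/2$ should sit within a fraction of a percent of $0.25$; without the square root (i.e. using $A$ directly as the radius) it would concentrate near $0.5$ instead.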
I think an easier-to-follow (and simpler) proof uses a different change of variables.
The joint density of the order statistics $(U_1=X_{(1)},\cdots,U_n=X_{(n)})$ is
$$f_{\mathbf U}(u_1,\cdots,u_n)=n!\exp\left[-\sum_{i=1}^nu_i+n\theta\right]\mathbf1_{\theta<u_1<u_2<\cdots<u_n}$$
Now transform $(U_1,\cdots,U_n)\to(Y_1,\cdots,Y_n)$ with $Y_i=(n-i+1)(U_i-U_{i-1})$ for $i=1,2,\cdots,n$, taking $U_0=\theta$.
It follows that $\sum_{i=1}^nu_i=\sum_{i=1}^ny_i+n\theta$. The Jacobian determinant of the map $\mathbf u\mapsto\mathbf y$ is $n!$, which cancels the $n!$ in front of the density.
So the joint density of $(Y_1,\cdots,Y_n)$ is
$$f_{\mathbf Y}(y_1,\cdots,y_n)=\exp\left[-\sum_{i=1}^ny_i\right]\mathbf1_{y_1,\cdots,y_n>0}$$
Not surprisingly, the spacings of successive order statistics from an exponential sample come out independent. In fact, the $Y_i$'s are i.i.d. exponential with mean $1$.
This implies $2Y_i\stackrel{\text{i.i.d.}}{\sim}\chi^2_2$ for $i=1,2,\cdots,n$.
So we have two independent variables $2Y_1$ and $\sum_{i=2}^n2Y_i$. Both have the chi-square distribution --- the former with $2$ degrees of freedom and the latter with $2n-2$ degrees of freedom.
It remains to note that $2Y_1=2n(X_{(1)}-\theta)$ and $2\sum_{i=2}^nY_i=2\sum_{i=2}^n(X_{(i)}-X_{(1)})$.
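The two distributional claims can be checked by simulation (a sketch with my own helper names): $2n(X_{(1)}-\theta)$ should be $\chi^2_2$, with mean $2$, and $2\sum_{i=2}^n(X_{(i)}-X_{(1)})$ should be $\chi^2_{2n-2}$, with mean $2n-2$.

```python
# Simulation sketch of the two pivotal quantities: 2n(X_(1) - theta) has
# mean 2 (chi-square, 2 df) and 2*sum(X_(i) - X_(1)) has mean 2n - 2
# (chi-square, 2n - 2 df).
import random

def pivots(n, theta, rng):
    """One sample of size n from the shifted exponential, returning
    (2n(X_(1) - theta), 2 * sum_{i >= 2} (X_(i) - X_(1)))."""
    xs = sorted(theta + rng.expovariate(1.0) for _ in range(n))
    t1 = 2 * n * (xs[0] - theta)
    t2 = 2 * sum(x - xs[0] for x in xs[1:])
    return t1, t2

rng = random.Random(42)
n, reps = 5, 50_000
draws = [pivots(n, theta=1.3, rng=rng) for _ in range(reps)]
mean_t1 = sum(t for t, _ in draws) / reps   # should be near 2
mean_t2 = sum(t for _, t in draws) / reps   # should be near 2n - 2 = 8
```

With $n=5$ the second mean should sit near $8$; the empirical variances (not computed here) would likewise match $4$ and $2n-2$ times $2$, as for chi-square laws.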
Heuristically, match the probabilities of infinitesimal regions under the polar change of variables $x=r\cos\theta$, $y=r\sin\theta$. Since $R$ and $\Theta$ are independent and $\Theta$ is uniform on $(0,2\pi)$,
$$f_{X,Y}(x,y)\,dx\,dy=f_R(r)\cdot\frac{1}{2\pi}\,dr\,d\theta.$$ Now since $dx\,dy=r\,dr\,d\theta$, this gives $f_{X,Y}(x,y)=\dfrac{f_R(r)}{2\pi r}$ with $r=\sqrt{x^2+y^2}$, the desired result.
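The relation $f_{X,Y}(x,y)=f_R(r)/(2\pi r)$ implicit above can also be probed numerically (the test point and sample sizes below are my own arbitrary choices): take $f_R(r)=2r$ on $(0,1)$, so the right-hand side is the constant $1/\pi$ on the unit disk, and estimate the density at one point by counting hits in a small box.

```python
# Numerical sanity check of f_{X,Y}(x, y) = f_R(r) / (2*pi*r).
# With f_R(r) = 2r on (0, 1), the right-hand side equals 1/pi on the disk.
import math
import random

random.seed(1)
N = 400_000
x0, y0, h = 0.3, 0.4, 0.05               # box of side 2h around (x0, y0)
hits = 0
for _ in range(N):
    r = math.sqrt(random.random())        # R with density 2r on (0, 1)
    t = 2 * math.pi * random.random()     # Theta uniform, independent of R
    x, y = r * math.cos(t), r * math.sin(t)
    if abs(x - x0) <= h and abs(y - y0) <= h:
        hits += 1
density_est = hits / N / (2 * h) ** 2     # estimate of f_{X,Y}(x0, y0)
```

The estimate should land close to $1/\pi\approx0.318$, and the same value would be obtained at any other interior point of the disk, consistent with uniformity.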