We have independent $X_1\sim N(\mu,\sigma^2)$ and $X_2\sim N(\mu,\sigma^2)$ (a common mean $\mu$ is needed for the means to cancel below), hence
$$EY_1=E(-X_1/\sqrt{2}+X_2/\sqrt{2})=-\frac{1}{\sqrt{2}}EX_1+\frac{1}{\sqrt{2}}EX_2=-\frac{\mu}{\sqrt{2}}+\frac{\mu}{\sqrt{2}}=0$$
\begin{align*}
EY_1^2&=E(-X_1/\sqrt{2}+X_2/\sqrt{2})^2\\
&=\frac{1}{2}EX_1^2-E(X_1X_2)+\frac{1}{2}EX_2^2\\
&=\frac{\mu^2+\sigma^2}{2}-\mu^2+\frac{\mu^2+\sigma^2}{2}=\sigma^2,
\end{align*}
where $E(X_1X_2)=EX_1\,EX_2=\mu^2$ by independence.
Hence $Y_1\sim N(0,\sigma^2)$, since a linear combination of independent normal variables is again normal.
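As a quick sanity check (not a substitute for the proof), you can verify the mean and variance of $Y_1$ by simulation; the values of $\mu$ and $\sigma$ below are arbitrary illustration choices:

```r
# Monte Carlo sanity check: Y1 = -X1/sqrt(2) + X2/sqrt(2) should be N(0, sigma^2).
# mu and sigma are arbitrary illustration values.
set.seed(1)
mu <- 3; sigma <- 2; n <- 1e5
x1 <- rnorm(n, mean = mu, sd = sigma)
x2 <- rnorm(n, mean = mu, sd = sigma)
y1 <- -x1/sqrt(2) + x2/sqrt(2)
c(mean = mean(y1), var = var(y1))  # approximately 0 and sigma^2 = 4
```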
Similarly we get $Y_2\sim N(0,\sigma^2)$ and $Y_3\sim N(0,\sigma^2)$.
Now
$$EY_1Y_2=\frac{1}{\sqrt{6}}EX_1^2-\frac{1}{\sqrt{6}}EX_2^2=0,$$
since $EX_1^2=EX_2^2=\mu^2+\sigma^2$. Similarly $EY_2Y_3=EY_1Y_3=0$, hence $Y_1$, $Y_2$ and $Y_3$ are independent: $(Y_1,Y_2,Y_3)$ is jointly normal, being a linear transformation of the jointly normal $(X_1,X_2,X_3)$, and for jointly normal variables independence coincides with zero correlation.
Having established that, we have
$$(Y_1^2+Y_2^2+Y_3^2)/\sigma^2=\left(\frac{Y_1}{\sigma}\right)^2+\left(\frac{Y_2}{\sigma}\right)^2+\left(\frac{Y_3}{\sigma}\right)^2=Z_1^2+Z_2^2+Z_3^2,$$
where $Z_i=Y_i/\sigma$. Since $Y_i\sim N(0,\sigma^2)$, we have $Z_i\sim N(0,1)$.
We have shown that our quantity of interest is a sum of squares of 3 independent standard normal variables, which by definition has a $\chi^2$ distribution with 3 degrees of freedom.
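If you want to see this numerically, here is a minimal R sketch comparing the simulated sum of three squared standard normals with theoretical $\chi^2_3$ quantiles:

```r
# Compare Z1^2 + Z2^2 + Z3^2 with the chi-squared(3) distribution.
set.seed(2)
n <- 1e5
s <- rnorm(n)^2 + rnorm(n)^2 + rnorm(n)^2
probs <- c(0.1, 0.25, 0.5, 0.75, 0.9)
rbind(simulated = quantile(s, probs), theoretical = qchisq(probs, df = 3))
```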
As I've said in the comments, you do not need to calculate the densities. If, on the other hand, you want to do that, your formula is wrong. Here is why. Denote by $G(x)$ the distribution function of $Y_1^2$ and by $F(x)$ the distribution function of $Y_1$. Then for $x>0$ we have
$$G(x)=P(Y_1^2<x)=P(-\sqrt{x}<Y_1<\sqrt{x})=F(\sqrt{x})-F(-\sqrt{x})$$
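This identity is easy to check numerically; in the sketch below `pnorm` plays the role of $F$, and $\sigma$ and $x$ are arbitrary illustration values:

```r
# Check G(x) = F(sqrt(x)) - F(-sqrt(x)) at a single point x.
set.seed(3)
sigma <- 2; x <- 5
mean(rnorm(1e6, 0, sigma)^2 < x)                      # empirical P(Y1^2 < x)
pnorm(sqrt(x), 0, sigma) - pnorm(-sqrt(x), 0, sigma)  # F(sqrt(x)) - F(-sqrt(x))
```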
Now the density of $Y_1^2$ is $G'(x)$, so
$$G'(x)=\frac{1}{2\sqrt{x}}\left(F'(\sqrt{x})+F'(-\sqrt{x})\right)$$
We have that
$$F'(x)=\frac{1}{\sigma\sqrt{2\pi}}e^{-\frac{x^2}{2\sigma^2}},$$
so
$$G'(x)=\frac{1}{\sigma\sqrt{2\pi x}}e^{-\frac{x}{2\sigma^2}}$$
If $\sigma^2=1$ this is the pdf of a $\chi^2$ distribution with one degree of freedom. (Note that for $Z_1$ instead of $Y_1$ the calculation is similar, with $\sigma^2=1$.) As @whuber pointed out, this is a gamma distribution, and a sum of independent gamma variables with the same scale parameter is again gamma; the exact formula is given on the Wikipedia page.
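To double-check the algebra, you can overlay the derived density on a histogram of simulated values of $Y_1^2$ (again with an arbitrary $\sigma$):

```r
# Overlay the derived density of Y1^2 on a simulated histogram.
set.seed(4)
sigma <- 2
y1sq <- rnorm(1e5, 0, sigma)^2
hist(y1sq, breaks = 100, freq = FALSE, xlim = c(0, 20), main = "Density of Y1^2")
curve(1/(sigma*sqrt(2*pi*x)) * exp(-x/(2*sigma^2)),
      from = 0.01, to = 20, add = TRUE, col = "red", lwd = 2)
```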
Your answer to (a) is correct.
For part (b), the two random variables are $F$-distributed by construction. We could prove that they're independent by establishing joint independence of $Y_1, Y_2, X_3$.
I believe you can do this via the joint moment-generating function. We have
$$
M_{Y_1,Y_2,X_3}(s,t,u) = \mathbb{E}[\exp(sY_1+tY_2+uX_3)]=\mathbb{E}\bigl[\exp(uX_3)\, \mathbb{E}[\exp(sY_1+tY_2)\mid X_3]\bigr]
$$
We can drop the conditioning on $X_3$ in the inner expectation, as
$ (Y_1, Y_2) $ are constructed from $(X_1,X_2)$ only, and $(X_1,X_2)$ is independent of $X_3$. This gives
$$
\mathbb{E}[\exp(sY_1+tY_2)\mid X_3]=\mathbb{E}[\exp(sY_1+tY_2)]=M_{Y_1,Y_2}(s,t)=M_{Y_1}(s)M_{Y_2}(t),
$$
by independence of $Y_1,Y_2$.
Putting it all together, we have
$$
M_{Y_1,Y_2,X_3}(s,t,u)=\mathbb{E}[\exp(uX_3) M_{Y_1}(s)M_{Y_2}(t)] =M_{Y_1}(s)M_{Y_2}(t)M_{X_3}(u)
$$
so the joint MGF factors into the product of the marginal MGFs, which is equivalent to joint independence.
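Here is a small Monte Carlo illustration of the factorization. Since the construction of $Y_1,Y_2$ isn't reproduced in this answer, the code uses hypothetical stand-ins that are functions of $(X_1,X_2)$ only, with $X_1,X_2,X_3$ iid standard normal:

```r
# Monte Carlo check: the joint MGF should equal the product of marginal MGFs.
# Y1 and Y2 are hypothetical stand-ins built from (X1, X2) only.
set.seed(5)
n <- 1e6
x1 <- rnorm(n); x2 <- rnorm(n); x3 <- rnorm(n)
y1 <- (x2 - x1)/sqrt(2)
y2 <- (x1 + x2)/sqrt(2)
s <- 0.3; t <- -0.5; u <- 0.7
mean(exp(s*y1 + t*y2 + u*x3))                        # joint MGF at (s, t, u)
mean(exp(s*y1)) * mean(exp(t*y2)) * mean(exp(u*x3))  # product of marginal MGFs
```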
Best Answer
Let $X=X_1,$ $Y=(X_2+X_3)/\sqrt2,$ and $Z=(X_2-X_3)/\sqrt2.$ As in your question, it is apparent that these are iid standard Normal. Moreover,
$$U = \frac{X^2}{Y^2}\ \text{ and }\ V = 2\frac{Z^2}{X^2 + Y^2}.$$
Consider, then, how $X/Y$ and $X^2+Y^2$ are related. The latter is the squared distance of $(X,Y)$ from the origin, while the former is the cotangent of the angle $\Theta$ made by $(X,Y)$ with the $X$-axis. Because the joint distribution of iid standard Normal variables is spherically symmetric, the angle and the distance are independent.
Recall that any (measurable) functions of independent variables are independent; call these functions $f$ and $g.$ Beginning with the variables $\Theta$ and $(R, Z) = (\sqrt{X^2+Y^2}, Z),$ we have just seen that $\Theta$ is independent of $(R,Z).$ Observing that we may express $V = 2Z^2/R^2 = g(R,Z)$ and $U = \cot^2(\Theta) = f(\Theta),$ we immediately see that $(U,V)$ are independent.
In retrospect, it is evident $(\Theta, R, Z)$ is a cylindrical coordinate system for the original Cartesian coordinates $(X_1,X_2,X_3).$ Thus, if you prefer an explicitly rigorous, Calculus-based derivation, consider computing the joint distribution function in these cylindrical coordinates: it should separate into a term for $\theta$ and a term for $(r,z).$
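A quick empirical illustration of the sphericity argument: for iid standard normal $(X,Y)$, the rank correlation between the angle and the distance should be near zero.

```r
# The angle and the distance of an iid normal pair are independent:
# their rank (Spearman) correlation should be approximately zero.
set.seed(6)
n <- 1e5
x <- rnorm(n); y <- rnorm(n)
theta <- atan2(y, x)
r <- sqrt(x^2 + y^2)
cor(theta, r, method = "spearman")
```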
BTW, this is a challenge for those of us who like to draw inspiration from simulations: both $U$ and $V$ have infinite means and, in even fairly large simulations, the $(U,V)$ scatterplot can look decidedly dependent. Here, for instance, is a log-log plot for 4,000 random $(U,V)$ pairs.
The lack of many extreme values of $U$ makes it look like $V$ tends to be high when $U$ is extreme.
Here's the R simulation code.
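A minimal sketch of such a simulation (4,000 pairs on log-log axes, following the construction above) might look like this:

```r
# Simulate 4,000 (U, V) pairs from the construction above and plot on log-log axes.
set.seed(7)
n <- 4000
x1 <- rnorm(n); x2 <- rnorm(n); x3 <- rnorm(n)
x <- x1
y <- (x2 + x3)/sqrt(2)
z <- (x2 - x3)/sqrt(2)
u <- x^2 / y^2
v <- 2 * z^2 / (x^2 + y^2)
plot(u, v, log = "xy", pch = 16, cex = 0.4,
     xlab = "U", ylab = "V", main = "4,000 simulated (U, V) pairs")
```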