As you write, with $X_i \overset{i.i.d.}{\sim} U(0,\theta)$,
$$Y_i = X_i/\theta \Rightarrow Y_i \sim U(0,1), \;\; -2\ln(Y_i) \sim \chi_{(2)}^2$$
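(Not part of the derivation, but if you want a quick numerical check of that last claim, here is a minimal simulation sketch, assuming NumPy/SciPy are available; the value of $\theta$ is an arbitrary illustration choice.)

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
theta = 3.0                              # arbitrary illustration value
x = rng.uniform(0, theta, size=100_000)  # X_i ~ U(0, theta)
w = -2 * np.log(x / theta)               # -2 ln(Y_i), claimed chi^2_(2)

# A large p-value is consistent with the chi^2_(2) claim.
print(stats.kstest(w, stats.chi2(df=2).cdf))
```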
Consider
$$Z_n=\left(\prod_{i=1}^nX_i\right)^{1/n} = \left(\prod_{i=1}^n(\theta Y_i)\right)^{1/n}$$
$$\ln Z_n = \frac 1n \sum_{i=1}^n \left(\ln \theta + \ln Y_i\right) = \ln \theta + \frac 1n \sum_{i=1}^n \ln Y_i$$
$$\Rightarrow 2\ln \theta - 2\ln Z_n = \frac 1n \sum_{i=1}^n (-2\ln Y_i)$$
The sum of independent chi-squares is again chi-square, with degrees of freedom equal to the sum of the degrees of freedom of the components. So
$$\sum_{i=1}^n (-2\ln Y_i) = W \sim \chi_{(2n)}^2$$
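Again purely as a numerical sanity check of this step (a sketch only; the $n$ below is an arbitrary choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps = 5, 100_000                     # arbitrary illustration values
y = rng.uniform(size=(reps, n))          # n i.i.d. U(0,1) per replication
w = (-2 * np.log(y)).sum(axis=1)         # one W per replication

# A large p-value is consistent with W ~ chi^2_(2n).
print(stats.kstest(w, stats.chi2(df=2 * n).cdf))
```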
Then also,
$$G=\frac 1n W \sim \text{Gamma}(k= n,\; m = 2/n)$$
(shape-scale parametrization). This Gamma distribution has mean $E(G) = km = 2$ and variance $\operatorname{Var}(G) = km^2 = 4/n$. Centered and scaled, this r.v. converges in distribution to a standard normal as $k=n \rightarrow \infty$:
$$\frac {G-2}{\sqrt{4/n}} \rightarrow_d N(0,1),\;\; n\rightarrow \infty$$
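A sketch of this normal approximation, drawing $W\sim\chi^2_{(2n)}$ directly from SciPy rather than from uniforms (the $n$ used is just an illustration value):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, reps = 200, 100_000                            # arbitrary illustration values
g = stats.chi2(df=2 * n).rvs(size=reps, random_state=rng) / n   # G = W/n

print(g.mean(), g.var())                          # ~2 and ~4/n = 0.02
z = (g - 2) / np.sqrt(4 / n)                      # centered and scaled
print(stats.kstest(z, stats.norm.cdf).statistic)  # small; shrinks as n grows
```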
But we also have
$$\frac {G-2}{\sqrt{4/n}} = \sqrt{n}\frac {2\ln \theta - 2\ln Z_n-2}{2} = \sqrt{n}\left(\ln \theta - \ln Z_n-1\right)$$
So the random variable
$$Q_n = \sqrt{n}\left(\ln \theta - \ln Z_n-1\right) \rightarrow_d Q \sim N(0,1)$$
Then by the continuous mapping (Mann-Wald) theorem
$$ Q_n^2 \rightarrow_d Q^2 \sim \chi_{(1)}^2$$
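Putting the pieces together, here is an end-to-end simulation sketch (with arbitrary $\theta$ and $n$): it forms $Q_n$ from uniform samples, checks that its first two moments match $N(0,1)$, and checks that $Q_n^2$ exceeds the $\chi^2_{(1)}$ 95% quantile about 5% of the time.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 500, 50_000          # arbitrary illustration values
x = rng.uniform(0, theta, size=(reps, n))  # X_i ~ U(0, theta)
log_zn = np.log(x).mean(axis=1)            # ln Z_n for each replication
q = np.sqrt(n) * (np.log(theta) - log_zn - 1)   # Q_n

print(q.mean(), q.var())                            # ~0 and ~1
print((q**2 > stats.chi2(df=1).ppf(0.95)).mean())   # ~0.05
```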
Best Answer
$\newcommand{\E}{\mathrm{E}}$ $\newcommand{\Var}{\mathrm{Var}}$ $\newcommand{\cov}{\mathrm{Cov}}$ $\newcommand{\Expect}{{\rm I\kern-.3em E}}$
Write $X = \min(U,V)$ and $Y = \max(U,V)$. As a direct consequence of the definition of covariance, $\cov (X,Y)= \E(XY)-\E(X)\E(Y)$.
Fact 1:
$U, V \overset{i.i.d.}{\sim} \mathcal{N}(0,1)$
$\Rightarrow U - V \sim \mathcal{N}(0,2)$ (a difference of independent normal random variables is again normal)
$ \Rightarrow |U - V|$ is a half-normal random variable with parameter $\sigma = \sqrt2$
$ \Rightarrow \E (|U - V|) = \frac{\sigma\sqrt{2}}{\sqrt{\pi}} = \frac{\sqrt{2}\sqrt{2}}{\sqrt{\pi}} = \frac{2}{\sqrt{\pi}}$
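(A minimal Monte Carlo sketch of Fact 1, assuming NumPy; the sample size is arbitrary:)

```python
import numpy as np

rng = np.random.default_rng(4)
u, v = rng.standard_normal((2, 1_000_000))        # U, V i.i.d. N(0,1)
print(np.abs(u - v).mean(), 2 / np.sqrt(np.pi))   # both ~1.128
```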
Fact 2:
$\E(X)+\E(Y) = \E(X+Y)$ (linearity of the expectation). We have $\E(X+Y) = \E (\min(U,V)+\max(U,V))= \E(U+V) = \E(U)+\E (V) = 0 + 0 = 0$. As a result, $\E(Y) = -\E(X)$.
Fact 3:
Since $Y-X = |U - V|$:
$2\E(Y) = \E(Y)-\E(X) = \E(Y-X) = \E (|U - V|)= \frac{2}{\sqrt{\pi}}$, hence $\E(Y)= \frac{2}{2\sqrt{\pi}}= \frac{1}{\sqrt{\pi}}$
Fact 4:
Since $XY=UV$, we have $\E(XY)=\E(UV)=\E (U)\E (V)=0$
Using these facts: $\cov (X,Y)= \E(XY)-\E(X)\E(Y)= 0 - (-\E(Y))\E(Y) = \E(Y)\E(Y) = \frac{1}{\sqrt{\pi}}\cdot\frac{1}{\sqrt{\pi}}=\frac{1}{\pi}$, where the second equality uses $\E(X)=-\E(Y)$ from Fact 2.
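As a closing sanity check, a short Monte Carlo sketch of $\E(Y)$ and $\cov(X,Y)$ (sample size chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(5)
u, v = rng.standard_normal((2, 1_000_000))   # U, V i.i.d. N(0,1)
x, y = np.minimum(u, v), np.maximum(u, v)    # X = min, Y = max
print(y.mean(), 1 / np.sqrt(np.pi))          # E(Y) ~ 0.5642
print(np.cov(x, y)[0, 1], 1 / np.pi)         # Cov(X, Y) ~ 0.3183
```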