A quick Google search led me to this article: Estimating the Correlation in Bivariate Normal Data With Known Variances and Small Sample Sizes. The article discusses several possible priors: "uniform", "Jeffreys", and "arc-sine". Take the "uniform" prior, for example: it assumes that $\rho$ follows a uniform distribution on $[-1,1]$.
This leads to a density of the form:
$$
f(\rho | X, Y) \propto \prod\limits_i{P(x_i, y_i | \rho)}
$$
If a full Bayesian estimator is desired, then the estimator $\hat{\rho}^{(6)}$ on page 35 of that article needs to be computed. This can be done using Monte Carlo integration.
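As a sketch of that Monte Carlo integration (my own, not code from the article): assuming the data have been standardized so both margins have known unit variance, one can draw $\rho$ from the Uniform$(-1,1)$ prior and weight each draw by its likelihood. The function name and all implementation details below are assumptions of mine.

```python
import numpy as np

def posterior_mean_rho(x, y, m=100_000, rng=None):
    """Monte Carlo estimate of E(rho | data) under a Uniform(-1, 1) prior.

    Assumes x and y are standardized (known unit variances), as in the
    known-variance setting of the cited article.
    """
    rng = np.random.default_rng() if rng is None else rng
    rho = rng.uniform(-1.0, 1.0, size=m)           # draws from the prior
    rho = np.clip(rho, -1 + 1e-12, 1 - 1e-12)      # guard the measure-zero endpoints
    n = len(x)
    sxx, syy, sxy = np.sum(x * x), np.sum(y * y), np.sum(x * y)
    # log-likelihood of n i.i.d. standardized bivariate normal pairs
    ll = (-0.5 * n * np.log(1.0 - rho**2)
          - (sxx - 2.0 * rho * sxy + syy) / (2.0 * (1.0 - rho**2)))
    w = np.exp(ll - ll.max())                      # stabilized weights
    return np.sum(rho * w) / np.sum(w)
```

Replacing the weighted mean by the draw with the largest weight would give a crude MAP estimate instead.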
Or, if one only needs the MAP estimate of $\rho$, it can be obtained via Gibbs sampling. I followed one citation of the first article: (Barnard 2000) Modeling Covariance Matrices in Terms of Standard Deviations and Correlations, with Application to Shrinkage. That paper discusses in detail how a uniform prior for $\rho$ is derived from the Inverse-Wishart distribution for the covariance matrix $\Sigma$ (its Equation (8)) and how Gibbs sampling is carried out.
I also tried searching for other references, but was unable to find a conjugate prior for $\rho$ or an approach that yields the Bayesian estimate of $\rho$ without numerical integration or sampling. It is an interesting problem to look at, by the way.
As has been made clear in the comments, the OP is interested in the Likelihood ratio when the common variance is also estimated, and not known.
The joint density of one pair $\{X_i, Y_i\}$, given also the maintained assumptions on the parameter values, is
$$ f(x_i,y_i) = \frac{1}{2 \pi \sigma^2\sqrt{3/4}} \ \exp\left\{
-\frac{2}{3}\left[
\frac{(x_i-\mu_x)^2}{\sigma^2} +
\frac{(y_i-\mu_y)^2}{\sigma^2} -
\frac{(x_i-\mu_x)(y_i-\mu_y)}{\sigma^2}
\right]
\right\}$$
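A quick numerical sanity check (my own addition, not part of the derivation): the specialized form above should agree with the general equal-variance bivariate normal density evaluated at $\rho = 1/2$.

```python
import numpy as np

def bvn_pdf(x, y, mu_x, mu_y, sigma2, rho):
    """General bivariate normal pdf with common variance sigma2."""
    q = ((x - mu_x)**2 - 2*rho*(x - mu_x)*(y - mu_y) + (y - mu_y)**2) / sigma2
    return np.exp(-q / (2*(1 - rho**2))) / (2*np.pi*sigma2*np.sqrt(1 - rho**2))

def bvn_pdf_half(x, y, mu_x, mu_y, sigma2):
    """The specialized rho = 1/2 form displayed above."""
    q = ((x - mu_x)**2 + (y - mu_y)**2 - (x - mu_x)*(y - mu_y)) / sigma2
    return np.exp(-(2.0/3.0) * q) / (2*np.pi*sigma2*np.sqrt(3.0/4.0))
```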
So the joint Likelihood of the sample (not log likelihood) is
$$ L(\mu_x, \mu_y, \sigma^2 \mid \mathbf x, \mathbf y, \rho=1/2) = \left(\frac{1}{2 \pi \sigma^2\sqrt{3/4}}\right)^n \\
\times \exp\left\{
-\frac{2}{3\sigma^2}\left[
\sum_{i=1}^n(x_i -\mu_x)^2 +\sum_{i=1}^n(y_i -\mu_y)^2
- \sum_{i=1}^n(x_i-\mu_x)(y_i-\mu_y) \right]
\right\}$$
Denote by $L_1$ the likelihood maximized at the sample means (the MLEs for the true means), and by $L_0$ the likelihood with the means set equal to zero. Then the Likelihood Ratio (not its log) is
$$ LR \equiv \frac {L_0}{L_1} = \frac {\hat \sigma^{2n}_1\cdot \exp\left\{
-(2/3\hat \sigma^2_0)\cdot\left[
\sum_{i=1}^nx_i^2 +\sum_{i=1}^ny_i^2
- \sum_{i=1}^nx_iy_i \right]
\right\}}{\hat \sigma^{2n}_0 \cdot\exp\left\{
-(2/3\hat \sigma^2_1)\cdot\left[
\sum_{i=1}^nx_i^2 -n\bar x^2 +\sum_{i=1}^ny_i^2 -n\bar y^2
- \sum_{i=1}^nx_iy_i+n\bar x\bar y \right]
\right\}}$$
where $\hat \sigma^2_1$ is the estimate with unconstrained means and $\hat \sigma^2_0$ is the estimate with the means constrained to zero.
The OP has (correctly) calculated the MLEs for the common variance as
$$\hat \sigma^2_1 = \frac{2}{3n} \sum_{i=1}^n \left[ \left(x_i-\bar{x}\right)^2+\left(y_i-\bar{y} \right)^2-\left(x_i-\bar{x}\right)\left(y_i-\bar{y} \right) \right]$$
$$\hat \sigma^2_0 = \frac{2}{3n} \sum_{i=1}^n \left( x_i^2+y_i^2-x_iy_i \right)$$
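These estimators are easy to verify numerically (a sketch of my own; the simulated sample, seed, and parameter values are arbitrary): at each MLE of $\sigma^2$ the exponent of the likelihood collapses to $-n$, so the difference of maximized log-likelihoods can be compared directly with $n\log(\hat\sigma^2_1/\hat\sigma^2_0)$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
z = rng.standard_normal((2, n))
x = 0.7 + z[0]                               # common variance 1, rho = 1/2,
y = 1.2 + 0.5 * z[0] + np.sqrt(0.75) * z[1]  # nonzero true means

def sigma2_hat(mx, my):
    return (2 / (3 * n)) * np.sum((x - mx)**2 + (y - my)**2
                                  - (x - mx) * (y - my))

s2_1 = sigma2_hat(x.mean(), y.mean())   # unconstrained means
s2_0 = sigma2_hat(0.0, 0.0)             # means constrained to zero

def max_loglik(s2):
    # at the MLE of sigma^2 the exponent equals -n
    return -n * np.log(2 * np.pi * s2 * np.sqrt(0.75)) - n

log_LR_direct  = max_loglik(s2_0) - max_loglik(s2_1)
log_LR_formula = n * np.log(s2_1 / s2_0)
```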
If we plug these into the LR, each exponent equals $-n$ at the corresponding MLE, so the exponentials in the numerator and the denominator cancel and we are left simply with
$$ LR = \frac {\hat \sigma^{2n}_1}{\hat \sigma^{2n}_0 } $$
Our goal is not to derive the LR per se; it is to find a statistic with which to run the test we are interested in. So let's consider the quantity (which is the reciprocal of the quantity presented in the question)
$$\left(LR\right)^{-1/n} = \frac {\hat \sigma^{2}_0}{\hat \sigma^{2}_1}$$
$$ = \frac{2}{3n} \frac{\sum_{i=1}^{n}x_i^2+\sum_{i=1}^{n}y_i^2-\sum_{i=1}^n x_iy_i }{\hat \sigma^{2}_1}$$
$$= \frac {1}{3n}\cdot\left[\sum_{i=1}^n\left(\frac {x_i}{\hat \sigma_1}\right)^2 + \sum_{i=1}^n\left(\frac {y_i}{\hat \sigma_1}\right)^2 + \sum_{i=1}^n\left(\frac {x_i-y_i}{\hat \sigma_1}\right)^2\right]$$
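The last step uses the algebraic identity $2\left(\sum x_i^2 + \sum y_i^2 - \sum x_i y_i\right) = \sum x_i^2 + \sum y_i^2 + \sum (x_i - y_i)^2$, which a two-line check confirms (arbitrary simulated data, my own sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
x, y = rng.standard_normal(25), rng.standard_normal(25)
lhs = (2/3) * (np.sum(x**2) + np.sum(y**2) - np.sum(x * y))
rhs = (1/3) * (np.sum(x**2) + np.sum(y**2) + np.sum((x - y)**2))
```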
Note that $\hat \sigma^{2}_1$ is a consistent estimator of the true variance, irrespective of whether the true means are zero or not. Also (given equal variances and $\rho =1/2$),
$$Z_i = X_i - Y_i \sim N(\mu_x-\mu_y, \sigma^2)$$
Under the null of zero means, each of $(x_i/\hat \sigma_1)^2$, $(y_i/\hat \sigma_1)^2$ and $(z_i/\hat \sigma_1)^2$ is a chi-square with one degree of freedom (and i.i.d. within each sum). Denoting the three sums $S_x, S_y, S_z$ for compactness, each has expected value $n$ and standard deviation $\sqrt {2n}$ (under the null).
So subtract and add $n$ three times, divide and multiply by $\sqrt {2n}$, and re-arrange to get
$$\sqrt {n}\left[\left(LR\right)^{-1/n} - 1\right] = \frac {\sqrt 2}{3}\cdot\left[\frac {S_x - E(S_x)}{SD(S_x)} + \frac {S_y - E(S_y)}{SD(S_y)} + \frac {S_z - E(S_z)}{SD(S_z)}\right]$$
The three terms inside the bracket are the subject matter of the Central Limit Theorem, so each converges in distribution to a standard normal. Therefore we have arrived (due to the initial bivariate normality) at
$$\frac {3}{\sqrt 2} \left[\sqrt{n}\left(LR\right)^{-1/n} -1\right] \xrightarrow{d} N(0, AV)$$
Of course, in order to actually use the left-hand side as a statistic in a test, we need to derive the asymptotic variance, but for the moment I do not feel up to the task. I just note that one should determine whether the three $S$'s are asymptotically independent or not; since all three share $\hat \sigma_1$ in the denominator, they are certainly correlated in finite samples.
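Since the asymptotic variance is left open, a small simulation (my own sketch; zero means, $\sigma^2 = 1$, $\rho = 1/2$, with arbitrary seed and sample sizes) can at least be used to inspect the finite-sample behaviour of the left-hand side; no limiting value is asserted here.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 100, 500
t = np.empty(reps)
for r in range(reps):
    z = rng.standard_normal((2, n))
    x = z[0]
    y = 0.5 * z[0] + np.sqrt(0.75) * z[1]    # rho = 1/2, zero means (the null)
    s2_1 = (2 / (3 * n)) * np.sum((x - x.mean())**2 + (y - y.mean())**2
                                  - (x - x.mean()) * (y - y.mean()))
    s2_0 = (2 / (3 * n)) * np.sum(x**2 + y**2 - x * y)
    t[r] = (3 / np.sqrt(2)) * np.sqrt(n) * (s2_0 / s2_1 - 1.0)
av_hat = t.var()    # empirical variance of the candidate statistic
```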
Best Answer
Your original question does not require you to find the distribution of $XY$ when $(X,Y)$ is jointly normal. Here is a hint for that question:
Let $Z_i=X_iY_i$, so that $Z_1,Z_2,\ldots,Z_n$ are i.i.d. variables. Hence, by the classical CLT, we have
$$\frac{\sqrt n(W_n-\operatorname E(Z_1))}{\sqrt{\operatorname{Var}(Z_1)}}\stackrel{L}\longrightarrow N(0,1)\tag{*}$$
where $W_n=\frac{1}{n} \sum\limits_{i=1}^n Z_i$.
Since we know the conditional distributions of a bivariate normal distribution, use the law of total expectation to find $\operatorname E(Z_1)$:
$$\operatorname E(Z_1)=\operatorname E\left[\operatorname E(X_1Y_1\mid Y_1)\right]=\operatorname E\left[Y_1 \operatorname E(X_1\mid Y_1)\right]$$
Do the same for $\operatorname E(Z_1^2)$, and hence find $\operatorname{Var}(Z_1)=\operatorname{E}(Z_1^2)-[\operatorname{E}(Z_1)]^2$.
Now $(*)$ actually gives you a pivot for constructing an asymptotic confidence interval for $\rho$.
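For a concrete check of the first expectation (my own sketch, with arbitrary parameter values): since $\operatorname E(X_1\mid Y_1)=\mu_x+\rho\frac{\sigma_x}{\sigma_y}(Y_1-\mu_y)$, the tower property gives $\operatorname E(Z_1)=\mu_x\mu_y+\rho\sigma_x\sigma_y$, which simulation confirms:

```python
import numpy as np

mu_x, mu_y, s_x, s_y, rho = 1.0, 2.0, 1.0, 1.5, 0.6
rng = np.random.default_rng(4)
n = 500_000
z = rng.standard_normal((2, n))
x = mu_x + s_x * z[0]
y = mu_y + s_y * (rho * z[0] + np.sqrt(1 - rho**2) * z[1])
w_n = np.mean(x * y)                       # sample analogue of W_n, estimates E(Z_1)
theory = mu_x * mu_y + rho * s_x * s_y     # from the tower-property derivation
```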