Let $T=-\log(X_1 \cdots X_n).$ This is positive with probability $1$. The likelihood-ratio test statistic as derived in the question is an increasing function of $T^n \exp(-\theta_0 T).$ This has its largest possible value when $T = n/\theta_0 >0$ and it decreases as $T$ moves away from $n/\theta_0$ in either direction, approaching $0$ as $T\downarrow 0$ or $T\to\infty.$ Thus you reject $H_0$ if $T$ is either too big or too small. If you find $c_1$ and $c_2$ so that $\Pr(T<c_1) = \alpha/2 = \Pr(T>c_2),$ then you have a test of the desired size $\alpha$.
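As a sketch of how $c_1$ and $c_2$ might be found numerically: *assuming* each $X_i$ has density $\theta x^{\theta-1}$ on $(0,1)$ (so that $-\log X_i$ is exponential with rate $\theta$ and $T\sim\text{Gamma}(n,\,1/\theta_0)$ under $H_0$ — this assumption comes from the usual form of this exercise, not from the answer itself), the two cutoffs are the $\alpha/2$ and $1-\alpha/2$ quantiles of that Gamma distribution, approximated here by Monte Carlo with only the standard library:

```python
import random

# Assumption (not stated above): X_i ~ theta * x^(theta-1) on (0,1), so under
# H0 each -log(X_i) ~ Exp(theta_0) and T = -log(X_1 ... X_n) ~ Gamma(n, 1/theta_0).
random.seed(0)
n, theta0, alpha = 5, 2.0, 0.05

# Simulate T under H0 and take empirical alpha/2 and 1 - alpha/2 quantiles.
draws = sorted(random.gammavariate(n, 1.0 / theta0) for _ in range(200_000))
c1 = draws[int(len(draws) * alpha / 2)]
c2 = draws[int(len(draws) * (1 - alpha / 2))]

# The mode n/theta_0 of T^n exp(-theta_0 T) sits strictly between the cutoffs.
print(c1, n / theta0, c2)
```

Exact quantiles could of course be read off a $\chi^2_{2n}$ table instead, since $2\theta_0 T\sim\chi^2_{2n}$ under the same assumption.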
Assume you observe $\mathbf Y=(Y_1,Y_2,\ldots,Y_n)$, where $Y_i\sim N(\theta x_i,1)$ independently for each $i$ and the $x_i$ are fixed constants.
The MLE of $\theta$ is then $$\hat\theta(\mathbf Y)=\frac{\sum_{i=1}^n x_i Y_i}{\sum_{i=1}^n x_i^2}$$
By the reproductive property of the normal distribution, the MLE has an exact distribution: $$\hat\theta\sim N\left(\theta,\frac{1}{\sum_{i=1}^n x_i^2}\right)$$
In other words, $$\sqrt{\sum_{i=1}^n x_i^2}\left(\hat\theta-\theta\right)\sim N(0,1)$$
Using this pivot, a $100(1-\alpha)\%$ confidence interval for $\theta$ is $$I=\left[\hat\theta-\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}},\hat\theta+\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}}\right]$$
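A minimal sketch of building $I$ for one simulated data set (the particular $x_i$, the true $\theta$, and $\alpha=0.05$ are illustrative assumptions, and $z_{0.025}\approx 1.96$):

```python
import random

# Illustrative assumptions: fixed design points x_i, true theta, alpha = 0.05.
random.seed(3)
x = [1.0, 2.0, 3.0, 4.0]
theta_true, z = 0.8, 1.959964  # z_{alpha/2} for alpha = 0.05

# Simulate Y_i ~ N(theta * x_i, 1) and compute the MLE.
y = [random.gauss(theta_true * xi, 1.0) for xi in x]
sxx = sum(xi * xi for xi in x)
theta_hat = sum(xi * yi for xi, yi in zip(x, y)) / sxx

# The 95% confidence interval I from the pivot above.
I = (theta_hat - z / sxx ** 0.5, theta_hat + z / sxx ** 0.5)
print(I)
```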
That is, $$P_{\theta}[\theta\in I]=1-\alpha\quad,\forall\,\theta$$
Or, $$P_{\theta}[\theta\in I^c]=\alpha\quad,\forall\,\theta$$
In particular, for any fixed $\theta_0$, $$P_{\theta_0}[\theta_0\in I^c]=\alpha$$
This gives the following critical region of a size $\alpha$ test for testing $H_0:\theta=\theta_0$ against $H_1:\theta\ne\theta_0$:
$$\left\{\mathbf Y:\hat\theta(\mathbf Y)<\theta_0-\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}}\quad\text{ or }\quad \hat\theta(\mathbf Y)>\theta_0+\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}}\right\}$$
Other tests can be derived of course but this gives you a test directly using the confidence interval $I$.
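One can check by simulation that this test has size $\alpha$. The sketch below (design points and $\theta_0$ are illustrative assumptions) generates data under $H_0$ repeatedly and records the rejection frequency:

```python
import random

# Illustrative assumptions: design points x_i, theta_0, alpha = 0.05.
random.seed(1)
x = [0.5, 1.0, 1.5, 2.0, 2.5]
theta0, z, trials = 1.0, 1.959964, 20_000
sxx = sum(xi * xi for xi in x)
half = z / sxx ** 0.5  # half-width of the interval around theta_hat

rejections = 0
for _ in range(trials):
    # Data generated under H0: Y_i ~ N(theta_0 * x_i, 1).
    y = [random.gauss(theta0 * xi, 1.0) for xi in x]
    theta_hat = sum(xi * yi for xi, yi in zip(x, y)) / sxx
    # Reject when theta_hat falls in the critical region above.
    if theta_hat < theta0 - half or theta_hat > theta0 + half:
        rejections += 1

print(rejections / trials)  # should be close to alpha = 0.05
```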
First, we should have $\frac{\theta_{\color{red}0}^n}{\overline X^{-n}}=\theta_{0}^n\cdot \overline X^{n}=\left(\theta_{0}\overline X\right)^n$
The next fraction simplifies similarly:
$$\Large{\frac{ e^{-\theta_0 \sum\limits_{i=1}^n X_i}}{e^{-\sum\limits_{i=1}^n {\frac{X_i}{\bar{X}}}}}}\normalsize= e^{-\theta_0 \sum\limits_{i=1}^n X_i+\sum\limits_{i=1}^n {\frac{X_i}{\bar{X}}}}$$
Let's focus on the exponent.
$\frac1n\sum\limits_{i=1}^n X_i=\overline X\Rightarrow \sum\limits_{i=1}^n X_i=n\cdot \overline X$
Therefore $\sum\limits_{i=1}^n {\frac{X_i}{\bar{X}}}=\frac1{\overline X}\cdot \sum\limits_{i=1}^n X_i=n$
Consequently
$-\theta_0 \sum\limits_{i=1}^n X_i+\sum\limits_{i=1}^n {\frac{X_i}{\bar{X}}}=-\theta_0\cdot n\cdot \overline X+n=n\left(1-\theta_0\overline X\right),$ so the whole statistic reduces to $\left(\theta_0\overline X\right)^n e^{\,n\left(1-\theta_0\overline X\right)}$.
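The algebra above is easy to sanity-check numerically: for any positive sample, $\sum_i X_i/\bar X = n$ exactly, so the two forms of the exponent agree up to floating-point rounding (the rate $\theta_0$ and sample below are arbitrary illustrative choices):

```python
import random

# Numeric sanity check of the exponent identity derived above.
random.seed(2)
theta0 = 1.7
X = [random.expovariate(theta0) for _ in range(8)]  # any positive data works
n, xbar = len(X), sum(X) / len(X)

# Left side: the original exponent; right side: the simplified form.
lhs = -theta0 * sum(X) + sum(xi / xbar for xi in X)
rhs = -theta0 * n * xbar + n

print(abs(lhs - rhs) < 1e-12)  # True
```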