Assume you observe $\mathbf Y=(Y_1,Y_2,\ldots,Y_n)$, where $Y_i\sim N(\theta x_i,1)$ independently for all $i$ and the $x_i$ are fixed.
The MLE of $\theta$ is given by $$\hat\theta(\mathbf Y)=\frac{\sum_{i=1}^n x_i Y_i}{\sum_{i=1}^n x_i^2}$$
By the reproductive property of the normal distribution, we have an exact distribution for the MLE: $$\hat\theta\sim N\left(\theta,\frac{1}{\sum_{i=1}^n x_i^2}\right)$$
In other words, $$\sqrt{\sum_{i=1}^n x_i^2}\left(\hat\theta-\theta\right)\sim N(0,1)$$
Using this pivot, a $100(1-\alpha)\%$ confidence interval for $\theta$ is $$I=\left[\hat\theta-\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}},\hat\theta+\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}}\right]$$
That is, $$P_{\theta}[\theta\in I]=1-\alpha\quad,\forall\,\theta$$
Or, $$P_{\theta}[\theta\in I^c]=\alpha\quad,\forall\,\theta$$
In particular, for any fixed $\theta_0$, $$P_{\theta_0}[\theta_0\in I^c]=\alpha$$
This gives the following critical region of a size $\alpha$ test for testing $H_0:\theta=\theta_0$ against $H_1:\theta\ne\theta_0$:
$$\left\{\mathbf Y:\hat\theta(\mathbf Y)<\theta_0-\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}}\quad\text{ or }\quad \hat\theta(\mathbf Y)>\theta_0+\frac{z_{\alpha/2}}{\sqrt{\sum_{i=1}^n x_i^2}}\right\}$$
Other tests can of course be derived, but this one follows directly from the confidence interval $I$.
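For concreteness, here is a minimal numerical sketch of the interval and the corresponding test in Python (the data-generating values, `scipy` usage, and variable names are illustrative assumptions, not part of the original answer):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative setup (these values are assumptions, not from the original).
n, theta_true, theta0, alpha = 50, 1.3, 1.0, 0.05
x = rng.uniform(0.5, 2.0, size=n)             # fixed covariates x_i
y = theta_true * x + rng.standard_normal(n)   # Y_i ~ N(theta * x_i, 1)

sxx = np.sum(x**2)
theta_hat = np.sum(x * y) / sxx               # MLE of theta
z = stats.norm.ppf(1 - alpha / 2)             # z_{alpha/2}
half_width = z / np.sqrt(sxx)
ci = (theta_hat - half_width, theta_hat + half_width)

# Size-alpha test of H0: theta = theta0 vs H1: theta != theta0:
# reject iff theta0 falls outside the confidence interval.
reject = not (ci[0] <= theta0 <= ci[1])
print(f"theta_hat = {theta_hat:.3f}, CI = ({ci[0]:.3f}, {ci[1]:.3f}), reject H0: {reject}")
```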
The given density is a Rayleigh. If a sufficient statistic exists, the test can be based on it.
It is easy to verify, via the factorization theorem, that a sufficient statistic is $T=\sum_{i} X_i^2$.
Now let's derive the density of $Y=X^2$.
Via the fundamental transformation theorem, you find
$$f_Y(y)=\frac{\sqrt{y}}{\theta}e^{-\frac{y}{2\theta}}\cdot\frac{1}{2\sqrt{y}}=\frac{1}{2\theta}e^{-\frac{y}{2\theta}}$$ that is, $Y\sim \text{Exp}\!\left(\tfrac{1}{2\theta}\right)=\text{Gamma}\!\left(1;\tfrac{1}{2\theta}\right)$ (rate parametrization).
Now
$$\sum_i X_i^2 \sim \text{Gamma}\!\left(n;\tfrac{1}{2\theta}\right)$$
Concluding,
$$\frac{1}{\theta}\sum_i X_i^2\sim \chi_{(2n)}^2$$
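A quick Monte Carlo check of this pivotal result may be reassuring. The sketch below is an illustration only: it assumes the Rayleigh parametrization $f_X(x)=\frac{x}{\theta}e^{-x^2/(2\theta)}$ used above (so `scipy`'s scale parameter is $\sqrt\theta$), and the sample sizes are chosen arbitrarily.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative values (assumptions, not from the original).
n, theta, reps = 8, 2.0, 100_000

# Rayleigh with density x/theta * exp(-x^2/(2*theta)) has scale parameter sqrt(theta).
x = stats.rayleigh.rvs(scale=np.sqrt(theta), size=(reps, n), random_state=rng)
t = np.sum(x**2, axis=1) / theta              # (1/theta) * sum X_i^2

# Compare empirical quantiles with chi-square(2n) quantiles.
probs = [0.1, 0.5, 0.9]
print(np.quantile(t, probs))                  # empirical quantiles
print(stats.chi2.ppf(probs, df=2 * n))        # theoretical chi^2_{2n} quantiles
```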
To find the critical region, first take any $\theta_1 > \theta_0$ and observe that
$$\frac{L(\theta_0|\mathbf{x})}{ L(\theta_1|\mathbf{x}) }\propto e^{(\frac{1}{2\theta_1}-\frac{1}{2\theta_0 })\sum_iX_i^2}$$
Since $\theta_0<\theta_1$, the coefficient $\frac{1}{2\theta_1}-\frac{1}{2\theta_0}$ is negative, so the LR is a decreasing function of $T=\sum_iX_i^2$.
Now you can apply Theorem 9.6 of Mood, Graybill, and Boes and define the critical region
$$C=\{\mathbf{x}:\textstyle\sum_i x_i^2>k\}$$
obtaining a size-$\alpha$ UMP test of $\mathcal{H}_0:\theta \leq \theta_0$ against $\mathcal{H}_1:\theta > \theta_0$, with $k$ determined from the chi-square distribution derived above.
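Since $T/\theta_0\sim\chi_{(2n)}^2$ when $\theta=\theta_0$, the constant $k$ solving $P_{\theta_0}(T>k)=\alpha$ is $k=\theta_0\,\chi^2_{2n,\,1-\alpha}$. The sketch below is one possible implementation; the function name and example values are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def ump_rayleigh_test(x, theta0, alpha=0.05):
    """One-sided UMP test of H0: theta <= theta0 vs H1: theta > theta0
    for a Rayleigh sample, rejecting when T = sum(x_i^2) exceeds k."""
    x = np.asarray(x)
    n = x.size
    t = np.sum(x**2)
    k = theta0 * stats.chi2.ppf(1 - alpha, df=2 * n)   # critical value
    p_value = stats.chi2.sf(t / theta0, df=2 * n)
    return t > k, t, k, p_value

# Example with simulated data (illustrative values).
rng = np.random.default_rng(2)
sample = stats.rayleigh.rvs(scale=np.sqrt(3.0), size=20, random_state=rng)
print(ump_rayleigh_test(sample, theta0=2.0))
```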
Best Answer
\begin{align} & \sum_i \left( 2x_i y_i \left(\theta_0 - \widehat{\theta\,}\right) + \left(\widehat{\theta\,}^2 - \theta_0^2\right)x_i^2 \right) \\[10pt] = {} & \underbrace{\left( -2\widehat{\theta\,} \sum_i x_i y_i + \widehat{\theta\,}^2 \sum_i x_i^2\right)}_\text{first term} {} - {}\underbrace{ \left(- 2\theta_0 \sum_i x_i y_i + \theta_0^2 \sum_i x_i^2\right)}_\text{second term} \end{align}

The first and second terms above have the form $$ -2\theta\sum_i x_i y_i + \theta^2 \sum_i x_i^2. $$ That function is a positive number multiplied by $$ -2\theta \frac{\sum_i x_i y_i}{\sum_i x_i^2} + \theta^2, \tag 1 $$ and that positive number, $\sum_i x_i^2,$ is not a function of $\theta.$ And $(1)$ is equal to $$ \left( \frac{\sum_i x_i y_i}{\sum_i x_i^2} \right)^2 - 2\theta \frac{\sum_i x_i y_i}{\sum_i x_i^2} + \theta^2 + \big(\text{something} \big) \tag 2 $$ where the term labeled "something" simply cancels out the term added at the beginning, and the important thing to notice about it is that it also does not depend on $\theta.$ Line $(2)$ is equal to $$ \left( \frac{\sum_i x_i y_i}{\sum_i x_i^2} - \theta \right)^2 + \big( \text{something} \big) $$ where "something" is the same thing it was in line $(2).$

So the "first term" minus the "second term" above equals $$ \sum_i x_i^2\left[ \left( \frac{\sum_i x_i y_i}{\sum_i x_i^2} - \widehat{\theta\,} \right)^2 - \left( \frac{\sum_i x_i y_i}{\sum_i x_i^2} - \theta_0 \right)^2 \right] \tag 3 $$ (and the thing labeled "something" has canceled out, although what really matters about it is that it does not depend on $\theta$).

Now observe that the first squared term in line $(3)$ is zero, because $\widehat{\theta\,} = \sum_i x_i y_i \big/ \sum_i x_i^2,$ and the second is $\left( \widehat{\theta\,} - \theta_0\right)^2.$ So the whole expression equals $-\sum_i x_i^2 \left( \widehat{\theta\,} - \theta_0 \right)^2.$
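A quick numerical sanity check of this reduction (arbitrary made-up data; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Arbitrary data just to check the algebra (values are illustrative).
n, theta0 = 25, 0.7
x = rng.normal(size=n)
y = rng.normal(size=n)

theta_hat = np.sum(x * y) / np.sum(x**2)      # MLE of theta

lhs = np.sum(2 * x * y * (theta0 - theta_hat) + (theta_hat**2 - theta0**2) * x**2)
rhs = -np.sum(x**2) * (theta_hat - theta0)**2
print(np.isclose(lhs, rhs))   # True: the sum reduces to -(sum x_i^2)(theta_hat - theta0)^2
```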