Before trying to find a UMP test, one should first check that one exists. To do this, one forms the likelihood ratio function
$$l(x)=f_{\theta_1}(x)/f_{\theta_0}(x)$$
If this function is monotone non-decreasing in $x$ for every $\theta_1\geq \theta_0$, the family has a monotone likelihood ratio and a UMP test exists. In the given question $\theta_1=2$, and the density function is $$f_{2}(x)=2x.$$ Similarly, for $\theta_0\in[1/2,1]$, $$f_{\theta_0}(x)=\theta_0x^{\theta_0-1}$$ Hence the likelihood ratio function is
$$l_{\theta_0}(x)=\frac{2x}{\theta_0x^{\theta_0-1}}=\frac{2}{\theta_0}x^{2-\theta_0}$$
Since this function is increasing in $x$ for all $\theta_0\in[1/2,1]$, there exists a UMP test of level $\alpha$.
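A quick numerical sanity check (a sketch, not part of the proof) that $l_{\theta_0}(x)=\frac{2}{\theta_0}x^{2-\theta_0}$ is indeed non-decreasing in $x$ for every $\theta_0\in[1/2,1]$:

```python
# Check the monotone likelihood ratio: l_{theta0}(x) = (2/theta0) * x**(2 - theta0)
# should be non-decreasing in x on (0, 1) for every theta0 in [1/2, 1].

def likelihood_ratio(x, theta0):
    return (2.0 / theta0) * x ** (2.0 - theta0)

xs = [i / 1000 for i in range(1, 1001)]              # grid on (0, 1]
for theta0 in [0.5, 0.6, 0.7, 0.8, 0.9, 1.0]:
    vals = [likelihood_ratio(x, theta0) for x in xs]
    # monotone non-decreasing: consecutive differences are >= 0
    assert all(b >= a for a, b in zip(vals, vals[1:])), theta0
print("likelihood ratio is non-decreasing for all tested theta0")
```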
By definition, the size of the test is the supremum over $\theta_0$ of the false alarm probability, i.e. the expected value of the decision rule (a likelihood ratio test with some threshold $\lambda$) under the null. The threshold is chosen so that this supremum equals $\alpha$:
$$\alpha=\sup_{\theta_0}\int_{\{x:l_{\theta_0}(x)>\lambda\}}f_{\theta_0}(x)\mathrm{d}x=\sup_{\theta_0}\int_{\{x:l_{\theta_0}(x)>\lambda\}}\theta_0x^{\theta_0-1}\mathrm{d}x$$
Now, we have a nice simplification (Why?) $${\{x:l_{\theta_0}(x)>\lambda\}}\equiv {\{x:x>\lambda^{'}\}}$$
Hence
$$\alpha=\sup_{\theta_0}\int_{\{x:l_{\theta_0}(x)>\lambda\}}\theta_0x^{\theta_0-1}\mathrm{d}x=\sup_{\theta_0}\int_{\lambda^{'}}^1\theta_0x^{\theta_0-1}\mathrm{d}x=\sup_{\theta_0}1-{\lambda^{'}}^{\theta_0}=0.05$$
It is known that $\lambda^{'}\in[0,1]$ and $\theta_0\in[1/2,1]$. Now, which value of $\theta_0$ maximizes $1-{\lambda^{'}}^{\theta_0}$, or equivalently minimizes ${\lambda^{'}}^{\theta_0}$? Since $\lambda^{'}\in[0,1]$, ${\lambda^{'}}^{\theta_0}$ is non-increasing in $\theta_0$, so the supremum is attained at $\theta_0=1$, giving $1-\lambda^{'}=0.05$, i.e. $\lambda^{'}=0.95$.
The UMP test is then $$\phi(x)=\begin{cases}1,\quad x>\lambda^{'}\\0,\quad x\leq \lambda^{'}\end{cases}$$
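As a check, take $\lambda^{'}=0.95$ (the value solving $1-{\lambda^{'}}^{\theta_0}=0.05$ when the supremum over $\theta_0\in[1/2,1]$ is attained at $\theta_0=1$) and compute the worst-case false alarm probability over a grid of $\theta_0$:

```python
# Size check for the test phi(x) = 1{x > lam}, with lam = 0.95 chosen so that
# the supremum of 1 - lam**theta0 over theta0 in [1/2, 1] equals 0.05.
lam = 0.95
thetas = [i / 200 for i in range(100, 201)]        # grid over [1/2, 1]
false_alarm = {t: 1 - lam ** t for t in thetas}    # P_{theta0}(X > lam) = 1 - lam**theta0
size = max(false_alarm.values())
worst = max(false_alarm, key=false_alarm.get)
print(f"size = {size:.4f}, attained at theta0 = {worst}")
assert abs(size - 0.05) < 1e-12 and abs(worst - 1.0) < 1e-12
```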
Joint density of the sample $(X_1,X_2,\ldots,X_n)$ is
$$f_{\theta}(x_1,\ldots,x_n)=\exp\left(-\sum_{i=1}^n(x_i-\theta)\right)\mathbf1_{x_{(1)}>\theta}\quad,\,\theta>0$$
By N-P lemma, a most powerful test of size $\alpha$ for testing $H_0:\theta=\theta_0$ against $H_1:\theta=\theta_1(>\theta_0)$ is given by $$\varphi(x_1,\ldots,x_n)=\begin{cases}1&,\text{ if }\lambda(x_1,\ldots,x_n)>k\\0&,\text{ if }\lambda(x_1,\ldots,x_n)<k\end{cases}$$
where $$\lambda(x_1,\ldots,x_n)=\frac{f_{\theta_1}(x_1,\ldots,x_n)}{f_{\theta_0}(x_1,\ldots,x_n)}$$
and $k(>0)$ is such that $$E_{\theta_0}\varphi(X_1,\ldots,X_n)=\alpha$$
Now,
\begin{align}
\lambda(x_1,\ldots,x_n)&=\frac{\exp\left(-\sum_{i=1}^n(x_i-\theta_1)\right)\mathbf1_{x_{(1)}>\theta_1}}{\exp\left(-\sum_{i=1}^n(x_i-\theta_0)\right)\mathbf1_{x_{(1)}>\theta_0}}
\\\\&=e^{n(\theta_1-\theta_0)}\frac{\mathbf1_{x_{(1)}>\theta_1}}{\mathbf1_{x_{(1)}>\theta_0}}
\\\\&=\begin{cases}e^{n(\theta_1-\theta_0)}&,\text{ if }x_{(1)}>\theta_1\\0&,\text{ if }\theta_0<x_{(1)}\le \theta_1\end{cases}
\end{align}
So $\lambda(x_1,\ldots,x_n)$ is a monotone non-decreasing function of $x_{(1)}$, which means
$$\lambda(x_1,\ldots,x_n)\gtrless k \iff x_{(1)}\gtrless c$$, for some $c$ such that $$E_{\theta_0}\varphi(X_1,\ldots,X_n)=\alpha$$
We thus have
$$\varphi(x_1,\ldots,x_n)=\begin{cases}1&,\text{ if }x_{(1)}>c\\0&,\text{ if }x_{(1)}<c\end{cases}$$
Again,
\begin{align}
E_{\theta_0}\varphi(X_1,\ldots,X_n)&=P_{\theta_0}(X_{(1)}>c)
\\&=\left(P_{\theta_0}(X_1>c)\right)^n
\\&=e^{n(\theta_0-c)}\quad,\,c>\theta_0
\end{align}
So from the size condition we get $$c=\theta_0-\frac{\ln\alpha}{n}$$
Finally, the test function is
$$\varphi(x_1,\ldots,x_n)=\begin{cases}1&,\text{ if }x_{(1)}>\theta_0-\frac{\ln\alpha}{n}\\0&,\text{ if }x_{(1)}<\theta_0-\frac{\ln\alpha}{n}\end{cases}$$
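A Monte Carlo sketch confirming that this test has size $\alpha$; the values $\theta_0=1$, $n=5$, $\alpha=0.05$ are arbitrary illustrative choices, not from the question:

```python
import random, math

# Under H0 each X_i = theta0 + Exp(1), matching the density exp(-(x - theta0)), x > theta0.
# The test rejects when the sample minimum exceeds c = theta0 - ln(alpha)/n.
random.seed(0)
theta0, n, alpha = 1.0, 5, 0.05                    # illustrative values
c = theta0 - math.log(alpha) / n                   # critical value from the size condition
trials, rejections = 200_000, 0
for _ in range(trials):
    sample_min = min(theta0 + random.expovariate(1.0) for _ in range(n))
    if sample_min > c:
        rejections += 1
print(f"empirical size = {rejections / trials:.4f} (target {alpha})")
assert abs(rejections / trials - alpha) < 0.005
```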
We have the distribution of a single observation $X$ :
\begin{align} f_{\theta}(x)&=\frac{1}{\theta}\mathbf1_{x\in\{1,2,\ldots,\theta\}}\quad,\,\theta\in\{20,40\} \end{align}
By NP lemma, an MP test of level $\alpha$ for testing $H_0:\theta=40$ against $H_1:\theta=20$ is of the form
\begin{align} \varphi(x)&=\begin{cases}1&,\text{ if }\lambda(x)>k\\\gamma&,\text{ if }\lambda(x)=k\\0&,\text{ if }\lambda(x)<k\end{cases} \end{align}
where $$\lambda(x)=\frac{f_{H_1}(x)}{f_{H_0}(x)}$$
and $\gamma\in[0,1]$ and $k(> 0)$ are so chosen that $$E_{H_0}\,\varphi(X)\leqslant 0.1$$
Now,
\begin{align} \lambda(x)&=2\frac{\mathbf1_{x\in\{1,2,\ldots,20\}}}{\mathbf1_{x\in\{1,2,\ldots,40\}}} \\\\&=\begin{cases}2&,\text{ if }x=1,2,\ldots,20 \\0&,\text{ if }x=21,22,\ldots,40 \end{cases} \end{align}
Therefore, for some $c$, $$\lambda(x)\gtrless k\implies x\lessgtr c$$
And the level restriction gives $$P_{H_0}(X<c)+\gamma P_{H_0}(X=c)\leqslant 0.1\tag{1}$$
Taking different values of $c$ (namely $c=2,3,4,5$) and checking the corresponding tail probability $P_{H_0}(X<c)$ against $(1)$, I end up with $$c=4\quad,\quad \gamma=1$$ since $P_{H_0}(X<4)+P_{H_0}(X=4)=\frac{3}{40}+\frac{1}{40}=0.1$.
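A small exact-arithmetic check of the level constraint for $c=4$, $\gamma=1$, along with the resulting power under $H_1$:

```python
from fractions import Fraction

# Verify the level condition (1) for c = 4, gamma = 1 under
# H0: X ~ Uniform{1,...,40}, and compute the power under H1: X ~ Uniform{1,...,20}.
p0 = Fraction(1, 40)                       # H0 pmf at each point
p1 = Fraction(1, 20)                       # H1 pmf at each point (x = 1..20)
c, gamma = 4, Fraction(1)
size = (c - 1) * p0 + gamma * p0           # P_{H0}(X < 4) + gamma * P_{H0}(X = 4)
power = (c - 1) * p1 + gamma * p1          # same rejection region under H1
print(f"size = {size}, power = {power}")
assert size == Fraction(1, 10)             # exactly the level 0.1
assert power == Fraction(1, 5)
```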
This is UMP because $\varphi$ obviously does not depend on the value of $\theta$ under $H_1$.