You have that $\overline{X}$ is complete and sufficient, and moreover $E[ \overline{X} ] = 1/\theta$; i.e., $\overline{X}$ is the UMVUE for $1/\theta$. It seems reasonable to guess that $1/\overline{X}$ may be the UMVUE for $\theta$. Note that $\sum_{i=1}^n X_i \sim \Gamma(n,\theta)$ (shape $n$, rate $\theta$), since the $X_i$ are i.i.d. Exponential with rate $\theta$. Let $Z = \sum_{i=1}^n X_i \sim \Gamma(n,\theta)$, so that $1/\overline{X} = n/Z$.
\begin{align*}
E[1/\overline{X}] = n E[1/Z] &= n \int_0^\infty \dfrac{1}{z} \dfrac{\theta^n}{\Gamma(n)} z^{n-1} e^{- \theta z} \; dz \\
&= n \int_0^\infty \dfrac{\theta^n}{\Gamma(n)} z^{n-2} e^{-\theta z } \; dz \\
&= n \theta \dfrac{\Gamma(n-1)}{\Gamma(n)} \underbrace{\int_0^\infty \dfrac{\theta^{n-1}}{\Gamma(n-1)} z^{n-2} e^{-\theta z } \; dz}_{=1} \\
&= \dfrac{n \theta \Gamma(n-1)}{\Gamma(n)} = \dfrac{n \theta}{n-1}
\end{align*}
So, for $n \geq 2$, $\dfrac{n-1}{n} \cdot \dfrac{1}{\overline{X}} = \dfrac{n-1}{\sum_{i=1}^n X_i}$ is unbiased for $\theta$, and since it is a function of the complete sufficient statistic $\sum_{i=1}^n X_i$, it is the UMVUE for $\theta$ by Lehmann–Scheffé.
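As a quick numerical sanity check (not part of the argument), here is a minimal Monte Carlo sketch in Python; the values of $\theta$, $n$, and the number of replications are arbitrary choices, and NumPy's `exponential` is parametrized by the mean $1/\theta$.

```python
import numpy as np

# Monte Carlo sketch: compare the naive estimator 1/Xbar with the
# bias-corrected estimator (n-1)/sum(X_i) for Exponential(rate = theta) samples.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000

x = rng.exponential(scale=1 / theta, size=(reps, n))  # scale = mean = 1/theta
sums = x.sum(axis=1)

print("E[1/Xbar]        ~", (n / sums).mean())        # ~ n*theta/(n-1) = 2.5, biased
print("E[(n-1)/sum X_i] ~", ((n - 1) / sums).mean())  # ~ theta = 2.0, unbiased
```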
The joint density of $X_1,X_2,\ldots,X_n$ is
\begin{align}
f_{\theta}(x_1,x_2,\ldots,x_n)&=\prod_{i=1}^n f(x_i\mid\theta)
\\&=\left(\frac{2}{\theta}\right)^n \left(\prod_{i=1}^n x_i\right) \exp\left(-\frac{1}{\theta}\sum_{i=1}^n x_i^2\right)\mathbf1_{x_1,\ldots,x_n>0}\quad,\,\theta>0
\end{align}
This pdf is a member of the one-parameter exponential family.
So it follows that a complete sufficient statistic for $\theta$ is indeed
$$U(X_1,X_2,\ldots,X_n)=\sum_{i=1}^n X_i^2$$
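Explicitly, writing the joint pdf in the standard one-parameter exponential family form $h(x)\,c(\theta)\exp\{\eta(\theta)\,T(x)\}$ (the symbols $h,c,\eta,T$ are just the usual exponential-family notation, not part of the original problem):
$$f_{\theta}(x_1,\ldots,x_n)=\underbrace{2^n\left(\prod_{i=1}^n x_i\right)\mathbf 1_{x_1,\ldots,x_n>0}}_{h(x)}\;\underbrace{\theta^{-n}}_{c(\theta)}\,\exp\Big(\underbrace{-\tfrac{1}{\theta}}_{\eta(\theta)}\underbrace{\sum_{i=1}^n x_i^2}_{T(x)}\Big).$$
As $\theta$ ranges over $(0,\infty)$, the natural parameter $\eta(\theta)=-1/\theta$ ranges over the open interval $(-\infty,0)$, so the natural parameter space contains an open set and the completeness of $U=T(X)$ follows.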
Yes, it is true that the UMVUE of $\theta$, if it exists, is given by $E(T\mid U)$, where $T$ is any unbiased estimator of $\theta$ and $U$ is complete and sufficient. This is what the Lehmann–Scheffé theorem says. As a corollary, it also says that any unbiased estimator of $\theta$ based on a complete sufficient statistic has to be the UMVUE of $\theta$. Here this corollary comes in handy.
To make sense of the hint given, find the distribution of $Y=X^2$ where $X$ has the Rayleigh pdf you are given.
Via change of variables, the pdf of $Y$ is
\begin{align}
f_Y(y)&=f(\sqrt y\mid\theta)\left|\frac{dx}{dy}\right|\mathbf1_{y>0}
\\&=\frac{2\sqrt y}{\theta}e^{-y/\theta}\cdot\frac{1}{2\sqrt y}\,\mathbf1_{y>0}
\\&=\frac{1}{\theta}e^{-y/\theta}\mathbf1_{y>0}\quad,\,\theta>0
\end{align}
In other words, the $X_i^2$, $i=1,\ldots,n$, are i.i.d. Exponential with mean $\theta$.
Equivalently, $$\frac{2}{\theta}X_i^2\stackrel{\text{i.i.d.}}{\sim}\text{Exponential with mean } 2 \equiv \chi^2_2,$$
which implies $$\frac{2}{\theta}\sum_{i=1}^n X_i^2=\frac{2U}{\theta} \sim \chi^2_{2n}.$$
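As an aside, before using this fact, here is a minimal simulation sketch of the claim $2U/\theta\sim\chi^2_{2n}$ in Python (the values of $\theta$, $n$, and the number of replications are arbitrary; the Rayleigh draws are generated through the inverse CDF $F^{-1}(u)=\sqrt{-\theta\log(1-u)}$, and SciPy's `kstest` is used for the comparison).

```python
import numpy as np
from scipy import stats

# Simulation sketch: draw Rayleigh(theta) samples via the inverse CDF,
# form 2U/theta = (2/theta) * sum(X_i^2), and compare with chi^2_{2n}.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 8, 50_000

u = rng.uniform(size=(reps, n))
x = np.sqrt(-theta * np.log1p(-u))           # F(x) = 1 - exp(-x^2/theta), x > 0
stat = (2.0 / theta) * (x ** 2).sum(axis=1)  # one value of 2U/theta per replication

# A large p-value is consistent with 2U/theta ~ chi^2 with 2n degrees of freedom.
print(stats.kstest(stat, "chi2", args=(2 * n,)))
```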
Taking expectations,
\begin{align}
E_{\theta}\left(\frac{2U}{\theta}\right)=2n\implies E_{\theta}\left(\frac{U}{n}\right)=\theta
\end{align}
Hence the UMVUE of $\theta$ is $$\boxed{\frac{U}{n}=\frac{1}{n}\sum_{i=1}^n X_i^2}$$
However, we did not actually need the distribution of the $X_i^2$, since it is easy to show directly that $$E_{\theta}(U)=\sum_{i=1}^n \underbrace{E_{\theta}(X_i^2)}_{=\,\theta}=n\theta.$$
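A minimal numerical sketch of that direct computation (the value $\theta=3$ is an arbitrary choice; SciPy's `quad` does the integration):

```python
import numpy as np
from scipy.integrate import quad

# Check E_theta(X^2) = theta for the Rayleigh pdf f(x|theta) = (2x/theta) exp(-x^2/theta), x > 0.
theta = 3.0

def integrand(x):
    return x ** 2 * (2 * x / theta) * np.exp(-x ** 2 / theta)

second_moment, _ = quad(integrand, 0, np.inf)
print(second_moment)  # ~ 3.0 = theta
```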
Best Answer
I follow Shao's approach with slight modifications and more details. We have $Y\sim e^{-(x-a)}\mathbf{1}_{(a,\infty)}(x)\,dx$, and $X=(X_1,\ldots,X_n)$ is an i.i.d. random sample from the law of $Y$. The minimum $X_{(1)}$ is a complete and sufficient statistic for $a$, so the UMVUE of $P(Y\leq c)$, for $c$ fixed, will have the form $f(X_{(1)})$ with
$$E[f(X_{(1)})]=(1-e^{-(c-a)})\mathbf{1}_{(a,\infty)}(c)\quad\text{for every }a.$$
We can find, for $t\geq a$,
$$P(X_{(1)}\geq t)=P\big(\cap_{k\leq n}\{X_k\geq t\}\big)=e^{-n(t-a)},$$
and $P(X_{(1)}\geq t)=1$ otherwise, so $X_{(1)}\sim ne^{-n(x-a)}\mathbf{1}_{(a,\infty)}(x)\,dx$. So for $c> a$,
$$E[f(X_{(1)})]=n\int_{(a,\infty)}f(x)e^{-n(x-a)}\,dx=1-e^{-(c-a)}.$$
We solve this integral equation by differentiating with respect to $a$:
$$-nf(a)+n\cdot \underbrace{n\int_{(a,\infty)}f(x)e^{-n(x-a)}\,dx}_{=1-e^{-(c-a)}}=-e^{-(c-a)},$$
so $f(a)=1-e^{-(c-a)}+n^{-1}e^{-(c-a)}=1-e^{-(c-a)}(1-n^{-1})$, and therefore
$$ f(x)=\begin{cases} 1-e^{-(c-x)}(1-n^{-1})&c> x\\ 0&c\leq x \end{cases}$$
gives the UMVUE.

Indeed, we can check the calculations again. For $c>a$:
$$\begin{aligned}E[f(X_{(1)})]&=n\int_{(a,\infty)}f(x)e^{-n(x-a)}\,dx\\ &=\big(1-e^{-n(c-a)}\big)-(n-1)e^{na-c}\int_{(a,c)}e^{-x(n-1)}\,dx\\ &=1-e^{-(c-a)}\\ &=P(Y\leq c), \end{aligned}$$
while for $c\leq a$: $E[f(X_{(1)})]=0=P(Y\leq c)$. So we conclude.
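A minimal Monte Carlo sketch of this result in Python (the values of $a$, $c$, $n$, and the number of replications are arbitrary choices):

```python
import numpy as np

# Check that the mean of f(X_(1)) matches P(Y <= c) = 1 - exp(-(c-a))
# when X_1,...,X_n are i.i.d. with density exp(-(x-a)) on (a, infinity).
rng = np.random.default_rng(2)
a, c, n, reps = 1.0, 2.5, 4, 200_000

x = a + rng.exponential(scale=1.0, size=(reps, n))  # shifted standard exponentials
m = x.min(axis=1)                                   # X_(1) for each replication

# UMVUE f applied to each minimum
f = np.where(m < c, 1 - np.exp(-(c - m)) * (1 - 1 / n), 0.0)

print("mean of f(X_(1)) ~", f.mean())
print("P(Y <= c)        =", 1 - np.exp(-(c - a)))   # target value
```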