Let's settle this. For a Normal distribution with zero mean and variance $\sigma^2$, the complete sufficient statistic is indeed given by the sum of squares.
By the Rao–Blackwell theorem, if we can find an unbiased estimator that is a function of this sufficient statistic, then we have an MVUE. Furthermore, since the statistic is complete, by the Lehmann–Scheffé theorem it is also the unique MVUE for $\sigma^2$. So let's find an unbiased function of $\displaystyle{\sum_{i=1}^n X_i ^2}$ for $X\sim N\left (0,\sigma^2 \right)$.
Recall that for any distribution for which these moments exist
$$E(X^2)=\sigma^2+\mu^2$$
and since $\mu = 0$ here, it follows that for this random sample
$$E \left( \sum_{i=1}^n X_i^2 \right)=n\sigma^2$$
from which it immediately follows that the unbiased estimator in question is
$$\widehat{\sigma^2}=\frac{1}{n} \sum_{i=1}^n X_i^2$$
Now, let's find the variance of this estimator. We know that for $X\sim N\left (0,\sigma^2 \right)$,
$$\frac{\sum_{i=1}^n X_i^2}{\sigma^2}\sim \chi^2(n)$$
Be very mindful of the degrees of freedom of the $\chi^2$ distribution: if we did not know the mean and used the estimate $\bar{X}$ in its place, we would lose one degree of freedom and get $\chi^2(n-1)$ instead. By the properties of the $\chi^2$ distribution, then,
$$var\left(\frac{\sum_{i=1}^n X_i^2}{\sigma^2} \right)=2n$$
and so $var\left( \sum_{i=1}^n X_i^2 \right)=2n\sigma^4$. Thus for our estimator, $var \left( \frac{1}{n} \sum_{i=1}^n X_i^2\right)=\frac{1}{n^2}\cdot 2n\sigma^4=\frac{2\sigma^4}{n}$, which equals the Cramér–Rao bound. This should be comforting, right?
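If you want to see this numerically, here is a quick Monte Carlo sketch (the values of $\sigma^2$, $n$, and the number of replications are arbitrary choices of mine) checking both the unbiasedness and the variance claim:

```python
# Monte Carlo sanity check: (1/n) * sum(X_i^2) is unbiased for sigma^2
# and its variance matches the Cramer-Rao bound 2*sigma^4/n.
import numpy as np

rng = np.random.default_rng(0)
sigma2, n, reps = 2.0, 10, 200_000  # arbitrary choices for the sketch

# Draw `reps` independent samples of size n from N(0, sigma2).
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

# Compute the estimator (1/n) * sum(X_i^2) for each sample.
est = (x**2).mean(axis=1)

print(est.mean())          # ~ 2.0: unbiased for sigma^2
print(est.var())           # ~ 0.8: simulated variance of the estimator
print(2 * sigma2**2 / n)   # = 0.8: the Cramer-Rao bound
```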
As a final remark, I would like to point out that the Cramér–Rao bound is only attainable when the mean of the normal distribution is known, as it is here. If that had not been the case, we would have had to settle for an estimator that does not achieve the variance lower bound.
Hope this clears it up a bit.
The Poisson distribution is a one-parameter exponential family distribution, with natural sufficient statistic given by the sample total $T(\mathbf{x}) = \sum_{i=1}^n x_i$. The canonical form is:
$$p(\mathbf{x}|\theta) = \exp \Big( \ln (\theta) T(\mathbf{x}) - n\theta \Big) \cdot h(\mathbf{x}) \quad \quad \quad h(\mathbf{x}) = \prod_{i=1}^n \frac{1}{x_i!} $$
From this form it is easy to establish that $T$ is a complete sufficient statistic for the parameter $\theta$. So the Lehmann–Scheffé theorem says that for any $g(\theta)$ there is at most one unbiased estimator of this quantity that is a function of $T$, and it is the UMVUE of $g(\theta)$. One way to find this estimator (the method you are using) is via the Rao–Blackwell theorem: start with an arbitrary unbiased estimator of $g(\theta)$ and then condition on the complete sufficient statistic to get the unique unbiased estimator that is a function of $T$.
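As an aside, a quick numerical sketch (with an arbitrarily chosen $\theta$ and sample) can confirm that the canonical form above reproduces the joint Poisson pmf:

```python
# Check that exp(ln(theta)*T - n*theta) * h(x) equals the joint Poisson pmf.
import numpy as np
from scipy.stats import poisson
from math import factorial, exp, log, prod

theta = 1.7                      # arbitrary parameter value
x = np.array([0, 2, 1, 3, 1])    # arbitrary sample
n = len(x)

# Reference: product of the individual Poisson pmfs.
joint = np.prod(poisson.pmf(x, theta))

# Canonical exponential-family form from the answer.
T = x.sum()
h = prod(1 / factorial(int(k)) for k in x)
canonical = exp(log(theta) * T - n * theta) * h

print(joint, canonical)  # the two agree up to floating-point error
```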
Using Rao-Blackwell to find the UMVUE: In your case you want to find the UMVUE of:
$$g(\theta) \equiv \theta \exp (-\theta).$$
Using the initial estimator $\hat{g}_*(\mathbf{X}) \equiv \mathbb{I}(X_1=1)$ you can confirm that,
$$\mathbb{E}(\hat{g}_*(\mathbf{X})) = \mathbb{E}(\mathbb{I}(X_1=1)) = \mathbb{P}(X_1=1) = \theta \exp(-\theta) = g(\theta),$$
so this is indeed an unbiased estimator. Hence, the unique UMVUE obtained from the Rao-Blackwell technique is:
$$\begin{equation} \begin{aligned}
\hat{g}(\mathbf{X})
&\equiv \mathbb{E}(\mathbb{I}(X_1=1) | T(\mathbf{X}) = t) \\[6pt]
&= \mathbb{P}(X_1=1 | T(\mathbf{X}) = t) \\[6pt]
&= \mathbb{P} \Big( X_1=1 \Big| \sum_{i=1}^n X_i = t \Big) \\[6pt]
&= \frac{\mathbb{P} \Big( X_1=1 \Big) \mathbb{P} \Big( \sum_{i=2}^n X_i = t-1 \Big)}{\mathbb{P} \Big( \sum_{i=1}^n X_i = t \Big)} \\[6pt]
&= \frac{\text{Pois}(1| \theta) \cdot \text{Pois}(t-1| (n-1)\theta)}{\text{Pois}(t| n\theta)} \\[6pt]
&= \frac{t!}{(t-1)!} \cdot \frac{ \theta \exp(-\theta) \cdot ((n-1) \theta)^{t-1} \exp(-(n-1)\theta)}{(n \theta)^t \exp(-n\theta)} \\[6pt]
&= t \cdot \frac{ (n-1)^{t-1}}{n^t} \\[6pt]
&= \frac{t}{n} \Big( 1- \frac{1}{n} \Big)^{t-1} \\[6pt]
\end{aligned} \end{equation}$$
Your answer has a slight error where you have conflated the sample mean and the sample total, but most of your working is correct. As $n \rightarrow \infty$ we have $(1-\tfrac{1}{n})^n \rightarrow \exp(-1)$ and $t/n \rightarrow \theta$, so taking these asymptotic results together we can also confirm consistency of the estimator:
$$\hat{g}(\mathbf{X}) = \frac{t}{n} \Big[ \Big( 1- \frac{1}{n} \Big)^n \Big] ^{\frac{t}{n} - \frac{1}{n}} \rightarrow \theta [ \exp (-1) ]^\theta = \theta \exp (-\theta) = g(\theta).$$
This latter demonstration is heuristic, but it gives a nice check on the working. It is interesting here that you get an estimator that is a finite approximation to the exponential function of interest.
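If you want an empirical check as well, here is a quick Monte Carlo sketch (the values of $\theta$, $n$, and the number of replications are arbitrary choices of mine) showing that the estimator derived above is unbiased for $g(\theta)$:

```python
# Monte Carlo check that g_hat(T) = (T/n) * (1 - 1/n)^(T-1)
# is unbiased for g(theta) = theta * exp(-theta).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 1.3, 8, 500_000  # arbitrary choices for the sketch

# T = sum of n iid Poisson(theta) draws, i.e. T ~ Pois(n*theta).
t = rng.poisson(theta, size=(reps, n)).sum(axis=1)
g_hat = (t / n) * (1 - 1 / n) ** (t - 1)

print(g_hat.mean())            # ~ 0.3543
print(theta * np.exp(-theta))  # = 0.3543... = g(theta)
```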
I think I solved my own question. Comments about this answer and new answers are welcome.
If $x_1,\ldots,x_n$ are observations from a $N(\mu,\sigma^2)$ population and $\mu$ is unknown, then $$f(x_1,\ldots,x_n|\mu,\sigma^2)=\left(\frac{1}{\sqrt{2\pi\sigma^2}}\right)^ne^{-\frac{n\mu^2}{2\sigma^2}}e^{\frac{\mu}{\sigma^2}\sum_{i=1}^nx_i-\frac{1}{2\sigma^2}\sum_{i=1}^n x_i^2}$$ (this shows that the normal family is an exponential family). As the image of the map $$(\mu,\sigma^2)\in \mathbb{R}\times\mathbb{R}^+\mapsto \left(\frac{\mu}{\sigma^2},-\frac{1}{2\sigma^2}\right)$$ contains an open set of $\mathbb{R}^2$, by a theorem (for instance, see page 6 here), the statistic $U=\left(\sum_{i=1}^n X_i,\sum_{i=1}^n X_i^2\right)$ is sufficient and complete for $(\mu,\sigma^2)$. As $T$ is a function of $U$ and is unbiased for $\sigma^2$, by Lehmann–Scheffé $T$ is UMVUE for $\sigma^2$.
Now, if $\mu=\mu_0$ is known, $\mu$ no longer belongs to the parameter space, so the "new" density function is $$f(x_1,\ldots,x_n|\sigma^2)=\left(\frac{1}{\sqrt{2\pi\sigma^2}}\right)^ne^{-\frac{1}{2\sigma^2}\sum_{i=1}^n(x_i-\mu_0)^2}$$ (we have a new exponential family). As the image of the map $$\sigma^2\in\mathbb{R}^+\mapsto -\frac{1}{2\sigma^2}$$ contains an open subset of $\mathbb{R}$, our statistic $W$ is sufficient and complete for $\sigma^2$. Since it is in addition unbiased, $W$ is UMVUE for $\sigma^2$ by Lehmann–Scheffé.
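For completeness, here is a small simulation sketch contrasting the two cases. I am assuming, as in the question, that $T=\frac{1}{n-1}\sum_{i=1}^n(X_i-\bar{X})^2$ and $W=\frac{1}{n}\sum_{i=1}^n(X_i-\mu_0)^2$; both come out unbiased, but only $W$ (the known-mean case) attains the Cramér–Rao bound $2\sigma^4/n$:

```python
# Compare the two UMVUEs: T (mean unknown) vs. W (mean known).
import numpy as np

rng = np.random.default_rng(0)
mu0, sigma2, n, reps = 1.0, 2.0, 10, 200_000  # arbitrary choices

x = rng.normal(mu0, np.sqrt(sigma2), size=(reps, n))
T = x.var(axis=1, ddof=1)           # sample variance, divides by n - 1
W = ((x - mu0) ** 2).mean(axis=1)   # uses the known mean mu0

print(T.mean(), W.mean())  # both ~ 2.0: unbiased for sigma^2
print(T.var())             # ~ 2*sigma^4/(n-1) ≈ 0.889, above the bound
print(W.var())             # ~ 2*sigma^4/n     =  0.8, attains the bound
```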