I think you complicated the computation when you went through the expected value. You already have a complete sufficient statistic (you could have avoided the whole check of minimal sufficiency and completeness simply by using the fact that the Poisson distribution belongs to a full-rank one-parameter exponential family, so $\sum_{i=1}^n X_i $ is such a statistic).
Then, if you apply the Lehmann-Scheffé theorem properly, it allows you to choose between two different but equivalent methods to find an UMVUE:
- Using Rao-Blackwell; ingredients: an unbiased estimator $U$ (even a trivial one) of your parameter $g(\theta)$, and a complete sufficient statistic $T$ for $\theta$. Then the UMVUE is $W=\mathrm E[U\mid T]$;
- Direct application of Lehmann-Scheffé; ingredients: a statistic $W$ that is both unbiased for $g(\theta)$ and a function of $T$, a complete sufficient statistic for $\theta$.
I'd suggest using this second method to find an UMVUE for $\theta$: you already have $T$, so the first thing to do is check whether it is unbiased for $\theta$. In this case, using the well-known fact that a sum of independent Poisson random variables is again Poisson, $T\sim\mathrm{Poisson}(n\theta)$ and hence $\mathrm E[T]=n\theta$. So you can correct your estimator by exploiting the fact that the expected value is a linear functional. Hence $W=T/n$ is your UMVUE; indeed, checking its properties: it is a function of $T$, and it is unbiased for $\theta$, since
$\mathrm E[W]=\mathrm E[T/n]=\tfrac{1}{n}\mathrm E[T]=\theta$.
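As a quick sanity check, here is a Monte Carlo sketch (the values of $\theta$, $n$, and the replication count are illustrative choices, not from the original problem) showing that the sample mean $T/n$ of i.i.d. Poisson draws averages out to $\theta$:

```python
import math
import random

# Monte Carlo sketch (theta, n, reps are illustrative choices): for X_i
# i.i.d. Poisson(theta), T = X_1 + ... + X_n has E[T] = n*theta, so the
# estimator W = T/n is unbiased for theta.

def poisson_draw(lam, rng):
    # Knuth's multiplication method; adequate for small lam.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
theta, n, reps = 2.0, 5, 20000
W_mean = sum(
    sum(poisson_draw(theta, rng) for _ in range(n)) / n for _ in range(reps)
) / reps
print(round(W_mean, 2))  # close to theta = 2.0
```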
The idea is that you can start with any estimator of $(1-\theta)^2$, no matter how awful, provided it is unbiased. The Rao-Blackwell process will almost magically turn it into a uniformly minimum-variance unbiased estimator (UMVUE).
There are many ways to proceed. One fruitful idea is systematically to remove the complications in the expression "$(1-\theta)^2$". This leads to a sequence of questions:
1. How to find an unbiased estimator of $(1-\theta)^2$?
2. How to find an unbiased estimator of $1-\theta$?
3. How to find an unbiased estimator of $\theta$?
The answer to (3), at least, should be obvious: any of the $X_i$ will be an unbiased estimator because
$$\mathbb{E}_\theta(X_i) = \theta.$$
(It doesn't matter how you come up with this estimator: by guessing and checking (which often works), Maximum Likelihood, or whatever. ML, incidentally, often is not helpful because it tends to produce biased estimators. What is helpful is the extreme simplicity of working with a single observation rather than a whole bunch of them.)
Linearity of expectation tells us an answer to (2) would be any of the $1 - X_i$, because
$$\mathbb{E}_\theta(1-X_i) = 1 - \mathbb{E}_\theta(X_i) = 1-\theta.$$
Getting from this to an answer to (1) is the crux of the matter. At some point you will need to exploit the fact that you have more than one independent realization of this Bernoulli variable, because it quickly becomes obvious that a single $0$-$1$ observation just can't tell you much. For instance, the square of $1-X_1$ won't work: since $(1-X_1)^2 = 1-X_1$ for a $0$-$1$ variable,
$$\mathbb{E}_\theta((1-X_1)^2) = 1-\theta.$$
What could be done with two of the observations, such as $X_1$ and $X_2$? A little thought might eventually suggest considering their product. Sure enough, because $X_1$ and $X_2$ are independent, their expectations multiply:
$$\mathbb{E}_\theta((1-X_1)(1-X_2)) = \mathbb{E}_\theta((1-X_1))\mathbb{E}_\theta((1-X_2)) = (1-\theta)(1-\theta)=(1-\theta)^2.$$
You're now good to go: apply the Rao-Blackwell process to the unbiased estimator $T=(1-X_1)(1-X_2)$ to obtain an UMVUE. (That is, find its expectation conditional on $\sum X_i$.) I'll stop here so that you can have the fun of discovering the answer for yourself: it's marvelous to see what kinds of formulas can emerge from this process.
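The multiplication step can be checked empirically. Below is a Monte Carlo sketch (the value of $\theta$ and the replication count are illustrative choices) confirming that $(1-X_1)(1-X_2)$ averages out to $(1-\theta)^2$:

```python
import random

# Monte Carlo sketch (theta and reps are illustrative choices): for
# independent Bernoulli(theta) draws X1, X2, the product (1-X1)(1-X2)
# is unbiased for (1 - theta)^2 because independence lets the
# expectations multiply.
rng = random.Random(1)
theta, reps = 0.3, 100000

total = 0
for _ in range(reps):
    x1 = 1 if rng.random() < theta else 0
    x2 = 1 if rng.random() < theta else 0
    total += (1 - x1) * (1 - x2)

print(round(total / reps, 2))  # close to (1 - 0.3)**2 = 0.49
```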
To illustrate the calculation let's take the simpler case of three, rather than four, $X_i$. The sum $S=X_1+X_2+X_3$ counts how many of the $X_i$ equal $1$. Look at the four possibilities:
When $S=0$, all the $X_i=0$ and $T = 1$ constantly, whence $\mathbb{E}(T\,|\,S=0)=1$.
When $S=1$, there are three possible configurations of the $X_i$: $(1,0,0)$, $(0,1,0)$, and $(0,0,1)$. All are equally likely, giving each a chance of $1/3$. The value of $T$ is $0$ for the first two and $1$ for the last. Therefore
$$\mathbb{E}(T\,|\,S=1) = \left(\frac{1}{3}\right)\left(0\right)+\left(\frac{1}{3}\right)\left(0\right)+\left(\frac{1}{3}\right)\left(1\right) = \frac{1}{3}.$$
When $S=2$ or $S=3$, $T=0$ no matter what order the $X_i$ appear in, giving $0$ for the conditional expectation.
The Rao-Blackwellized version of $T$, then, is the estimator that associates with the sum $S$ the following guesses for $(1-\theta)^2$:
$$\tilde T(0)=1,\ \tilde T(1)=1/3,\ \tilde T(2)=\tilde T(3)=0.$$
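These conditional averages can be reproduced by brute-force enumeration. The sketch below handles the $n=3$ case: because the configurations sharing a given value of $S$ are equally likely given $S$, the conditional expectation is just a plain average within each group.

```python
from itertools import product

# Exact Rao-Blackwell step for n = 3: enumerate all 0/1 configurations,
# group them by S = X1 + X2 + X3, and average T = (1-X1)(1-X2) within
# each group. Given S, all configurations in a group are equally likely,
# so E[T | S] is a plain average over the group.
by_s = {}
for x in product((0, 1), repeat=3):
    s = sum(x)
    t = (1 - x[0]) * (1 - x[1])
    by_s.setdefault(s, []).append(t)

rb = {s: sum(ts) / len(ts) for s, ts in sorted(by_s.items())}
print(rb)  # {0: 1.0, 1: 0.333..., 2: 0.0, 3: 0.0}
```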
As a check, the expectation of $\tilde T$ can be computed as
$$\eqalign{
\mathbb{E}(\tilde T) &= \Pr(S=0)\tilde{T}(0) + \Pr(S=1)\tilde{T}(1) + \Pr(S=2)\tilde{T}(2) + \Pr(S=3)\tilde{T}(3) \\
&= (1-\theta)^3 + \binom{3}{1}\theta(1-\theta)^2\left(1/3\right) + 0 + 0 \\
&= 1 - 3 \theta + 3 \theta^2 - \theta^3 + 3(1/3)(\theta - 2\theta^2 + \theta^3) \\
&= 1 - 2\theta + \theta^2 \\
&=(1-\theta)^2,
}$$
showing it is unbiased. A similar calculation will obtain its variance (which is useful to know, since it is supposed to be the smallest possible variance among unbiased estimators).
Note that these calculations required little more than applying the definition of expectation and computing binomial probabilities.
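The same bookkeeping can be delegated to a few lines of code. This sketch checks the unbiasedness identity exactly, with no simulation; the $\theta$ values are arbitrary test points:

```python
from math import comb

# Exact check of unbiasedness (no simulation): with S ~ Binomial(3, theta),
# E[T~] = sum_s P(S = s) * T~(s) should equal (1 - theta)^2 at every theta.
# The theta values below are arbitrary test points.
T_tilde = {0: 1.0, 1: 1 / 3, 2: 0.0, 3: 0.0}

for theta in (0.1, 0.25, 0.5, 0.9):
    expect = sum(
        comb(3, s) * theta**s * (1 - theta) ** (3 - s) * T_tilde[s]
        for s in range(4)
    )
    assert abs(expect - (1 - theta) ** 2) < 1e-12

print("E[T~] matches (1 - theta)^2 at all tested points")
```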
Yes, $f(x\mid\theta)$ is an even function of $x$, but how can the pdf 'change' completely in your calculations? You carried out the calculations for an exponential density, whereas you are given a Laplace distribution.
Indeed, the sample is drawn from a Laplace distribution with scale parameter $1/\theta$ and location parameter zero. We can conclude that the joint density $f_{\theta}$ belongs to the one-parameter exponential family. Using this, we can find a complete sufficient statistic for $\theta$.
Due to independence, the joint density of $\mathbf X=(X_1,X_2,\ldots,X_n)$ is
\begin{align} f_{\theta}(\mathbf x)&=\prod_{i=1}^n\frac{\theta}{2}e^{-\theta\,|x_i|} \\&=\left(\frac{\theta}{2}\right)^ne^{-\theta\sum_{i=1}^n|x_i|} \\&=\exp\left[-\theta\sum_{i=1}^n|x_i|+n\ln\left(\frac{\theta}{2}\right)\right]\quad,\,\mathbf x\in\mathbb R^n\,,\,\theta>0 \end{align}
Clearly, a complete sufficient statistic for the family of distributions $\{f_{\theta}:\theta>0\}$ is $$T(\mathbf X)=\sum_{i=1}^n|X_i|$$
By the Lehmann-Scheffé theorem, an unbiased estimator of $\theta$ based on $T$ will be the UMVUE of $\theta$.
It is a simple exercise to show that $|X_i|\sim\mathcal{Exp}(\theta)$ independently for each $i$, where $\theta$ denotes the rate of the distribution. As such, we have $T\sim\mathcal{Ga}(\theta,n)$ (rate $\theta$, shape $n$) with density $$g(t)=\frac{\theta^n e^{-\theta t}t^{n-1}}{\Gamma(n)}\mathbf1_{t>0}$$
We find that
\begin{align} E\left(\frac{1}{T}\right)&=\int_0^{\infty}\frac{1}{t}\frac{\theta^n e^{-\theta t}t^{n-1}}{\Gamma(n)}\,dt \\&=\frac{\theta^n}{\Gamma(n)}\frac{\Gamma(n-1)}{\theta^{n-1}} \\&=\frac{\theta}{n-1} \end{align}
So, for $n\ge 2$ (so that $\mathrm E[1/T]$ is finite), the UMVUE of $\theta$ is $$\frac{n-1}{T}=\frac{n-1}{\sum_{i=1}^n|X_i|}$$
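A Monte Carlo sketch of this result (the values of $\theta$, $n$, and the replication count are illustrative choices): a Laplace draw with density $\frac{\theta}{2}e^{-\theta|x|}$ can be generated as an exponential magnitude with a random sign, and $(n-1)/\sum|X_i|$ should average out to $\theta$.

```python
import random

# Monte Carlo sketch (theta, n, reps are illustrative choices): X_i are
# Laplace with density (theta/2) exp(-theta |x|), so |X_i| ~ Exp(rate=theta)
# and the estimator W = (n - 1) / sum |X_i| should average out to theta.
rng = random.Random(2)
theta, n, reps = 2.0, 10, 50000

def laplace_draw():
    # exponential magnitude with a random sign gives a Laplace draw
    mag = rng.expovariate(theta)
    return mag if rng.random() < 0.5 else -mag

W_mean = sum(
    (n - 1) / sum(abs(laplace_draw()) for _ in range(n)) for _ in range(reps)
) / reps
print(round(W_mean, 2))  # close to theta = 2.0
```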