Yes, $\bar{X}$ is sufficient because you can write the joint density as
$$
f(x_1,\ldots,x_n) = g[\bar{x},\theta]h(x_1,\ldots,x_n).
$$
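For concreteness (assuming the Poisson model treated in the rest of this thread), that factorization can be written out explicitly:
$$
f(x_1,\ldots,x_n) = \prod_{i=1}^n \frac{\theta^{x_i} e^{-\theta}}{x_i!} = \underbrace{\theta^{n\bar{x}} \exp(-n\theta)}_{g[\bar{x},\theta]} \cdot \underbrace{\prod_{i=1}^n \frac{1}{x_i!}}_{h(x_1,\ldots,x_n)}.
$$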
To see if $(X_1, \sum_{i=2}^n X_i)$ is sufficient, can you write the joint density as
$$
\tilde{g}\Big[\Big(x_1, \sum_{i=2}^n x_i\Big),\theta\Big]\,\tilde{h}(x_1,\ldots,x_n)?
$$
Regarding your other question about proving completeness, try using linearity of the expectation operator: is there some linear combination of the two statistics that has mean $0$ but isn't $0$ with probability $1$?
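A nudge in that direction, assuming the Poisson model treated below: since $\mathbb{E}(X_1) = \theta$ and $\mathbb{E}\big(\sum_{i=2}^n X_i\big) = (n-1)\theta$, the combination $(n-1)X_1 - \sum_{i=2}^n X_i$ has mean $0$ for every $\theta$. A minimal simulation sketch (the values of $n$ and $\theta$ are arbitrary choices) shows it is nonetheless not degenerate at $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, reps = 5, 2.0, 100_000   # arbitrary illustrative choices

x = rng.poisson(theta, size=(reps, n))
combo = (n - 1) * x[:, 0] - x[:, 1:].sum(axis=1)

print(combo.mean())          # near 0: the combination has mean zero
print((combo != 0).mean())   # near 1: yet it is not 0 with probability 1
```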
The Poisson distribution is a one-parameter exponential family distribution, with natural sufficient statistic given by the sample total $T(\mathbf{x}) = \sum_{i=1}^n x_i$. The canonical form is:
$$p(\mathbf{x}|\theta) = \exp \Big( \ln (\theta) T(\mathbf{x}) - n\theta \Big) \cdot h(\mathbf{x}), \quad \quad \quad h(\mathbf{x}) = \prod_{i=1}^n \frac{1}{x_i!}. $$
From this form it is easy to establish that $T$ is a complete sufficient statistic for the parameter $\theta$. The Lehmann–Scheffé theorem then says that, for any $g(\theta)$, there is (up to almost-sure equality) only one unbiased estimator of this quantity that is a function of $T$, and it is the UMVUE of $g(\theta)$. One way to find this estimator (the method you are using) is via the Rao–Blackwell theorem: start with an arbitrary unbiased estimator of $g(\theta)$ and then condition on the complete sufficient statistic to get the unique unbiased estimator that is a function of $T$.
Using Rao–Blackwell to find the UMVUE: in your case you want to find the UMVUE of:
$$g(\theta) \equiv \theta \exp (-\theta).$$
Using the initial estimator $\hat{g}_*(\mathbf{X}) \equiv \mathbb{I}(X_1=1)$ you can confirm that,
$$\mathbb{E}(\hat{g}_*(\mathbf{X})) = \mathbb{E}(\mathbb{I}(X_1=1)) = \mathbb{P}(X_1=1) = \theta \exp(-\theta) = g(\theta),$$
so this is indeed an unbiased estimator. Hence, the unique UMVUE obtained from the Rao–Blackwell technique is:
$$\begin{equation} \begin{aligned}
\hat{g}(\mathbf{X})
&\equiv \mathbb{E}(\mathbb{I}(X_1=1) | T(\mathbf{X}) = t) \\[6pt]
&= \mathbb{P}(X_1=1 | T(\mathbf{X}) = t) \\[6pt]
&= \mathbb{P} \Big( X_1=1 \Big| \sum_{i=1}^n X_i = t \Big) \\[6pt]
&= \frac{\mathbb{P} \Big( X_1=1 \Big) \mathbb{P} \Big( \sum_{i=2}^n X_i = t-1 \Big)}{\mathbb{P} \Big( \sum_{i=1}^n X_i = t \Big)} \\[6pt]
&= \frac{\text{Pois}(1| \theta) \cdot \text{Pois}(t-1| (n-1)\theta)}{\text{Pois}(t| n\theta)} \\[6pt]
&= \frac{t!}{(t-1)!} \cdot \frac{ \theta \exp(-\theta) \cdot ((n-1) \theta)^{t-1} \exp(-(n-1)\theta)}{(n \theta)^t \exp(-n\theta)} \\[6pt]
&= t \cdot \frac{ (n-1)^{t-1}}{n^t} \\[6pt]
&= \frac{t}{n} \Big( 1- \frac{1}{n} \Big)^{t-1} \\[6pt]
\end{aligned} \end{equation}$$
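As a sanity check on this closed form, here is a small Monte Carlo sketch (the values of $n$, $\theta$, and the replication count are arbitrary choices) that uses $T \sim \text{Pois}(n\theta)$ to compare the average of $\hat{g}(\mathbf{X})$ against $g(\theta) = \theta \exp(-\theta)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, reps = 10, 1.5, 200_000   # arbitrary illustrative choices

# T = sum of n iid Poisson(theta) draws, so T ~ Poisson(n * theta)
t = rng.poisson(n * theta, size=reps)

# The UMVUE evaluated at each simulated total
g_hat = (t / n) * (1 - 1 / n) ** (t - 1)

print(g_hat.mean())             # Monte Carlo average of the estimator
print(theta * np.exp(-theta))   # target: g(theta) = theta * exp(-theta)
```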
Your answer has a slight error where you have conflated the sample mean and the sample total, but most of your working is correct. As $n \rightarrow \infty$ we have $(1-\tfrac{1}{n})^n \rightarrow \exp(-1)$ and $t/n \rightarrow \theta$, so taking these asymptotic results together we can also confirm consistency of the estimator:
$$\hat{g}(\mathbf{X}) = \frac{t}{n} \Big[ \Big( 1- \frac{1}{n} \Big)^n \Big] ^{\frac{t}{n} - \frac{1}{n}} \rightarrow \theta [ \exp (-1) ]^\theta = \theta \exp (-\theta) = g(\theta).$$
This latter demonstration is heuristic, but it gives a nice check on the working. It is interesting here that you get an estimator that is a finite approximation to the exponential function of interest.
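To watch that finite approximation converge, one can evaluate the estimator along the heuristic path $t = n\theta$ for growing $n$ (a sketch; $\theta = 1.5$ is an arbitrary choice):

```python
import numpy as np

theta = 1.5   # arbitrary illustrative choice
for n in (10, 100, 1_000, 10_000):
    t = n * theta   # heuristic plug-in t ~ n * theta
    print(n, (t / n) * (1 - 1 / n) ** (t - 1))

print("target:", theta * np.exp(-theta))
```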
I'll sketch out an approach but I'll leave the details up to you.
You can show that $\log(1 + X) \sim \Gamma(1, \theta^{-1})$. Use this to find the distribution of your sufficient statistic.
Then you need to suppose that $E_\theta(g(T)) = 0$ for all $\theta$, where $g$ is an arbitrary function, i.e. $$ \int \limits_0^\infty g(t) f_T(t)\,dt = 0 $$
where I'm ignoring parameters. You'll need to fill those in appropriately.
Use this to make a statement about $g$ so that you can conclude that $P(g(T) = 0) = 1$ for all $\theta \in \Theta$.
Edit: Because you know the exponential family result, I'll go through this proof in more detail.
Let $Y = \log(1+X)$. Note that this transformation is one-to-one. The inverse is $X = \exp(Y) - 1$, so the Jacobian is $e^y$. Putting this together we have $$ f_Y(y|\theta) = f_X(e^y - 1|\theta) \times e^y $$ $$ = \frac{\theta e^y}{(1 + e^y - 1)^{\theta + 1}} \times I(0 < y < \infty) $$ $$ = \theta e^{-\theta y} =_d \Gamma(1, \theta^{-1}), $$ where I'm dropping the indicator because the support does not depend on $\theta$, so it's not too important.
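As a quick numerical sketch of this result, one can sample $X$ by inverse-CDF from $F_X(x) = 1 - (1+x)^{-\theta}$ (which follows by integrating the density above; the values of $\theta$ and the sample size are arbitrary choices) and check that $Y = \log(1+X)$ has the first two moments of an exponential with rate $\theta$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, size = 2.0, 100_000   # arbitrary illustrative choices

# Inverse-CDF sampling from f_X(x|theta) = theta/(1+x)^(theta+1), x > 0:
# F_X(x) = 1 - (1+x)^(-theta)  =>  X = (1-U)^(-1/theta) - 1
u = rng.uniform(size=size)
x = (1 - u) ** (-1 / theta) - 1

# The claimed transformation: Y = log(1+X) should be Exponential(rate theta)
y = np.log1p(x)

print(y.mean(), 1 / theta)        # exponential mean 1/theta
print(y.var(), 1 / theta ** 2)    # exponential variance 1/theta^2
```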
This means that $T \sim \Gamma(n, \theta^{-1})$ and therefore $$ E(g(T)) = \int \limits_0^\infty g(t)\, t^{n-1} e^{-\theta t} \frac{\theta^n}{\Gamma(n)}\, dt =_{set} 0 $$
$$ \implies \int \limits_0^\infty g(t)\, t^{n-1} e^{-\theta t}\, dt = 0. $$
This must hold for every $\theta > 0$, so the integral above says that the Laplace transform of $g(t)\, t^{n-1}$ vanishes identically. By uniqueness of Laplace transforms, $g(t)\, t^{n-1} = 0$ for almost every $t$, and since $t^{n-1} > 0$ for all $t > 0$, it must be that $g(t) = 0$ almost everywhere, so $P(g(T) = 0) = 1$ for all $\theta \in \Theta$ (you could do this a lot more rigorously).