I don't think you can. At least if $X$ is uniformly distributed on $(\theta,\theta + 1)$, then $(X_{(1)}, X_{(n)})$ is sufficient, but not complete.
To see this, note that
$$
\mathbb{E}_{\theta}[X_{(n)} - X_{(1)}] = \mathbb{E}_{\theta}[X_{(n)}] - \mathbb{E}_{\theta}[X_{(1)}] = \left(\theta + \frac{n}{n+1}\right) - \left(\theta + \frac{1}{n+1}\right) = \frac{n-1}{n+1},
$$
using the standard fact that the $k$-th order statistic of $n$ i.i.d. $U(0,1)$ variables has mean $k/(n+1)$, shifted here by $\theta$. Then $g$ defined as
$$
g(X_{(1)}, X_{(n)}) := X_{(n)} - X_{(1)} - \frac{n-1}{n+1}
$$
satisfies $\mathbb{E}_{\theta}[g] = 0$ for all $\theta$ even though $g$ is not identically zero, so the statistic is not complete.
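A quick way to convince yourself numerically (my own sketch, not part of the argument above; $n$, the three values of $\theta$, and the seed are arbitrary choices):

```python
# Sanity-check sketch: E[X_(n) - X_(1)] should be (n-1)/(n+1) for every theta,
# so the function g above has mean zero without being identically zero.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

for theta in (-3.0, 0.0, 7.5):
    x = rng.uniform(theta, theta + 1, size=(reps, n))
    spread = x.max(axis=1) - x.min(axis=1)          # X_(n) - X_(1)
    print(theta, spread.mean(), (n - 1) / (n + 1))  # mean ~ 2/3 for every theta
```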
As I said, the setting is slightly different though. Let me know if you find a definitive solution - I'm looking at the same problem.
We have the joint pdf
$$
f(\vec x ; \theta) = \theta^n c^{\theta n} \prod_{i=1}^n x_i^{-(\theta+1)}\mathbb{1}_{x_i \ge c}
=\mathbb{1}_{x_{(1)} \ge c} \left[ \theta^n c^{\theta n} \right] \exp \left[
-(\theta+1) \sum_{i=1}^n \ln x_i\right]
$$
and so, by the exponential-family factorization, $\sum_{i=1}^n \ln X_i$ is complete and sufficient for $\theta$.
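To make the appeal to exponential-family theory explicit (this is just my relabeling of the display above), the factorization matches the standard one-parameter template $f(\vec x ; \theta) = h(\vec x)\, a(\theta) \exp[w(\theta) T(\vec x)]$ with
$$
h(\vec x) = \mathbb{1}_{x_{(1)} \ge c}, \qquad
a(\theta) = \theta^n c^{\theta n}, \qquad
w(\theta) = -(\theta+1), \qquad
T(\vec x) = \sum_{i=1}^n \ln x_i,
$$
and since $\{w(\theta) : \theta > 0\} = (-\infty, -1)$ contains an open interval, the full-rank exponential-family theorem gives completeness as well as sufficiency of $T$.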
For a preliminary result, consider $Y = \ln(X) - \ln(c)$, where $X$ follows the given Pareto distribution, i.e. $f_X(x) = \theta c^\theta x^{-(\theta+1)} \mathbb{1}_{x \ge c}$. Then, since $X = ce^Y$, we get
$$
f_Y(y) = f_X(ce^y) \left\vert \frac{dx}{dy} \right\vert
= \theta c^\theta (c e^y)^{-(\theta+1)} \mathbb{1}_{ce^y \ge c} \cdot ce^y = \theta e^{-\theta y} \mathbb{1}_{y \ge 0}
$$
which is the pdf of an Exponential distribution with rate $\theta$. Define $Y_i := \ln(X_i) - \ln(c)$.
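As a sanity check (my own sketch, not part of the derivation): sampling the Pareto by inversion via $F^{-1}(u) = c(1-u)^{-1/\theta}$ and log-transforming should reproduce the Exponential moments. The values of $\theta$, $c$, the sample size, and the seed below are arbitrary.

```python
# Sketch: Y = ln X - ln c should have mean 1/theta and variance 1/theta^2
# when X ~ Pareto(theta, c), matching an Exponential(rate = theta).
import numpy as np

rng = np.random.default_rng(1)
theta, c, m = 2.5, 3.0, 500_000

u = rng.uniform(size=m)
x = c * (1.0 - u) ** (-1.0 / theta)  # Pareto(theta, c) draws by inversion
y = np.log(x) - np.log(c)

print(y.mean(), 1 / theta)    # both ~ 0.4
print(y.var(), 1 / theta**2)  # both ~ 0.16
```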
It follows that $\sum_{i=1}^n Y_i = \sum_{i=1}^n (\ln X_i - \ln c)$ follows a $\Gamma(n,\theta)$ distribution (shape $n$, rate $\theta$), since it's the sum of $n$ independent Exponential rate-$\theta$ random variables. Note that the mean of an Exponential rate-$\theta$ r.v. is $1/\theta$ and the mean of a $\Gamma(n,\theta)$ r.v. is $n/\theta$.
So $\frac{1}{n} \sum_{i=1}^n Y_i$ is an unbiased estimator of $1/\theta$, and it's natural to guess that $1/ \left( \frac{1}{n} \sum_{i=1}^n Y_i \right)$ estimates $\theta$; let's check whether it is unbiased.
Let $Z := \sum_{i=1}^n Y_i \sim \Gamma(n,\theta)$. Then, the expectation of $1/ \left( \frac{1}{n} \sum_{i=1}^n Y_i \right)$ equals:
\begin{align*}
E \left[ \frac{n}{Z} \right] &= n \int_0^\infty \frac{1}{z} \frac{1}{\Gamma(n)} \theta^n z^{n-1} e^{- \theta z} \; dz \\
&= n \int_0^\infty \frac{1}{\Gamma(n)} \theta^n z^{n-2} e^{- \theta z} \; dz \\
&= n \frac{ \theta \Gamma(n-1)}{\Gamma(n)} \int_0^\infty \frac{1}{\Gamma(n-1)} \theta^{n-1} z^{n-2} e^{- \theta z} \; dz
\end{align*}
and this equals $\theta n \dfrac{ (n-2)!}{(n-1)!}= \frac{n}{n-1} \theta$ (valid for $n \ge 2$), since the rightmost integral integrates the pdf of a $\Gamma(n-1,\theta)$ random variable over its support.
It follows from Lehmann-Scheffé that $\dfrac{n-1}{n} \cdot \dfrac{1}{\frac{1}{n} \sum_{i=1}^n Y_i} = \dfrac{n-1}{\sum_{i=1}^n (\ln X_i - \ln c) }$ is the UMVUE of $\theta$.
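A hedged simulation sketch of the final result (the values of $\theta$, $c$, $n$, and the seed are arbitrary): the corrected estimator $(n-1)/\sum \ln(X_i/c)$ should be unbiased, while the naive $n/\sum \ln(X_i/c)$ should be inflated by exactly the factor $n/(n-1)$ computed above.

```python
# Sketch: compare the UMVUE (n-1)/S against the naive n/S, where
# S = sum(ln X_i - ln c) is the complete sufficient statistic.
import numpy as np

rng = np.random.default_rng(2)
theta, c, n, reps = 2.5, 3.0, 10, 200_000

u = rng.uniform(size=(reps, n))
x = c * (1.0 - u) ** (-1.0 / theta)         # Pareto(theta, c) samples by inversion
s = np.log(x / c).sum(axis=1)               # sufficient statistic per sample

print(((n - 1) / s).mean(), theta)          # UMVUE: both ~ 2.5
print((n / s).mean(), n / (n - 1) * theta)  # naive: both ~ 2.78, biased upward
```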
Best Answer
Well, if you find a statistic $T=t(S)$ that is a function of $S=\sum_{i=1}^{n} X_i$ and is an unbiased estimator of $\lambda$, you've got your UMVUE by Lehmann-Scheffé.
For $T$ to be an unbiased estimator we want $E[t(S)]=\lambda$. Of course, for $X \sim \operatorname{Exp}(\lambda)$ parameterized by its mean we already have $E[X]=\lambda$, and in general $E[\bar X]=E[X]$.
So, we see that $\bar X$ is an unbiased estimator of $\lambda$, and because $\bar X$ is a one-to-one function of $S$ (via $\bar X=S/n$), we're done.
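A minimal sketch of this last claim, assuming the mean parameterization $E[X]=\lambda$ used above (NumPy's `scale` argument is exactly that mean; the values of $\lambda$, $n$, and the seed are arbitrary):

```python
# Sketch: the sample mean should recover lambda under the mean parameterization.
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 4.0, 8, 200_000

x = rng.exponential(scale=lam, size=(reps, n))  # scale = mean = lambda here
print(x.mean(axis=1).mean(), lam)               # both ~ 4.0
```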