We have the joint pdf
$$
f(\vec x ; \theta) = \theta^n c^{\theta n} \prod_{i=1}^n x_i^{-(\theta+1)}\mathbb{1}_{x_i \ge c}
=\mathbb{1}_{x_{(1)} \ge c} \left[ \theta^n c^{\theta n} \right] \exp \left[
-(\theta+1) \sum_{i=1}^n \ln x_i\right]
$$
and so, since $c$ is known, the exponential-family factorization shows that $\sum_{i=1}^n \ln X_i$ is a complete sufficient statistic for $\theta$.
For a preliminary result, consider $Y = \ln(X) - \ln(c)$ where $X$ follows the given Pareto distribution, i.e. $f_X(x) = \theta c^\theta x^{-(\theta+1)} \mathbb{1}_{x \ge c}$. Then, since $X = ce^Y$, we get
$$
f_Y(y) = f_X(ce^y) \left\vert \frac{dx}{dy} \right\vert
= \theta c^\theta (c e^y)^{-(\theta+1)} \mathbb{1}_{ ce^y \ge c} \cdot ce^y = \theta e^{-y \theta} \mathbb{1}_{ y \ge 0}
$$
which is the pdf of an Exponential distribution with rate $\theta$. Define $Y_i := \ln(X_i) - \ln(c)$ for $i = 1, \ldots, n$.
It follows that $\sum_{i=1}^n Y_i = \sum_{i=1}^n (\ln X_i - \ln (c))$ follows a $\Gamma(n,\theta)$ distribution since it's the sum of $n$ independent exponential rate $\theta$ random variables. Note that the mean of an exponential rate $\theta$ r.v. is $1/\theta$ and the mean of a $\Gamma(n,\theta)$ r.v. is $n/\theta$.
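As a quick numerical sanity check (not part of the proof), the following Python sketch, using NumPy, samples from the Pareto density via the inverse CDF and confirms both distributional claims; the values of $c$, $\theta$ and $n$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
c, theta, n = 2.0, 3.0, 5        # arbitrary illustrative values
reps = 200_000

# Inverse-CDF sampling: F(x) = 1 - (c/x)^theta  =>  x = c * u^(-1/theta)
u = rng.uniform(size=(reps, n))
x = c * u ** (-1.0 / theta)
y = np.log(x) - np.log(c)

print(y.mean())              # ~ 1/theta = 0.333...  (Exponential(theta) mean)
print(y.sum(axis=1).mean())  # ~ n/theta = 1.666...  (Gamma(n, theta) mean)
```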
So $\frac{1}{n} \sum_{i=1}^n Y_i$ is an unbiased estimator of $1/\theta$, and it is natural to guess that $1/ \left( \frac{1}{n} \sum_{i=1}^n Y_i \right)$ estimates $\theta$; let us compute its expectation to check for bias.
Let $Z := \sum_{i=1}^n Y_i \sim \Gamma(n,\theta)$. Then the expectation of $1/ \left( \frac{1}{n} \sum_{i=1}^n Y_i \right)$ equals:
\begin{align*}
E \left[ \frac{n}{Z} \right] &= n \int_0^\infty \frac{1}{z} \frac{1}{\Gamma(n)} \theta^n z^{n-1} e^{- \theta z} \; dz \\
&= n \int_0^\infty \frac{1}{\Gamma(n)} \theta^n z^{n-2} e^{- \theta z} \; dz \\
&= n \frac{ \theta \Gamma(n-1)}{\Gamma(n)} \int_0^\infty \frac{1}{\Gamma(n-1)} \theta^{n-1} z^{n-2} e^{- \theta z} \; dz
\end{align*}
and this equals $\theta n \dfrac{ (n-2)!}{(n-1)!}= \frac{n}{n-1} \theta$, since the rightmost integral integrates the pdf of a $\Gamma(n-1,\theta)$ random variable over its support (this requires $n \ge 2$ for the integral to converge).
It follows from the Lehmann-Scheffé theorem that $\dfrac{n-1}{n} \cdot \dfrac{1}{\frac{1}{n} \sum_{i=1}^n Y_i} = \dfrac{n-1}{\sum_{i=1}^n (\ln X_i - \ln c) }$ is the UMVUE of $\theta$.
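A minimal Monte Carlo check of this conclusion, again with arbitrary parameter choices (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
c, theta, n, reps = 2.0, 3.0, 5, 500_000

u = rng.uniform(size=(reps, n))
x = c * u ** (-1.0 / theta)                  # Pareto(c, theta) samples, as before
umvue = (n - 1) / np.log(x / c).sum(axis=1)  # (n-1) / sum(ln X_i - ln c)
print(umvue.mean())                          # should be close to theta = 3.0
```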
Next, suppose you have a $U(-\theta,\theta)$ population where $\theta\in\mathbb R^+$.
The joint density of the sample $\mathbf X=(X_1,X_2,\ldots,X_n)$ is
\begin{align}
f_{\theta}(\mathbf x)&=\frac{1}{(2\theta)^n}\mathbf1_{-\theta < x_1, \ldots, x_n < \theta}
\\&=\frac{1}{(2\theta)^n}\mathbf1_{0<|x_1|,\ldots,|x_n|<\theta}
\\&=\frac{1}{(2\theta)^n}\mathbf1_{\max_{1\le i\le n}|x_i|<\theta}
\end{align}
It is clear from the Factorization Theorem that a sufficient statistic for $\theta$ is $$T(\mathbf X)=\max_{1\le i\le n}|X_i|$$
One could verify that $|X_i|\sim U(0,\theta)$, so that the density of $T$ is $$g_{\theta}(t)=\frac{n}{\theta^n}t^{n-1}\mathbf1_{0<t<\theta}$$
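One hedged way to check this density numerically is to compare the sample mean of $T$ with the value $E_\theta(T)=n\theta/(n+1)$ implied by $g_\theta$; the parameter values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 4, 500_000           # arbitrary illustrative values

x = rng.uniform(-theta, theta, size=(reps, n))
t = np.abs(x).max(axis=1)                  # T = max |X_i|
print(t.mean())                            # ~ n*theta/(n+1) = 1.6
```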
That $T$ is a complete statistic for $\theta$ is well-known.
We simply have to find unbiased estimators of the parametric functions of $\theta$ based on the complete sufficient statistic; by the Lehmann-Scheffé theorem, these are the UMVUEs.
As the support of the complete sufficient statistic here depends on the parameter $\theta$, unbiased estimators can be directly obtained through differentiation.
Let $h_1(T)$ and $h_2(T)$ be unbiased estimators of $\theta/(1+\theta)$ and $e^{\theta}/\theta$ respectively, based on the complete sufficient statistic $T$.
That is, for all $\theta>0$,
\begin{align}
\qquad\quad\frac{n}{\theta^n}\int_0^{\theta}h_1(t)t^{n-1}\,dt&=\frac{\theta}{1+\theta}
\\\implies \int_0^{\theta}h_1(t)t^{n-1}\,dt &= \frac{\theta^{n+1}}{n(1+\theta)}
\end{align}
Differentiating both sides wrt $\theta$,
\begin{align}
h_1(\theta)\theta^{n-1}&=\frac{\theta^n(n\theta+n+1)}{n(1+\theta)^2}
\\\implies h_1(\theta) &=\frac{\theta(n\theta+n+1)}{n(1+\theta)^2}
\end{align}
Hence, $$h_1(T)=\frac{T(nT+n+1)}{n(1+T)^2}$$
Similarly for the second problem, for all $\theta>0$,
\begin{align}
\qquad\quad\frac{n}{\theta^n}\int_0^{\theta}h_2(t)t^{n-1}\,dt&=\frac{e^\theta}{\theta}
\\\implies \int_0^{\theta}h_2(t)t^{n-1}\,dt &= \frac{\theta^{n-1} e^\theta}{n}
\end{align}
Differentiating both sides wrt $\theta$ yields
\begin{align}
h_2(\theta)\theta^{n-1}&=\frac{e^{\theta}\theta^{n-2}(\theta+n-1)}{n}
\\\implies h_2(\theta) &=\frac{e^{\theta}(\theta+n-1)}{n\theta}
\end{align}
So, $$h_2(T)=\frac{e^{T}(T+n-1)}{nT}$$ for $n\ge 2$. (For $n=1$ the integral equation would require $\int_0^{\theta}h_2(t)\,dt=e^{\theta}$ for all $\theta>0$, which fails as $\theta\to0^+$, so no unbiased estimator based on $T$ exists in that case.)
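A quick Monte Carlo check of unbiasedness for both $h_1(T)$ and $h_2(T)$, with arbitrary $\theta$ and $n$ (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 1.5, 4, 1_000_000         # arbitrary illustrative values

t = np.abs(rng.uniform(-theta, theta, size=(reps, n))).max(axis=1)

h1 = t * (n * t + n + 1) / (n * (1 + t) ** 2)   # candidate UMVUE of theta/(1+theta)
h2 = np.exp(t) * (t + n - 1) / (n * t)          # candidate UMVUE of e^theta/theta

print(h1.mean(), theta / (1 + theta))      # both ~ 0.6
print(h2.mean(), np.exp(theta) / theta)    # both ~ 2.988
```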
In my initial answer, the following calculation for the UMVUE was rather unnecessary and complicated. Had the support not depended on the parameter, I might have tried this. I am keeping this part in the answer because, with some care, the series argument can be salvaged and recovers the same estimators:
For $k> -n$, we have
\begin{align}
E_\theta(T^k)&=\frac{n}{\theta^n}\int_0^\theta t^{k+n-1}\,dt\\[8pt]
& = \frac{n\theta^k}{n+k}
\end{align}
This suggests that an unbiased estimator of $\theta^k$ based on $T$ is $$\left(\frac{n+k}{n}\right)T^k$$
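A small simulation supports this for a few values of $k$ (arbitrary parameter choices; not a proof):

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 2.0, 5, 1_000_000         # arbitrary illustrative values

t = np.abs(rng.uniform(-theta, theta, size=(reps, n))).max(axis=1)
for k in (-2, 1, 3):                       # any k > -n works
    print(k, ((n + k) / n * t ** k).mean(), theta ** k)
```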
For the first problem, one could write
\begin{align}
\frac{\theta}{1+\theta}&=
\begin{cases}\left(1+\frac{1}{\theta}\right)^{-1}=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\frac{1}{\theta^3}+\cdots&,\text{ if }\theta>1\\\\\theta(1-\theta+\theta^2-\theta^3+\cdots)&,\text{ if }0<\theta<1\end{cases}
\end{align}
For $0<\theta<1$, we have
$$E_{\theta}\left[\left(\frac{n+1}{n}\right)T-\left(\frac{n+2}{n}\right)T^2+\cdots\right]=\theta-\theta^2+\theta^3-\cdots$$
Or, $$E_{\theta}\left[\sum_{k=1}^\infty(-1)^{k+1}\left(\frac{n+k}{n}\right)T^k\right]=\frac{\theta}{1+\theta}$$
(The interchange of sum and expectation is justified by dominated convergence, since $T<\theta<1$ a.s. and $\sum_k\frac{n+k}{n}\theta^k<\infty$.)
For $\theta>1$, termwise (only formally, since $E_{\theta}(T^{-k})$ exists only for $k<n$),
$$E_{\theta}\left[1-\left(\frac{n-1}{n}\right)\frac{1}{T}+\left(\frac{n-2}{n}\right)\frac{1}{T^2}-\cdots\right]=1-\frac{1}{\theta}+\frac{1}{\theta^2}-\cdots$$
That is, $$E_{\theta}\left[\sum_{k=0}^\infty\left(\frac{n-k}{n}\right)\frac{(-1)^k}{T^k}\right]=\frac{\theta}{1+\theta}$$
Hence by the Lehmann-Scheffé theorem, the UMVUE of $\theta/(1+\theta)$ is
\begin{align}
h_1(T)&=\begin{cases}\displaystyle\sum_{k=1}^\infty(-1)^{k+1}\left(\frac{n+k}{n}\right)T^k&,\text{ if }0<\theta<1\\\\\displaystyle\sum_{k=0}^\infty\left(\frac{n-k}{n}\right)\frac{(-1)^k}{T^k}&,\text{ if }\theta\ge1 \end{cases}
\end{align}
Summing the first series (for $T<1$) and the second (for $T>1$) gives the same closed form in both cases:
$$h_1(T)=\frac{T(nT+n+1)}{n(1+T)^2}$$
This agrees with the estimator obtained by differentiation (and with numerical checks of unbiasedness for several values of $n$), so the case split is an artifact of the expansion, not of the estimator. The $\theta>1$ branch should still be viewed as heuristic: $E_{\theta}(T^{-k})$ does not exist for $k\ge n$, and the series in $1/T$ diverges on the event $\{T<1\}$, which has positive probability. The differentiation argument above is the rigorous route.
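A quick numeric check (plain Python, arbitrary $n$ and $T$) that both series sum to the common closed form on their respective regions of convergence:

```python
n = 4                                       # arbitrary sample size

def closed(T):
    return T * (n * T + n + 1) / (n * (1 + T) ** 2)

# Branch for 0 < T < 1: alternating series in T
T = 0.6
s1 = sum((-1) ** (k + 1) * (n + k) / n * T ** k for k in range(1, 300))
print(s1, closed(T))                        # agree to machine precision

# Branch for T > 1: alternating series in 1/T
T = 2.0
s2 = sum((n - k) / n * (-1) ** k / T ** k for k in range(0, 300))
print(s2, closed(T))                        # agree to machine precision
```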
For the second problem, we can use the power series expansion of $e^\theta$ to obtain (putting $j=k+1$; this is valid for $n\ge2$, so that $k=-1>-n$, and all terms are nonnegative, justifying the interchange)
$$E_{\theta}\left[\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!}\right]=\sum_{j=0}^\infty \frac{\theta^{j-1}}{j!}=\frac{e^{\theta}}{\theta}$$
So the UMVUE of $e^{\theta}/\theta$ is
\begin{align}
h_2(T)&=\sum_{k=-1}^{\infty}\left(\frac{n+k}{n}\right)\frac{T^k}{(k+1)!}
\\\\&=\frac{e^T(n-1+T)}{nT}
\end{align}
Finally, suppose $X_1,\ldots,X_n$ are i.i.d. $U(-\theta,2\theta)$ with $\theta>0$, and find the distribution of $T=\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$.
For $0<t<\theta$, we have
\begin{align}
P_{\theta}(T\le t)&=P_{\theta}\left(-t\le X_{(1)},\,X_{(n)}\le 2t\right)
\\&=P_{\theta}\left(-t\le X_1,X_2,\ldots,X_n\le 2t\right)
\\&=\left\{P_{\theta}\left(-t<X_1<2t\right)\right\}^n
\\&=\left(\frac{3t}{3\theta}\right)^n=\left(\frac{t}{\theta}\right)^n
\end{align}
So $T$ has density
$$f_T(t)=\frac{nt^{n-1}}{\theta^n}\mathbf1_{0<t<\theta}$$
In other words, $T$ is distributed exactly as $Y_{(n)}$, where $Y_1,\ldots,Y_n$ are i.i.d. $U(0,\theta)$ variables.
So studying the properties of $T$ as an estimator of $\theta$ reduces to studying the properties of $Y_{(n)}$.
That $T$ is a (minimal sufficient and) complete statistic follows because its family of distributions coincides with that of the $U(0,\theta)$ sample maximum, for which completeness is standard. And by the Lehmann-Scheffé theorem, $\left(\frac{n+1}{n}\right)T$ is indeed the UMVUE of $\theta$.
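A final Monte Carlo sanity check of this estimator under the $U(-\theta,2\theta)$ model (arbitrary parameter values; a sketch, not a proof):

```python
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 2.0, 4, 500_000            # arbitrary illustrative values

x = rng.uniform(-theta, 2 * theta, size=(reps, n))
t = np.maximum(-x.min(axis=1), x.max(axis=1) / 2)   # T = max(-X_(1), X_(n)/2)
print(t.mean())                  # ~ n*theta/(n+1) = 1.6  (same as U(0,theta) max)
print(((n + 1) / n * t).mean())  # ~ theta = 2.0  (the UMVUE is unbiased)
```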