Find the Uniform Minimum Variance Unbiased (UMVU) estimator using Lehmann–Scheffé – showing that a statistic is complete

exponential-distribution, parameter-estimation, statistical-inference, statistics

Let $X_1,…,X_n$ be independent copies of a real-valued random variable $X$ where $X$ has Lebesgue density

\begin{align*}
p_\theta(x) = \begin{cases} \exp(\theta-x), & x>\theta, \\
0, & x\leq \theta, \end{cases}
\end{align*}

where $\theta\in \mathbb{R}$ is an unknown parameter.
Let $S:=\min(X_1,…,X_n)$.

Find the Uniform Minimum Variance Unbiased (UMVU) estimator of $\theta$.

I already know that $S$ is sufficient for $\theta$ and that $T:=S-1/n$ is an unbiased estimator of $\theta$. My idea is to apply the Lehmann–Scheffé theorem, since then the UMVU estimator is given by

\begin{align*}
\mathbb{E}[T|S]=\mathbb{E}[S-1/n|S]=S-1/n.
\end{align*}
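A quick Monte Carlo sanity check (a minimal sketch, not part of the argument; numpy, the seed, and the values of $\theta$, $n$ and the replication count below are arbitrary choices for illustration) supports this:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.5, 10, 200_000

# Each row is one sample X_1, ..., X_n, using X = theta + Exp(1),
# which has the density exp(theta - x) for x > theta.
X = theta + rng.exponential(scale=1.0, size=(reps, n))
S = X.min(axis=1)      # sufficient statistic S = min(X_1, ..., X_n)
T = S - 1.0 / n        # candidate unbiased estimator

print(S.mean())        # close to theta + 1/n = 2.6
print(T.mean())        # close to theta       = 2.5
```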

Is this the correct approach?
If yes, then to apply Lehmann–Scheffé I would also need $S$ to be a complete statistic. How do I show this properly?

Edit: I tried to show completeness by definition, i.e. I set up the equation $\mathbb{E}_\theta[g(S)]=0 \;\forall \theta$ for a measurable function $g$ and now want to show that $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$.
Since the $X_i$ are i.i.d., the cdf of $S$ is $F_S(x)=1-(1-P_\theta(x))^n$, where $P_\theta(x)$ is the cdf of $X_i$. Differentiating gives the pdf of $S$: $f_S(x)=n\, p_\theta(x)(1-P_\theta (x))^{n-1}$. Since $P_\theta(x)=1-e^{\theta-x}$ for $x>\theta$, this simplifies to
\begin{align*}
f_S(x)=n\, e^{n(\theta-x)}, \quad x>\theta.
\end{align*}

Hence, $\mathbb{E}_\theta[g(S)]=\int_\theta^\infty g(x)\,ne^{n(\theta-x)}\,dx$ has to be $0$ for every $\theta\in\mathbb{R}$.

Is it now enough to say that $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$, since the exponential function is always positive? Or is there a more rigorous way to show it?
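As a numerical cross-check of this density (again only a sketch, with arbitrary illustration values for $\theta$, $n$ and the sample size): $f_S(x)=ne^{n(\theta-x)}$ for $x>\theta$ says that $S-\theta$ is exponential with rate $n$, so the sample mean and variance of simulated values of $S$ should be close to $\theta+1/n$ and $1/n^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = -1.0, 5, 500_000

# Simulate S = min(X_1, ..., X_n) with X_i = theta + Exp(1).
S = theta + rng.exponential(scale=1.0, size=(reps, n)).min(axis=1)

# Under f_S(x) = n * exp(n * (theta - x)), x > theta, we have S - theta ~ Exp(n).
print(S.mean(), theta + 1 / n)   # both close to -0.8
print(S.var(), 1 / n ** 2)       # both close to 0.04
```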

Best Answer

For some measurable function $g$, suppose

$$\mathbb E_{\theta}\left[g(S)\right]=\int_{\theta}^\infty g(x)ne^{-n(x-\theta)}\,dx=0\quad\,\forall\,\theta\in\mathbb R$$

That is, after dividing by $ne^{n\theta}>0$, $$\int_{\theta}^\infty g(x)e^{-nx}\,dx=0\quad\forall\,\theta$$

Now fix some $a\in\mathbb{R}$. For every $\theta<a$, we can split the last equation as

$$\int_{\theta}^a g(x)e^{-nx}\,dx+\int_a^\infty g(x)e^{-nx}\,dx=0\quad\forall\,\theta<a$$

Differentiating both sides of the last equation with respect to $\theta$ (the second integral does not depend on $\theta$, and since $x\mapsto g(x)e^{-nx}$ is locally integrable, the first integral is differentiable with derivative $-g(\theta)e^{-n\theta}$ for almost every $\theta$, by the Lebesgue differentiation theorem), we get

$$g(\theta)e^{-n\theta}=0\quad\text{for almost every }\theta$$

Since $e^{-n\theta}>0$ for every $\theta$, you can conclude that $g=0$ almost everywhere, i.e. $g(S)=0$ $\mathbb{P}_\theta$-a.s. for every $\theta$, so $S$ is complete.

This is perhaps a more convincing argument.
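If you want to double-check the two computations mechanically, here is a small symbolic sketch (assuming sympy; the test function $g(x)=\sin x$ is an arbitrary smooth choice, picked only because a concrete integrand is needed to differentiate the split integral):

```python
import sympy as sp

x, theta, a = sp.symbols('x theta a', real=True)
n = sp.symbols('n', positive=True)

# E_theta[S] = theta + 1/n, so S - 1/n is indeed unbiased for theta.
ES = sp.integrate(x * n * sp.exp(-n * (x - theta)), (x, theta, sp.oo))
print(sp.simplify(ES))                 # theta + 1/n

# Differentiating int_theta^a g(x) e^{-nx} dx in theta picks out -g(theta) e^{-n*theta}.
g = sp.sin(x)                          # arbitrary smooth test function
I = sp.integrate(g * sp.exp(-n * x), (x, theta, a))
print(sp.simplify(sp.diff(I, theta)))  # -exp(-n*theta)*sin(theta)
```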