Is the MLE of an exponential distribution complete and sufficient?

maximum-likelihood, parameter-estimation, probability, statistical-inference, statistics

I've just proved that the MLE of the rate parameter $\theta$ of an exponential distribution is:

$$
\hat \theta = \frac{n}{\sum x_i}
$$
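Concretely, this comes from maximizing the log-likelihood of the sample:

$$
\ell(\theta) = n\log\theta - \theta\sum_i x_i, \qquad \ell'(\theta) = \frac{n}{\theta} - \sum_i x_i = 0 \;\implies\; \hat\theta = \frac{n}{\sum_i x_i}.
$$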

So I'm trying to show that $\hat \theta $ is complete and sufficient.

For sufficiency I've tried to apply the Fisher–Neyman factorization theorem, but I can't find the function $g_\theta(T(x))$, which makes me suspect that it might not be sufficient.

For completeness I know I have to prove that $E_\theta[g(\hat\theta)] = 0$ for all $\theta$ implies $g(\hat\theta) = 0$ almost surely, but the sum in $\hat\theta$ is really annoying.

Any hints or suggestions?
This is not homework; I'm studying on my own.

Best Answer

The joint density for a sample from the exponential distribution is $f(\mathbf x; \theta)=\theta^n e^{-\theta\sum_i x_i}$ (for $x_i > 0$). By the Fisher–Neyman factorization theorem, $\sum_i x_i$ is a sufficient statistic for $\theta$. A standard fact is that any one-to-one transformation of a sufficient statistic is also sufficient, so $\frac n{\sum_i x_i}$ is sufficient as well.

The exponential distribution belongs to the exponential family of distributions. For one-dimensional exponential families, if the natural parameter space contains an open interval in $\mathbb R$, then the natural sufficient statistic is complete. Here $\theta\in(0,\infty)$, which is open, so $\sum_i x_i$ is complete. Since completeness is also preserved under one-to-one transformations, the MLE $\hat\theta = \frac n{\sum_i x_i}$ is a complete sufficient statistic.
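To make the factorization explicit (this is exactly the $g_\theta$ you were looking for), write

$$
f(\mathbf x;\theta) = \underbrace{\theta^n e^{-\theta\, T(\mathbf x)}}_{g_\theta(T(\mathbf x))} \cdot \underbrace{1}_{h(\mathbf x)}, \qquad T(\mathbf x) = \sum_i x_i,
$$

so $g_\theta(t) = \theta^n e^{-\theta t}$ and $h(\mathbf x) = 1$ on $x_i > 0$.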

Here is how you would prove completeness directly from the definition. $T(\mathbf x)=\sum_i x_i$ is gamma-distributed with density $f(t)=\frac{\theta^n}{\Gamma(n)}t^{n-1}e^{-\theta t}$, so

$$
E_\theta[g(T)]=\frac{\theta^n}{\Gamma(n)}\int_0^\infty g(t)\,t^{n-1}e^{-\theta t}\,dt.
$$

The integral is the Laplace transform of $g(t)t^{n-1}$ evaluated at $\theta$. If $E_\theta[g(T)]=0$ for all $\theta>0$, then this Laplace transform vanishes identically, and since the Laplace transform is injective, $g(t)t^{n-1}=0$ almost everywhere, hence $g(t)=0$ almost everywhere. Thus the sufficient statistic is complete.
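If you want a numerical sanity check of the distributional fact used above, here is a minimal simulation sketch, assuming NumPy and SciPy are available; the rate $\theta = 2$, sample size $n = 5$, and seed are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)      # arbitrary seed
theta, n, reps = 2.0, 5, 100_000    # arbitrary rate, sample size, replications

# Draw `reps` exponential samples of size n with rate theta and form T = sum(x_i).
x = rng.exponential(scale=1.0 / theta, size=(reps, n))
T = x.sum(axis=1)

# T should be Gamma(shape=n, rate=theta), i.e. scale=1/theta;
# a large KS p-value is consistent with the claimed density of T.
print(stats.kstest(T, stats.gamma(a=n, scale=1.0 / theta).cdf))

# The MLE n/T concentrates around theta; its mean is theta*n/(n-1) = 2.5 here,
# since the MLE is biased upward for small n.
print((n / T).mean())
```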
