[Math] Suppose that $X\sim\operatorname{Exp}(\theta)$, show that no unbiased estimator for $\theta$ exists.

parameter-estimation, probability-distributions, statistical-inference, statistics

Exercise: Suppose that $X\sim\operatorname{Exp}(\theta)$, show that no unbiased estimator for $\theta$ exists. Hint: use the fact that $X$ is a complete and sufficient statistic for $\theta$.

I have the following definitions to work with:

Definition 1: An estimator $d(X)$ is unbiased for $\theta$ if $\operatorname{E}_\theta(d(X)) = \theta.$

Definition 2: Statistic $T$ is sufficient for $\theta\in\Omega$ if the conditional distribution of $X$ given $T$ is known (does not depend on $\theta$).

Definition 3: Statistic $T$ is complete for $\theta\in\Omega$ if for any Borel function $f$ we have that $\operatorname{E}_\theta f(T) = 0$ for all $\theta\in\Omega$ implies that $f(T) = 0$ almost everywhere.

What I've tried: I'm not sure how to show that no unbiased estimator exists using completeness or sufficiency, but I think a proof by contradiction should work. If $d(X)$ is an unbiased estimator, we have that $$\operatorname{E}_\theta d(X) = \int_0^\infty d(x)\,\theta e^{-\theta x}\,dx = \theta.$$ Unfortunately, I don't know how to proceed from here. I've thought of using sufficiency and completeness to show that the above equation cannot hold, but I haven't succeeded.
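As a sanity check (not part of the proof), it may help to see that even the most natural candidate, $d(X) = 1/X$, fails badly: its expectation under $\operatorname{Exp}(\theta)$ does not exist at all. A minimal sketch with SymPy, using the comparison $e^{-\theta x} \ge e^{-\theta}$ on $(0,1]$ so that the divergence reduces to $\int_0^1 dx/x$:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# The density of Exp(theta) is theta*exp(-theta*x) on (0, oo), so
# E_theta[1/X] = \int_0^oo (theta/x) e^{-theta x} dx.
# On (0, 1] the integrand is bounded below by theta*e^{-theta}/x,
# and already \int_0^1 dx/x diverges:
comparison = sp.integrate(1 / x, (x, 0, 1))
print(comparison)  # oo
```

So $1/X$ is not even integrable, let alone unbiased; of course this rules out only one candidate, not all of them, which is what the completeness argument below is for.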

Question: How do I show that there exists no unbiased estimator for $\theta$?

Thanks in advance!

Best Answer

So, dividing the unbiasedness condition $\int_0^\infty d(x)\,\theta e^{-\theta x}\,dx = \theta$ by $\theta$, you have

$$\int_0^\infty d(x)\, e^{-\theta x}\, dx = 1.$$

If you are allowed to exchange the derivative (w.r.t. $\theta$) and the integral, differentiating both sides yields

$$-\int_0^\infty x\, d(x)\, e^{-\theta x}\, dx = 0,$$

that is, multiplying by $-\theta$, $\operatorname{E}_\theta[X d(X)] = 0$ for all $\theta$. By completeness, $X d(X) = 0$ a.e., and since $X > 0$ this forces $d(X) = 0$ a.e. But then $\operatorname{E}_\theta d(X) = 0 \neq \theta$, which gives the contradiction.

This exchange is allowed for exponential families; see e.g. Lehmann and Romano (2005), Theorem 2.7.1 therein, but I do not know whether you "may" use this tool.
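The differentiation under the integral sign used above can be spot-checked symbolically for a concrete test function. A minimal sketch with SymPy, with the arbitrary illustrative choice $d(x) = x^2$ standing in for $d$ (any polynomial would do):

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)
d = x**2  # arbitrary illustrative test function standing in for d(x)

# Left side: compute F(theta) = \int_0^oo d(x) e^{-theta x} dx, then differentiate.
lhs = sp.diff(sp.integrate(d * sp.exp(-theta * x), (x, 0, sp.oo)), theta)

# Right side: differentiate under the integral sign first,
# giving -\int_0^oo x d(x) e^{-theta x} dx.
rhs = -sp.integrate(x * d * sp.exp(-theta * x), (x, 0, sp.oo))

print(sp.simplify(lhs - rhs))  # 0
```

Here both sides evaluate to $-6/\theta^4$, so the exchange holds for this test function; Theorem 2.7.1 in Lehmann/Romano is what justifies it for general integrable $d$.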