Unbiased estimator of $e^{-\theta}$

Tags: estimation, parameter estimation

Let $X_1, X_2, \dots, X_n$ be i.i.d. $N(\theta,1)$. Is $g(\theta)=e^{-\theta}$ estimable?

I have to show that there exists an unbiased estimator of $g(\theta)$. Now, $T=\overline{X}$ is unbiased for $\theta$. Is it then true that $e^{-\overline{X}}$ is unbiased for $g(\theta)$, given that $f(x)=e^{-x}$ is a bijective function of $x$?

Any help is appreciated. Thanks in advance.

Best Answer

Please note that if $T$ is an unbiased estimator of $\theta$, then $g(T)$ is not necessarily an unbiased estimator of $g(\theta)$, even if $g$ is bijective. Let's check if your estimator $e^{-\bar X}$ is a biased estimator of $e^{-\theta}$ or not.

\begin{align} E\left[e^{-\bar X}\right] &= E\left[e^{-\frac{\sum X_i}{n}}\right] \\ &= \left(E\left[e^{-\frac{X_1}{n}}\right]\right)^n \\ &= \left(e^{\frac{1}{2n^2}} e^{-\frac{\theta}{n}} \right)^n \\ &= e^{\frac{1}{2n}} e^{-\theta} \neq e^{-\theta} \end{align}

The second line uses the fact that the $X_i$ are i.i.d. For the third line, note that the integral $E\left[e^{-\frac{X_1}{n}}\right] = \int_{\mathbb{R}} e^{-\frac{x}{n}} f(x)\,dx$, where $f$ is the pdf of a $N(\theta,1)$ distribution, is just the moment generating function of $X_1$ evaluated at $t = -\frac{1}{n}$. Since $M_{X_1}(t) = e^{\theta t + \frac{t^2}{2}}$ for a $N(\theta,1)$ random variable, we get $E\left[e^{-\frac{X_1}{n}}\right] = e^{-\frac{\theta}{n}} e^{\frac{1}{2n^2}}$. (You can also compute the integral directly by completing the square, or check it with Wolfram Alpha.)
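As a sanity check, the MGF value can be compared against direct numerical integration. This is a minimal sketch; the values of $\theta$ and $n$ are arbitrary choices for illustration.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

theta, n = 1.3, 5  # arbitrary illustrative values

# E[e^{-X_1/n}] by numerical integration against the N(theta, 1) pdf
numeric, _ = quad(lambda x: np.exp(-x / n) * norm.pdf(x, loc=theta, scale=1),
                  -np.inf, np.inf)

# Closed form: the N(theta, 1) MGF at t = -1/n, i.e. e^{theta*t + t^2/2}
closed = np.exp(-theta / n + 1 / (2 * n**2))

print(numeric, closed)  # the two values agree to high precision
```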

As we can see, $e^{-\bar X}$ is indeed a biased estimator of $e^{-\theta}$, although the bias vanishes as $n \rightarrow \infty$. However, it is now easy to construct an unbiased estimator of $e^{-\theta}$: just multiply the original estimator by $e^{-\frac{1}{2n}}$, giving $e^{-\frac{1}{2n}} e^{-\bar X}$, whose expectation is exactly $e^{-\theta}$.