[Math] Finding an unbiased estimator of $e^{-2\lambda}$ for Poisson distribution

Tags: poisson-distribution, probability-theory, statistical-inference

If $X_1,X_2,\ldots,X_n\sim \mathrm{Pois}(\lambda)$, find an unbiased estimator of $e^{-2\lambda}$.

I am actually supposed to find the UMVUE of $e^{-2\lambda}$, but first I have to find an unbiased estimator of it. I tried using the MLE of $\lambda$, which is $\hat{\lambda} := \frac{1}{n}\sum_{i=1}^n X_i$, but I'm not sure where to go from there. I know by the invariance property that $e^{-2\hat{\lambda}}$ is the MLE of $e^{-2\lambda}$, but I'm not sure whether it is also unbiased.
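(A quick check with the Poisson moment generating function, assuming the $X_i$ are independent, suggests it is not: $$E\left[e^{-2\hat{\lambda}}\right] = E\left[e^{-\frac{2}{n}\sum_{i=1}^n X_i}\right] = \prod_{i=1}^n E\left[e^{-\frac{2}{n}X_i}\right] = \exp\!\left(n\lambda\left(e^{-2/n}-1\right)\right),$$ and since $e^{-2/n}-1 > -2/n$, this exceeds $e^{-2\lambda}$, so the plug-in MLE appears to be biased upward.)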

Best Answer

I'm going to assume that $X_1, \dots, X_n$ are independent.

I remember seeing this question (or a very similar one) on my Master's qualifying exam, and being rather angry that I couldn't figure out this seemingly simple question.

Let $\mathbf{I}$ be the indicator function with $\mathbf{I}(A) = 1$ if the statement $A$ is true, and $0$ otherwise. Since $X_1, X_2$ are independent, it follows that $Y = X_1 + X_2 \sim \text{Poisson}(2\lambda)$.
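(If that fact isn't at hand, it follows quickly from moment generating functions: $$M_{X_1+X_2}(t) = M_{X_1}(t)\,M_{X_2}(t) = e^{\lambda(e^t-1)}\,e^{\lambda(e^t-1)} = e^{2\lambda(e^t-1)},$$ which is the MGF of a $\text{Poisson}(2\lambda)$ random variable.)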

Hence, the PMF of $Y$ is $$f_{Y}(y) = \dfrac{e^{-2\lambda}(2\lambda)^{y}}{y!}$$ for $y = 0, 1, \dots$.

Observe that $f_{Y}(0) = e^{-2\lambda}$.

Recall that the expected value of an indicator is the probability of the corresponding event. Hence $$E\big[\mathbf{I}(Y = 0)\big] = P(Y = 0) = f_Y(0) = e^{-2\lambda},$$ so $$\mathbf{I}(Y = 0) = \mathbf{I}(X_1 + X_2 = 0)$$ is an unbiased estimator of $e^{-2\lambda}$.
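As a quick numerical sanity check, here is a minimal simulation sketch (arbitrary illustrative values $\lambda = 0.7$ and $200{,}000$ replications, using NumPy):

```python
import numpy as np

# Monte Carlo sanity check of unbiasedness (illustrative values only).
rng = np.random.default_rng(0)
lam, reps = 0.7, 200_000

# Draw X1, X2 ~ Poisson(lambda) independently, `reps` times.
x1 = rng.poisson(lam, size=reps)
x2 = rng.poisson(lam, size=reps)

# The estimator is the indicator I(X1 + X2 = 0).
estimates = (x1 + x2 == 0)

print("mean of I(X1 + X2 = 0):", estimates.mean())   # sample mean of the estimator
print("target e^{-2*lambda}:  ", np.exp(-2 * lam))   # should be close to the line above
```

The empirical mean of the indicator should agree with $e^{-2\lambda}$ up to Monte Carlo error, consistent with the unbiasedness shown above.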