Find UMVU estimator for $e^{-3 \theta}$ given a complete sufficient statistic $X \sim Pois(\theta)$ with $\theta>0$.

Tags: parameter-estimation, probability, probability-theory, statistical-inference, statistics

My attempt: We know, since $X\sim \text{Pois}(\theta)$, that $\mathbb{P}_{\theta}(X=x)=e^{-\theta}\theta^{x}/x!$. A given hint is that we should recall $e^{x}=\sum^{\infty}_{k=0}\frac{x^{k}}{k!}$. I know that once we find an unbiased estimator that is a function of our complete sufficient statistic $X$, that estimator is automatically UMVU (by the Lehmann–Scheffé theorem). However, I'm not sure how to even find an expression for an unbiased estimator in the first place.

Question: How to approach/solve this exercise?

Thanks!

Best Answer

We need to find a function $g$ such that $\mathbb E_\theta[g(X)]=e^{-3\theta}$ for all $\theta>0$. Write out this expectation: $$ \mathbb E_\theta[g(X)]=\sum_{k=0}^\infty g(k)\dfrac{\theta^k}{k!}e^{-\theta}=e^{-3\theta}. $$ Multiply both sides by $e^\theta$ to get $$ \sum_{k=0}^\infty g(k)\dfrac{\theta^k}{k!}=e^{-2\theta} = \sum_{k=0}^\infty \frac{(-2)^k\theta^k}{k!}, $$ where the last equality is the hint $e^x=\sum_{k=0}^\infty x^k/k!$ applied with $x=-2\theta$. Since these two power series agree for every $\theta>0$, their coefficients must match, so $g(k)=(-2)^k$. Hence $$g(X)=(-2)^X$$ is the unique unbiased estimator of $e^{-3\theta}$ that is a function of $X$, and by the Lehmann–Scheffé theorem it is the UMVU estimator.
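If you want to convince yourself numerically, here is a minimal Monte Carlo sanity check (a sketch, assuming NumPy; the values of $\theta$ and the sample size are arbitrary choices for illustration). It simulates Poisson draws and compares the sample mean of $(-2)^X$ with the target $e^{-3\theta}$:

```python
import numpy as np

# Sanity check: under X ~ Poisson(theta), the estimator g(X) = (-2)^X
# should have expectation e^{-3*theta}.
rng = np.random.default_rng(0)

for theta in [0.1, 0.5, 1.0]:            # illustrative values of theta (assumption)
    x = rng.poisson(theta, size=1_000_000)
    estimate = np.mean((-2.0) ** x)       # sample mean of g(X)
    target = np.exp(-3 * theta)           # quantity being estimated
    print(f"theta={theta}: mean of (-2)^X = {estimate:.4f}, e^(-3*theta) = {target:.4f}")
```

Small values of $\theta$ are used on purpose: since $\mathbb E_\theta[4^X]=e^{3\theta}$, the variance of $(-2)^X$ grows quickly with $\theta$, so the Monte Carlo average converges slowly for large $\theta$ even though the estimator remains unbiased.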