[Math] How to find an MVUE for a certain function of a parameter

estimation-theory, parameter-estimation, statistics

The following is one of the exercises from my course in statistics:

Let $X_1, \ldots, X_n$ be a random sample from a Poisson distribution with parameter $\theta > 0$. Find the MVUE for $q(\theta) = e^{-\theta}(1 + \theta)$.

We didn't do this exercise in class (nor did we get a solution), but it was recommended to do it at home, as its solution required the use of some of the more theoretical results from estimation theory.

This is what I've got so far:

  1. I was able to prove that
    $$
    T(X_1, \ldots, X_n) = \sum_{i=1}^n X_i
    $$
    is a complete and sufficient statistic. Note that, since the $X_i$ are independent and Poisson distributed with parameter $\theta$, $T$ is Poisson distributed with parameter $n\theta$.
  2. I then showed that
    $$
    S(X_1, \ldots, X_n) = \hat{F}_n(1) = \frac{1}{n} \sum_{i=1}^n \mathbf{1}\{X_i \le 1\}
    $$
    is an unbiased estimator for $g(\theta) = P(X_1 = 0) + P(X_1 = 1)$, where $\hat{F}_n$ is the empirical distribution function. (Note that the factor $1/n$ is already built into $\hat{F}_n$, so no further division by $n$ is needed.) Since $P(X_1 = 0) + P(X_1 = 1) = e^{-\theta} + \theta e^{-\theta}$, this $g$ is exactly the target function $q$; the computation is spelled out just after this list.
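
For what it's worth, the unbiasedness claimed in step 2 is a one-line computation:
$$
\operatorname{E}\big[\hat{F}_n(1)\big] = \frac{1}{n}\sum_{i=1}^n P(X_i \le 1) = P(X_1 = 0) + P(X_1 = 1) = e^{-\theta} + \theta e^{-\theta} = q(\theta).
$$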

I'm quite sure we should conclude by using the Rao–Blackwell theorem, the Lehmann–Scheffé theorem, or (most likely) both. The Rao–Blackwell theorem is not very clear to me, so I don't really know how to proceed. As a side note: if anyone knows of a good online source for this theorem, be sure to let me know; the version in my course notes is quite confusing.

Thank you in advance for your help.

Best Answer

You've already observed that $$ q(\theta) = e^{-\theta}(\theta+1) = \Pr( X_1=0 \text{ or } X_1=1). $$ So let $$ Y = \begin{cases} 1 & \text{if }X_1=0\text{ or }X_1=1, \\ 0 & \text{if }X_1\ge2. \end{cases} $$ Then $Y$ is an unbiased estimator of $q(\theta)$. You need to find

$$ \operatorname{E}(Y\mid X_1+\cdots+X_n) = \Pr(Y=1\mid X_1+\cdots+X_n). $$

We have: \begin{align} & \Pr(Y=1\mid X_1+\cdots+X_n = x) \\[10pt] = {} & \Pr(X_1=0\mid X_1+\cdots+X_n=x) + \Pr(X_1=1\mid X_1+\cdots+X_n=x) \\[10pt] = {} & \frac{\Pr(X_1=0\ \&\ \overbrace{X_2+\cdots+X_n}^\text{sum starting at 2}=x)}{\Pr(X_1+\cdots+X_n=x)} + \frac{\Pr(X_1=1\ \&\ \overbrace{X_2+\cdots+X_n}^\text{sum starting at 2}=x-1)}{\Pr(X_1+\cdots+X_n=x)} \\[10pt] = {} & \cdots \end{align} Each numerator factors into a product of two probabilities, because $X_1$ is independent of $X_2+\cdots+X_n$. Can you do the rest? All of the $\theta$s should cancel out.
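
In case you want to check your answer against something, here is one way the computation can be finished (a sketch using only facts already stated: $X_1 \sim \operatorname{Poisson}(\theta)$, $X_2+\cdots+X_n \sim \operatorname{Poisson}((n-1)\theta)$ independent of $X_1$, and $T = X_1+\cdots+X_n \sim \operatorname{Poisson}(n\theta)$; assume $n \ge 2$): \begin{align} \Pr(X_1=0\mid T=x) & = \frac{e^{-\theta}\cdot e^{-(n-1)\theta}\dfrac{((n-1)\theta)^x}{x!}}{e^{-n\theta}\dfrac{(n\theta)^x}{x!}} = \left(\frac{n-1}{n}\right)^x, \\[10pt] \Pr(X_1=1\mid T=x) & = \frac{\theta e^{-\theta}\cdot e^{-(n-1)\theta}\dfrac{((n-1)\theta)^{x-1}}{(x-1)!}}{e^{-n\theta}\dfrac{(n\theta)^x}{x!}} = \frac{x}{n-1}\left(\frac{n-1}{n}\right)^x, \end{align} where the second line is for $x \ge 1$ (for $x = 0$ the probability is $0$, which the final formula also gives). Summing the two terms, the conditioned estimator is
$$
\operatorname{E}(Y\mid T) = \left(1 + \frac{T}{n-1}\right)\left(\frac{n-1}{n}\right)^{T},
$$
and indeed every $\theta$ has cancelled. (Equivalently: given $T = x$, $X_1$ is $\operatorname{Binomial}(x, 1/n)$, and the two probabilities above are the binomial masses at $0$ and $1$.)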

Rao–Blackwell proves the resulting estimator is no worse than the crude estimator $Y$ (in fact it's immensely better) and Lehmann–Scheffé, which is the part for which the hypothesis of completeness is needed, proves it's actually the MVUE.
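
If you'd like a numerical sanity check, here is a small Monte Carlo sketch (not part of the original argument; it assumes the closed form $\left(1 + \frac{T}{n-1}\right)\left(\frac{n-1}{n}\right)^{T}$ derived above) that compares the average of the conditioned estimator with $q(\theta)$:

```python
import numpy as np

rng = np.random.default_rng(0)

n, theta = 5, 1.3    # sample size and Poisson parameter (arbitrary choices)
reps = 200_000       # number of simulated samples

# Draw `reps` samples of size n and form T = X_1 + ... + X_n for each.
T = rng.poisson(theta, size=(reps, n)).sum(axis=1)

# Candidate MVUE from the derivation: (1 + T/(n-1)) * ((n-1)/n)**T.
estimates = (1 + T / (n - 1)) * ((n - 1) / n) ** T

target = np.exp(-theta) * (1 + theta)  # q(theta) = e^{-theta}(1 + theta)
print(f"target q(theta)   = {target:.5f}")
print(f"mean of estimates = {estimates.mean():.5f}")  # agrees up to Monte Carlo error
```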
