[Math] Finding UMVUE for Poisson distribution using Rao Blackwell

statistical-inference, statistics

Let $X_1,X_2,\ldots,X_n$ be a random sample from a Poisson distribution with parameter $\lambda$. Let $\gamma(\lambda)=P(X\le 1)$.
Find UMVUE for $\gamma(\lambda)$.

This is my attempt:
First I defined an indicator function as $T'=I_{(X_1\le 1)}=\begin{cases}
1, & \text{if $X_1\le1$} \\
0, & \text{otherwise}
\end{cases}$

which is an unbiased estimator of $\gamma(\lambda)$. Also, since the Poisson distribution belongs to the one-parameter exponential family, $T=\sum X_i$ is a complete sufficient statistic for $\lambda$.
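
To verify the unbiasedness claim, note that
$$
E[T'] = P(X_1\le 1) = P(X_1=0)+P(X_1=1) = e^{-\lambda}+\lambda e^{-\lambda} = e^{-\lambda}(1+\lambda) = \gamma(\lambda).
$$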

Then by the Rao–Blackwell theorem (together with the Lehmann–Scheffé theorem, since $T$ is complete),
$E[T'\mid T=t]$ is the UMVUE for $\gamma(\lambda)$.

\begin{align}
\operatorname E[T'\mid T=t] & = 1\cdot P[T'=1\mid T=t]+0\cdot P[T'=0\mid T=t] \\[10pt]
& = \frac{P[X_1\le 1 \text{ and } X_1+ X_2+ X_3+\cdots+ X_n=t]}{P\left[\sum X_i=t\right]}
\end{align}

Since $X_1\le 1$, this forces $X_2+X_3+\cdots+X_n\ge t-1$. Then, since $X_1\le 1$ and $X_2+X_3+\cdots+X_n\ge t-1$ are independent of one another,

\begin{align}
& P[X_1\le 1 \text{ and } X_1+ X_2+ X_3+\cdots+X_n=t] \\[10pt]
= {} & P[X_1\le 1] \cdot P\left[\sum_{i=2}^n X_i\ge t-1\right]
\end{align}

Now I am stuck on computing $P\left[\sum_{i=2}^n X_i\ge t-1\right]$.

$\sum_{i=2}^n X_i$ follows a Poisson distribution with parameter $(n-1)\lambda$. In trying to find $P[\sum_{i=2}^n X_i\ge t-1]$ I ended up with the Poisson cumulative sum
$e^{-(n-1)\lambda}\sum_{i=0}^{y-1}\frac{((n-1)\lambda)^i}{i!}$. Is there a way I can simplify this? Please help me to find a UMVUE for this problem.

Best Answer

To simplify the notation, introduce $Y=X_2+\cdots+X_n$ and note that $X_1$ and $Y$ are independent with Poisson distributions of parameters $\lambda$ and $\mu=(n-1)\lambda$ respectively, and that $T=X_1+Y$ is Poisson with parameter $\lambda+\mu$. For every nonnegative integer $t$, $c(t)=E(T'\mid T=t)$ is
$$ c(t)=P(X_1\leqslant1\mid T=t)=\frac{P(X_1=0,Y=t)+P(X_1=1,Y=t-1)}{P(T=t)}. $$
By independence,
$$ c(t)=\frac{P(X_1=0)P(Y=t)+P(X_1=1)P(Y=t-1)}{P(T=t)}. $$
The Poisson distributions yield
$$ c(t)=\frac{\mathrm e^{-\lambda}\mathrm e^{-\mu}\mu^t/t!+\mathrm e^{-\lambda}\lambda\,\mathbf 1_{t\geqslant1}\,\mathrm e^{-\mu}\mu^{t-1}/(t-1)!}{\mathrm e^{-\lambda-\mu}(\lambda+\mu)^{t}/t!}=\frac{\mu^t+\lambda t\mu^{t-1}}{(\lambda+\mu)^{t}}, $$
and finally,
$$ c(t)=\left(\frac{n-1}n\right)^t\left(1+\frac{t}{n-1}\right). $$
Thus, the UMVUE for $\gamma(\lambda)=P(X_1\leqslant1)=\mathrm e^{-\lambda}(1+\lambda)$ is
$$ \hat\gamma=c(T)=\left(\frac{n-1}n\right)^T\left(1+\frac{T}{n-1}\right). $$

Edit: Since $T$ is Poisson with parameter $n\lambda$,
$$ E(c(T)^2)=\sum_{t=0}^\infty\mathrm e^{-n\lambda}\frac{(n\lambda)^t}{t!}\left(\frac{n-1}n\right)^{2t}\left(1+\frac{t}{n-1}\right)^2=\ldots $$
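
As a numerical sanity check, here is a minimal Monte Carlo sketch in Python (the parameter values `lam`, `n`, and `reps` are arbitrary choices, not from the derivation) comparing the average of $c(T)$ with the target $\mathrm e^{-\lambda}(1+\lambda)$:

```python
# Monte Carlo sanity check of the UMVUE c(T) = ((n-1)/n)^T * (1 + T/(n-1))
# Illustrative sketch only; lam, n, and reps are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 1.3, 5, 200_000

# T = X_1 + ... + X_n is Poisson(n * lam)
T = rng.poisson(n * lam, size=reps)
c_T = ((n - 1) / n) ** T * (1 + T / (n - 1))

print("mean of c(T):", c_T.mean())                 # Monte Carlo estimate of E[c(T)]
print("target gamma:", np.exp(-lam) * (1 + lam))   # P(X <= 1) = e^{-lam}(1 + lam)
```

With these settings the two printed values should agree up to simulation error, reflecting the unbiasedness of $c(T)$.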