Solved – UMVUE – geometric distribution where $X$ is the number of failures preceding the first success

estimation, geometric-distribution, negative-binomial-distribution, self-study, umvue

$X_1, \dots, X_n$ are iid geometric: $P(X=x) = (1-p)^{x}p$, $x=0,1,2, \dots$

My Attempt:

$T=\sum_{i=1}^n X_i$ is a complete sufficient statistic for $p$
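Sufficiency follows from the factorization theorem, since the joint pmf depends on the data only through $t=\sum_i x_i$:

$$\prod_{i=1}^n (1-p)^{x_i}p \;=\; p^n(1-p)^{\sum_{i=1}^n x_i} \;=\; \underbrace{p^n(1-p)^{t}}_{g(t,\,p)}\cdot \underbrace{1}_{h(x)}.$$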

$W= \begin{cases}1 & X_1= 0,\\ 0 & X_1\neq 0\end{cases}$
$W$ is an unbiased estimator of $p$, since $E[W]=P(X_1=0)=p$.
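As a quick sanity check (a simulation sketch, not part of the proof; the function names and parameter values below are my own), the sample mean of $W$ should land close to $p$:

```python
import random

def geometric_failures(p, rng):
    # Draw X = number of failures before the first success.
    x = 0
    while rng.random() >= p:
        x += 1
    return x

def mc_mean_W(p, trials=200_000, seed=1):
    # Monte Carlo estimate of E[W] = P(X_1 = 0), which should be close to p.
    rng = random.Random(seed)
    return sum(geometric_failures(p, rng) == 0 for _ in range(trials)) / trials
```

For example, `mc_mean_W(0.3)` should be within Monte Carlo error of $0.3$.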

To find UMVUE,
\begin{align}
E[W\mid T=t] &= \frac{P(X_1 = 0, T=t)}{P(T=t)}\\[5pt]
&= \frac{P(X_1 = 0)\,P(X_2+\cdots +X_n=t)}{P(T=t)}
\end{align}

Can somebody please help me expand this step? I find it confusing whether I should use $t-1$ or $t-2$ in the binomial coefficient of the negative binomial pmf.

Best Answer

Since $X_j$ is the number of failures preceding the first success for each $j$, $T=\sum\limits_{j=1}^n X_j$ is the number of failures before the $n$th success. Therefore pmf of $T$ is

$$P(T=t)=\binom{n+t-1}{t}p^n(1-p)^{t}\,\mathbf1_{t\in\{0,1,2,\ldots\}}$$
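This pmf can be verified exactly (no simulation needed) by convolving the geometric pmf $n$ times and comparing with the formula; the function names and the truncation point `tmax` below are my own:

```python
from math import comb

def negbin_pmf(t, n, p):
    # P(T = t): probability of t failures before the n-th success.
    return comb(n + t - 1, t) * p**n * (1 - p)**t

def pmf_of_sum(n, p, tmax):
    # Exact pmf of X_1 + ... + X_n on {0, ..., tmax},
    # built by repeated convolution of the geometric pmf.
    geo = [(1 - p)**x * p for x in range(tmax + 1)]
    pmf = [1.0] + [0.0] * tmax  # empty sum: point mass at 0
    for _ in range(n):
        pmf = [sum(pmf[k] * geo[t - k] for k in range(t + 1))
               for t in range(tmax + 1)]
    return pmf
```

The two agree up to floating-point error for every $t \le$ `tmax`.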

Now,

\begin{align} E\left[W\mid T=t\right]&=\frac{P\left[X_1=0,\ \sum\limits_{i=2}^n X_i=t\right]}{P(T=t)} \\&=\frac{P(X_1=0)\,P\left[\sum\limits_{j=1}^{n-1} X_j=t\right]}{P(T=t)} \qquad (\text{with } j=i-1) \end{align}

So the '$t$' remains as it is; it is a matter of '$n$' and '$n-1$' in the pmf of negative binomial.
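For completeness, carrying the computation through (writing the success probability as $p$, as in the question): $\sum_{j=1}^{n-1} X_j$ is negative binomial with $n-1$ in place of $n$, so

\begin{align}
E[W\mid T=t] &= \frac{p\cdot\binom{n+t-2}{t}p^{\,n-1}(1-p)^{t}}{\binom{n+t-1}{t}p^{\,n}(1-p)^{t}}\\[5pt]
&= \frac{\binom{n+t-2}{t}}{\binom{n+t-1}{t}} = \frac{n-1}{n+t-1},
\end{align}

and since $T$ is a complete sufficient statistic, Lehmann–Scheffé gives the UMVUE of $p$ as $\dfrac{n-1}{n+T-1}$.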