[Math] Finding UMVUE of $p^s$ in Bernoulli distribution

probability, statistics

Suppose that $X_1, \ldots, X_n$ are i.i.d. with the Bernoulli distribution $B(1,p)$,

then what is the UMVUE of $p^s$ and $p^s + (1-p)^{n-s}$?

I suppose I should use the Lehmann–Scheffé theorem. Since $\overline{X}$ is a complete sufficient statistic, I need to find functions of $\overline{X}$ whose expectations are $p^s$ and $p^s + (1-p)^{n-s}$, but I don't know how to find such functions.

Any hint would be welcome!

Best Answer

You have $$\operatorname{E}(X_1\cdots X_s) = p^s$$ if $s$ is an integer and $1\le s\le n,$ and if $1\le n-s\le n$, then $$\operatorname{E}(X_1\cdots X_s + (1-X_{s+1})\cdots(1-X_n)) = p^s + (1-p)^{n-s}.$$
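Not part of the original answer, but here is a quick Monte Carlo sanity check of that unbiasedness claim, assuming NumPy is available; the values `n = 10`, `s = 3`, `p = 0.4` are arbitrary choices for illustration:

```python
# Minimal simulation sketch: check that
# X_1*...*X_s + (1 - X_{s+1})*...*(1 - X_n) is unbiased for p^s + (1-p)^(n-s).
import numpy as np

rng = np.random.default_rng(0)
n, s, p = 10, 3, 0.4          # illustrative values, not from the original post
num_sims = 200_000

X = rng.binomial(1, p, size=(num_sims, n))   # each row is one sample X_1, ..., X_n
first_term = X[:, :s].prod(axis=1)           # X_1 * ... * X_s
second_term = (1 - X[:, s:]).prod(axis=1)    # (1 - X_{s+1}) * ... * (1 - X_n)

print("simulated mean:", (first_term + second_term).mean())
print("target value:  ", p**s + (1 - p)**(n - s))
```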

The estimators that you need are the conditional expected values $$ \operatorname{E}(X_1\cdots X_s\mid \overline X_n) \text{ and } \operatorname{E}((1-X_{s+1})\cdots(1-X_n)\mid \overline{X}_n). $$ Because $\overline X_n$ is sufficient, the conditional expected values above do not depend on $p$ and are therefore observable and can be used as estimators.

\begin{align} & \operatorname{E}\left(X_1\cdots X_s \mid \overline X_n = \frac x n\right) \\ = {} & \Pr\left( X_1\cdots X_s = 1 \mid X_1+\cdots+X_n = x \right) = \frac{\dbinom {n-s}{n-x}}{\dbinom n x} \end{align} (Given $X_1+\cdots+X_n = x$, all $\dbinom n x$ placements of the $x$ successes among the $n$ trials are equally likely, and the first $s$ positions are all successes in exactly $\dbinom{n-s}{x-s} = \dbinom{n-s}{n-x}$ of them.) So the Lehmann–Scheffé theorem says the UMVUE of $p^s$ is $$ \frac{\dbinom{n-s}{n-(X_1+\cdots+X_n)}}{\dbinom n {X_1+\cdots+X_n}}. $$
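As a further sanity check (again, not part of the original answer), one can compute the exact expectation of this estimator by summing it against the $\mathrm{Binomial}(n,p)$ pmf of $X_1+\cdots+X_n$ and compare the result with $p^s$. The helper name `umvue_expectation` and the parameter values below are purely illustrative:

```python
# Exact check: E[ C(n-s, n-x) / C(n, x) ] with x ~ Binomial(n, p) should equal p^s.
from math import comb

def umvue_expectation(n, s, p):
    """Sum the UMVUE over the Binomial(n, p) distribution of the success count x."""
    total = 0.0
    for x in range(n + 1):
        pmf = comb(n, x) * p**x * (1 - p)**(n - x)
        estimate = comb(n - s, n - x) / comb(n, x)   # UMVUE evaluated at x successes
        total += pmf * estimate                       # comb returns 0 when n-x > n-s, i.e. x < s
    return total

n, s, p = 10, 3, 0.4          # illustrative values
print(umvue_expectation(n, s, p))   # ≈ 0.064
print(p**s)                          # 0.4**3 = 0.064
```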