[Math] How to find UMVUE of $\theta^k$ when $x_1, \ldots, x_n$ is a sample from Bernoulli$(\theta)$

parameter-estimation, probability, probability-distributions, statistical-inference, statistics

Let $x_1, x_2, \ldots, x_n$ be a random sample from the Bernoulli$(\theta)$ distribution.

The question is to find the UMVUE of $\theta^k$.

I know that $\sum_{i=1}^{n}x_i$ is a complete sufficient statistic for $\theta$.

Is $\left(\frac{\sum_{i=1}^{n}x_i}{n}\right)^k$ the UMVUE, or is some other estimator needed?
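A quick simulation sketch I tried (my own check in Python, with arbitrarily chosen $n$, $k$ and $\theta$) suggests this plug-in estimator is biased upward for small $n$, so I doubt it can be the UMVUE:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, theta = 10, 3, 0.4           # arbitrary illustrative values
reps = 200_000

# each row is one sample of n Bernoulli(theta) observations
x = rng.binomial(1, theta, size=(reps, n))
plug_in = x.mean(axis=1) ** k      # (sample mean)^k for every replication

print("theta^k            :", theta ** k)        # 0.064
print("E[(sample mean)^k] :", plug_in.mean())    # noticeably larger (Jensen's inequality)
```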

Could someone help me with this?

Best Answer

Write $S=\sum_{i=1}^{n}X_i$, and write $m$ for the exponent ($k$ in the question), with $m\leq n$. Note that

$$\theta^m=P\{X_1=1,X_2=1,\ldots,X_m=1\}.$$

An unbiased estimator for $\theta^m$ is

$$T= \begin{cases} 1, & \text{if } X_1=X_2=\cdots=X_m=1 \\ 0, & \text{otherwise} \end{cases}$$
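Indeed, $T$ is unbiased because, by the i.i.d. Bernoulli assumption,

$$E[T]=P\{X_1=1,X_2=1,\ldots,X_m=1\}=\theta^m.$$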

Conditioning $T$ on the complete sufficient statistic $S$ gives

$$\begin{align} E[T\mid S=s] & = P\{X_1=1,X_2=1,\ldots,X_m=1\mid S=s\}=\frac{P\{X_1=1,X_2=1,\ldots,X_m=1,\,S=s\}}{P\{S=s\}} \\ & = \begin{cases} 0, & \text{if } m>s \\[4pt] \dfrac{\theta^m\binom{n-m}{s-m}\theta^{s-m}(1-\theta)^{n-s}}{\binom{n}{s}\theta^s(1-\theta)^{n-s}}, & \text{if } m\leq s \end{cases} \end{align}$$
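(The numerator follows because, on the event $\{X_1=\cdots=X_m=1\}$, the remaining observations must sum to $s-m$, and $X_{m+1}+\cdots+X_n\sim\mathrm{Bin}(n-m,\theta)$ independently of $X_1,\ldots,X_m$.)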

By the Lehmann–Scheffé theorem, the UMVUE of $\theta^m$ is this conditional expectation; simplifying (the powers of $\theta$ and $(1-\theta)$ cancel),

$$E[T\mid S=s]=\begin{cases} 0, & \text{if } m>s \\[4pt] \dfrac{\binom{n-m}{s-m}}{\binom{n}{s}}=\dfrac{s!\,(n-m)!}{n!\,(s-m)!}, & \text{if } m\leq s \end{cases}$$
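A minimal Monte Carlo sketch of the result (my own check, not part of the original argument; $n$, $m$, $\theta$ are arbitrary, and the equivalent form $\frac{s!\,(n-m)!}{n!\,(s-m)!}=\binom{s}{m}\big/\binom{n}{m}$ is used for the computation):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
n, m, theta = 10, 3, 0.4            # arbitrary illustrative values
reps = 200_000

# S = X_1 + ... + X_n has a Binomial(n, theta) distribution
s = rng.binomial(n, theta, size=reps)

def umvue(s_val, n, m):
    """E[T | S = s]: 0 if s < m, else s!(n-m)!/(n!(s-m)!) = C(s,m)/C(n,m)."""
    return 0.0 if s_val < m else comb(s_val, m) / comb(n, m)

estimates = np.array([umvue(v, n, m) for v in s])

print("theta^m         :", theta ** m)         # 0.064
print("mean of UMVUE   :", estimates.mean())   # agrees up to Monte Carlo error
```

Both printed values should agree (here around $0.064$) up to simulation noise, consistent with the unbiasedness guaranteed by Lehmann–Scheffé.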