[Math] Rao-Blackwell Theorem – Best estimator (Statistics)

statistics

Let $X_1, \dots, X_n$ be a sequence of independent random variables, each following a Bernoulli distribution with parameter $\theta$, and let $S_n = \sum_{i=1}^n X_i$.
We know an unbiased estimator of the variance for the Bernoulli distribution:
$$\frac{1}{2}(X_1 - X_2)^2$$

Using the Rao-Blackwell theorem, find the best (Rao-Blackwell-improved) estimator:

$$Z = E_\theta\!\left(\tfrac{1}{2}(X_1 - X_2)^2 \,\middle|\, S_n\right)$$

Sorry for the poor translation of the problem; the original text is in French. Does anyone have any idea how to do this? So far neither my book nor the internet has been very helpful…
Thanks in advance

Best Answer

It suffices to find an unbiased estimator that is a function of a complete, sufficient statistic (this is the Lehmann-Scheffé theorem). The Rao-Blackwell theorem can often be used to guess such estimators: you take some known unbiased estimator and condition it on the sufficient statistic, which gives a "better" (lower-variance) estimator that is also a function of the sufficient statistic. Many times, but not always, the new estimator is the UMVUE ("best estimator"). I believe this is the case here, though I have not checked the details. To complete the proof, it suffices to check that the new estimator is unbiased, so that Lehmann-Scheffé applies.
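For intuition, here is a sketch of what the conditioning produces in this model. Note that $\tfrac{1}{2}(X_1 - X_2)^2$ equals $\tfrac{1}{2}$ exactly when $X_1 \neq X_2$, and given $S_n = s$ every arrangement of the $s$ successes among the $n$ positions is equally likely, so $P(X_1 \neq X_2 \mid S_n = s) = \frac{2s(n-s)}{n(n-1)}$ and hence $Z = \frac{S_n(n - S_n)}{n(n-1)}$. The quick simulation below checks both unbiasedness and the variance reduction; the values $\theta = 0.3$ and $n = 10$ are arbitrary choices for illustration:

```python
import random
from statistics import mean, pvariance

# Compare the crude unbiased estimator (1/2)(X1 - X2)^2 of theta(1 - theta)
# with its Rao-Blackwellized version Z = S_n (n - S_n) / (n (n - 1)),
# the result of computing E[(1/2)(X1 - X2)^2 | S_n] in the Bernoulli model.

random.seed(0)
theta, n, reps = 0.3, 10, 100_000
target = theta * (1 - theta)  # the estimand: Var(X_i) = theta(1 - theta)

crude, rb = [], []
for _ in range(reps):
    x = [1 if random.random() < theta else 0 for _ in range(n)]
    s = sum(x)
    crude.append(0.5 * (x[0] - x[1]) ** 2)       # uses only two observations
    rb.append(s * (n - s) / (n * (n - 1)))       # uses the whole sample via S_n

print(f"target theta(1-theta): {target:.4f}")
print(f"crude: mean={mean(crude):.4f}  var={pvariance(crude):.4f}")
print(f"RB:    mean={mean(rb):.4f}  var={pvariance(rb):.4f}")
```

Both sample means land near $\theta(1-\theta) = 0.21$, while the Rao-Blackwellized estimator shows a much smaller variance, as the theorem guarantees.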
