[Math] UMVU estimator for the variance of Bernoulli random variables

Tags: parameter-estimation, variance

Given $X_1, \dots, X_n$ i.i.d. $\text{Bernoulli}(p)$, is there a UMVUE for $\text{Var}(X_i) = p(1-p)$?

The procedure I wanted to follow:

  1. Compute the Cramér-Rao lower bound for $\text{Var}(X_i)$. (But how do I differentiate with respect to variance? See the sketch after this list.)

  2. Does the unbiased sample variance $s^2$ attain it? If so, we're done.

  3. If not, Rao-Blackwellize $s^2$ with a complete sufficient statistic (possibly the sum – but how to show that it's complete?)
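For step 1, a sketch of the standard computation (assuming we parametrize by $p$ rather than by the variance itself, which sidesteps the differentiation question): for unbiased estimators of $g(p) = p(1-p)$, the bound is $[g'(p)]^2 / I_n(p)$, where $I_n(p)$ is the Fisher information of the sample, \begin{align*} I_n(p) = \frac{n}{p(1-p)}, \qquad \frac{[g'(p)]^2}{I_n(p)} = \frac{(1-2p)^2\, p(1-p)}{n}. \end{align*} At $p = \tfrac{1}{2}$ the bound is $0$, while any unbiased estimator of $p(1-p)$ must be non-constant and hence has positive variance there, so no unbiased estimator attains the bound for all $p$ and step 3 is genuinely needed.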

I would really appreciate some help.

Best Answer

For this setting, $T = X_1 + \dots + X_n$ is a complete sufficient statistic. Hence, by the Lehmann–Scheffé theorem, if we can find a function of $T$ whose expectation is $p(1-p)$, it is the UMVUE.
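To see why $T$ is complete (the question raised in step 3 of the post), note that $T \sim \text{Binomial}(n,p)$; the following is a sketch of the usual polynomial argument. If $E_p[g(T)] = 0$ for all $p \in (0,1)$, then \begin{align*} 0 = \sum_{k=0}^{n} g(k) \binom{n}{k} p^{k} (1-p)^{n-k} = (1-p)^{n} \sum_{k=0}^{n} g(k) \binom{n}{k} \theta^{k}, \qquad \theta = \frac{p}{1-p}, \end{align*} and a polynomial that vanishes for every $\theta \in (0,\infty)$ has all coefficients zero, so $g(k) = 0$ for $k = 0, \dots, n$. Sufficiency follows from the factorization theorem, since the joint pmf $p^{T}(1-p)^{n-T}$ depends on the data only through $T$.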

For any distribution, the sample variance $\frac{1}{n-1}\sum (X_i - \bar{X})^2$ is an unbiased estimator of the variance. Since $X^2 = X$ for Bernoulli random variables and $\bar{X} = T/n$, \begin{align*} \frac{1}{n-1}\sum (X_i - \bar{X})^2 &= \frac{1}{n-1} \left(\sum X_i^2 - n \bar{X}^2 \right) \\ &= \frac{1}{n-1} \left(\sum X_i - n \bar{X}^2 \right) \\ &= \frac{1}{n-1} \left(T - \frac{T^2}{n} \right) \\ &= \frac{T(n-T)}{n(n-1)}. \end{align*}

Hence $\frac{T(n-T)}{n(n-1)}$ is a function of $T$ that is unbiased for $p(1-p)$, and so it is the UMVUE for the variance.
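As a quick numerical sanity check (a minimal sketch; the values of `n`, `p`, and the replication count below are arbitrary illustration choices), one can simulate $T \sim \text{Binomial}(n,p)$ directly and confirm that the estimator averages to $p(1-p)$ while its variance stays above the Cramér-Rao bound from step 1:

```python
import numpy as np

# Arbitrary illustration choices: sample size n, success probability p,
# and the number of Monte Carlo replications.
rng = np.random.default_rng(0)
n, p, reps = 20, 0.3, 200_000

# T = X_1 + ... + X_n for a Bernoulli(p) sample has a Binomial(n, p) law,
# so we can draw T directly instead of simulating individual X_i.
T = rng.binomial(n, p, size=reps)

# The UMVUE derived above: T(n - T) / (n(n - 1)).
estimates = T * (n - T) / (n * (n - 1))

print("mean of estimator:    ", estimates.mean())  # should be close to 0.21
print("true variance p(1-p): ", p * (1 - p))       # 0.21
print("variance of estimator:", estimates.var())
print("Cramér-Rao bound:     ", (1 - 2 * p) ** 2 * p * (1 - p) / n)
```

The gap between the last two numbers illustrates why step 2 of the original plan fails: the UMVUE has the smallest variance among unbiased estimators, but it does not attain the Cramér-Rao bound.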
