Estimation – Cramer-Rao Lower Bound: A Comprehensive Study on Point Estimation and Inference

Tags: estimation, inference, point-estimation, self-study

Let $X_1,\ldots,X_n$ be an iid sample from $N(0,\sigma^2)$. Find an unbiased
estimator of $\sigma^2$ and the Cramer-Rao lower bound for its variance.

I found that $$\hat{\sigma}^2 = \sum_{i=1}^{n} X_i^2$$ is an unbiased estimator for $\sigma^2$ and found that its Cramer-Rao Lower Bound is $\frac{1}{n^2}$. Additionally, I think that $\hat{\sigma}^2$ is a complete and sufficient statistic for $\sigma^2$ and hence a UMVUE.

Nonetheless, $$V(\sum_{i=1}^{n} X_i^2)=\sum_{i=1}^{n} V(X_i^2)=n V(X^2)=2n\sigma^4.$$ If the estimator is UMVUE, shouldn't its variance achieve the Cramer-Rao Lower Bound?

Best Answer

Let's settle this. For a Normal distribution with zero mean and variance $\sigma^2$, the complete sufficient statistic is indeed given by the sum of squares.

By the Rao-Blackwell theorem, if we can find an unbiased function of this sufficient statistic, then we have an MVUE. Furthermore, since this statistic is complete, by the Lehmann-Scheffe theorem it is also the unique MVUE for $\sigma^2$. So let's find an unbiased function of $\displaystyle{\sum_{i=1}^n X_i ^2}$ for $X\sim N\left (0,\sigma^2 \right)$.

Recall that for any distribution for which these moments exist

$$E(X^2)=\sigma^2+\mu^2$$

which, since $\mu=0$ here, implies that for this random sample

$$E \left( \sum_{i=1}^n X_i^2 \right)=n\sigma^2$$

from which it immediately follows that the unbiased estimator in question is

$$\widehat{\sigma^2}=\frac{1}{n} \sum_{i=1}^n X_i^2$$
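As a quick sanity check (not part of the original argument), a Monte Carlo simulation with an assumed true variance confirms the unbiasedness of $\frac{1}{n}\sum X_i^2$:

```python
import random
import math

# Simulation sketch: average the estimator (1/n) * sum(X_i^2) over many
# samples drawn from N(0, sigma2); the average should be close to sigma2.
random.seed(0)
sigma2 = 4.0      # true variance, chosen arbitrarily for the simulation
n = 20            # sample size
reps = 100_000    # number of simulated samples

total = 0.0
for _ in range(reps):
    sample = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(n)]
    total += sum(x * x for x in sample) / n   # the estimator on this sample

mean_estimate = total / reps
print(mean_estimate)   # close to sigma2 = 4.0
```

The Monte Carlo average settles near the true variance, as the expectation calculation above predicts.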

Now, let's find the variance of this estimator. We know that for $X\sim N\left (0,\sigma^2 \right)$,

$$\frac{\sum_{i=1}^n X_i^2}{\sigma^2}\sim \chi^2(n)$$

Be very mindful of the degrees of freedom of the $\chi^2$ distribution: if we did not know the mean and used $\bar{X}$ as an estimate instead, we would lose a degree of freedom and get a $\chi^2(n-1)$ distribution. By the properties of the $\chi^2$ distribution then,

$$var\left(\frac{\sum_{i=1}^n X_i^2}{\sigma^2} \right)=2n$$

and so $var\left( \sum_{i=1}^n X_i^2 \right)=2n\sigma^4$. Thus for our modified estimator $var \left( \frac{1}{n} \sum_{i=1}^n X_i^2\right)=\frac{2\sigma^4}{n}$ which equals the Cramer-Rao bound. This should be comforting, right?
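For completeness, here is a short sketch of where the bound itself comes from (a standard Fisher-information calculation, parametrizing by $\theta=\sigma^2$). The log-likelihood of a single observation is

$$\ell(\theta;x)=-\frac{1}{2}\ln(2\pi\theta)-\frac{x^2}{2\theta}, \qquad \frac{\partial^2 \ell}{\partial \theta^2}=\frac{1}{2\theta^2}-\frac{x^2}{\theta^3}$$

so, using $E(X^2)=\theta$, the Fisher information per observation is

$$I_1(\theta)=-E\left[\frac{\partial^2 \ell}{\partial \theta^2}\right]=-\frac{1}{2\theta^2}+\frac{\theta}{\theta^3}=\frac{1}{2\theta^2}$$

For $n$ iid observations $I_n(\sigma^2)=\frac{n}{2\sigma^4}$, and the Cramer-Rao bound is $\frac{1}{I_n(\sigma^2)}=\frac{2\sigma^4}{n}$, matching the variance computed above.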

As a final remark, I would like to point out that the Cramer-Rao bound is attainable here only because the mean of the normal distribution is known. If the mean had been unknown, we would have had to settle for the sample variance $S^2$, whose variance $\frac{2\sigma^4}{n-1}$ does not achieve the lower bound.
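This final point can also be checked numerically (a simulation sketch with an assumed true variance, not part of the original answer): the known-mean estimator $\frac{1}{n}\sum X_i^2$ attains the bound $\frac{2\sigma^4}{n}$, while the sample variance $S^2=\frac{1}{n-1}\sum (X_i-\bar{X})^2$ has the larger variance $\frac{2\sigma^4}{n-1}$.

```python
import random
import math

# Compare the variance of the known-mean estimator (1/n) * sum(X_i^2)
# with that of the sample variance S^2, against the CRLB 2*sigma^4/n.
random.seed(1)
sigma2 = 4.0      # true variance, chosen arbitrarily for the simulation
n = 20            # sample size
reps = 100_000    # number of simulated samples

known, unknown = [], []
for _ in range(reps):
    xs = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(n)]
    known.append(sum(x * x for x in xs) / n)          # mean known to be 0
    xbar = sum(xs) / n
    unknown.append(sum((x - xbar) ** 2 for x in xs) / (n - 1))  # S^2

def var(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

crlb = 2 * sigma2 ** 2 / n   # = 1.6 for these settings
print(var(known))    # close to 2*sigma^4/n = 1.6 (attains the bound)
print(var(unknown))  # close to 2*sigma^4/(n-1), strictly above the bound
```

The known-mean estimator's variance hugs the bound, while $S^2$ sits visibly above it, exactly as the remark above says.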

Hope this clears it up a bit.