To close this one:

The density is recognized as a Gamma distribution with shape parameter $k=3$ and unknown scale parameter $\theta$, so we have $E(X) = 3\theta$ and $\operatorname {Var}(X) = 3\theta^2$.
Given an i.i.d. sample, the MLE is $\hat \theta_{MLE} = \bar X / 3$, and its expected value and variance are then

$$E(\hat \theta_{MLE}) = \theta,\;\;\operatorname {Var}(\hat \theta_{MLE}) = \frac 1 {3n}\theta^2$$

For consistency, we can use the sufficient conditions: $\lim_{n\rightarrow \infty}E(\hat \theta_{MLE})=\theta$ holds, and $\lim_{n\rightarrow \infty}\operatorname {Var}(\hat \theta_{MLE}) = \lim_{n\rightarrow \infty}\frac 1 {3n}\theta^2=0$ holds too, so the MLE is consistent.

The Central Limit Theorem holds, and so

$$\sqrt n(\hat \theta_{MLE}-\theta) \rightarrow_d \mathcal N(0, \frac 13\theta^2)$$
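These moment formulas are easy to check by simulation; here is a quick sketch (not part of the original answer, with parameter values of my own choosing) that compares the empirical mean and variance of $\hat \theta_{MLE} = \bar X/3$ to $\theta$ and $\theta^2/(3n)$:

```python
import numpy as np

# Simulation sketch: for Gamma(k=3, theta), the MLE of the scale is
# theta_hat = (sample mean) / 3. Check E(theta_hat) ~ theta and
# Var(theta_hat) ~ theta^2 / (3n).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 200, 20_000

samples = rng.gamma(shape=3.0, scale=theta, size=(reps, n))
mle = samples.mean(axis=1) / 3.0

print(mle.mean())  # should be close to theta = 2.0
print(mle.var())   # should be close to theta**2 / (3 * n) ~ 0.0067
```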

The Fisher Information is
$$\mathcal I(\theta) = -E\left[\frac 3 {\theta^2}-2\frac X {\theta^3}\right] = -\frac 3 {\theta^2} + 2\frac {E(X)}{\theta^3} = -\frac 3 {\theta^2} + 2\frac {3\theta}{\theta^3} = \frac 3{\theta^2}$$

and so the MLE, with variance $\frac{\theta^2}{3n} = \frac 1{n\,\mathcal I(\theta)}$, achieves the Cramér-Rao bound.
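As a sanity check (my addition, not in the original answer), the Fisher information can be verified symbolically. I am assuming the density $f(x)=x^2 e^{-x/\theta}/(2\theta^3)$, i.e. the Gamma with shape $k=3$:

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

# Assumed Gamma(k=3, theta) density: f(x) = x^2 * exp(-x/theta) / (2*theta^3)
f = x**2 * sp.exp(-x / theta) / (2 * theta**3)

# Second derivative of the log-density with respect to theta
d2 = sp.diff(sp.log(f), theta, 2)

# Fisher information: I(theta) = -E[d2 log f]
fisher = sp.simplify(-sp.integrate(d2 * f, (x, 0, sp.oo)))
print(fisher)  # expected: 3/theta**2
```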

As for the question about minimum variance estimator, @cardinal's answer here

https://math.stackexchange.com/questions/28779/minimum-variance-unbiased-estimator-for-scale-parameter-of-a-certain-gamma-distr

gives a complete answer, deriving the minimum-variance unbiased estimator from a complete and sufficient statistic.

Let's settle this. For a Normal distribution with zero mean and variance $\sigma^2$, the complete sufficient statistic is indeed given by the sum of squares.

By the Rao-Blackwell theorem, if we can find an unbiased function of this sufficient statistic, then we have an MVUE. Furthermore, since the statistic is complete, by the Lehmann-Scheffé theorem it is also the unique MVUE for $\sigma^2$. So let's find an unbiased function of $\displaystyle{\sum_{i=1}^n X_i ^2}$ for $X\sim N\left (0,\sigma^2 \right)$.

Recall that for any distribution for which these moments exist

$$E(X^2)=\sigma^2+\mu^2$$

which, since $\mu = 0$ here, implies that for this random sample

$$E \left( \sum_{i=1}^n X_i^2 \right)=n\sigma^2$$

from which it immediately follows that the unbiased estimator in question is

$$\widehat{\sigma^2}=\frac{1}{n} \sum_{i=1}^n X_i^2$$
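A small simulation sketch (my own addition, with arbitrary parameter choices) confirming the unbiasedness of this estimator:

```python
import numpy as np

# With the mean known to be zero, sigma2_hat = (1/n) * sum(X_i^2)
# should be unbiased for sigma^2.
rng = np.random.default_rng(1)
sigma2, n, reps = 3.0, 50, 40_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
est = (x ** 2).mean(axis=1)

print(est.mean())  # should be close to sigma2 = 3.0
```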

Now, let's find the variance of this estimator. We know that for $X\sim N\left (0,\sigma^2 \right)$,

$$\frac{\sum_{i=1}^n X_i^2}{\sigma^2}\sim \chi^2(n)$$

Be very mindful of the degrees of freedom of the $\chi^2$ distribution. What would happen if we didn't know the mean and used an estimate instead? We would lose a degree of freedom: $\sum_{i=1}^n (X_i - \bar X)^2 / \sigma^2 \sim \chi^2(n-1)$. By the properties of the $\chi^2$ distribution, then,

$$\operatorname{Var}\left(\frac{\sum_{i=1}^n X_i^2}{\sigma^2} \right)=2n$$

and so $\operatorname{Var}\left( \sum_{i=1}^n X_i^2 \right)=2n\sigma^4$. Thus for our modified estimator $\operatorname{Var} \left( \frac{1}{n} \sum_{i=1}^n X_i^2\right)=\frac{2\sigma^4}{n}$, which equals the Cramér-Rao bound. This should be comforting, right?
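The variance claim can be checked the same way. A sketch with parameters of my choosing, comparing the simulated variance of the estimator to $2\sigma^4/n$:

```python
import numpy as np

# Simulated variance of (1/n) * sum(X_i^2) versus the Cramer-Rao
# bound 2 * sigma^4 / n (mean known to be zero).
rng = np.random.default_rng(2)
sigma2, n, reps = 2.0, 25, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
est = (x ** 2).mean(axis=1)

cr_bound = 2 * sigma2 ** 2 / n  # = 0.32 for these values
print(est.var())                # should be close to cr_bound
```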

As a final remark, I would like to point out that the Cramér-Rao bound is only attainable if the mean of the normal distribution is known, as is the case here. Had that not been so, we would have to settle for an estimator that does not achieve the lower bound of variance.

Hope this clears it up a bit.

## Best Answer

Are you aware of the three regularity conditions that must be satisfied for the CR lower bound to apply? It looks like your distribution violates the condition that the support of the density must not depend on the quantity being estimated: $\theta$ determines the bounds of the support. See the Wikipedia article, regularity condition 1: http://en.wikipedia.org/wiki/Cram%C3%A9r%E2%80%93Rao_bound