To close this one:
The density is recognized as a Gamma distribution with shape parameter $k=3$ and unknown scale parameter $\theta$, so we have $E(X) = 3\theta$ and $\operatorname {Var}(X) = 3\theta^2$.
Given an i.i.d. sample, the MLE is $\hat \theta_{MLE} = \bar X/3$, so its expected value and variance are
$$E(\hat \theta_{MLE}) = \theta,\;\;\operatorname {Var}(\hat \theta_{MLE}) = \frac 1 {3n}\theta^2$$
For consistency, we can use the standard sufficient conditions: $\lim_{n\rightarrow\infty}E(\hat \theta_{MLE})=\theta$ holds, and $\lim_{n\rightarrow\infty}\operatorname {Var}(\hat \theta_{MLE})=\lim_{n\rightarrow\infty}\frac {\theta^2}{3n}=0$ holds too, so the MLE is consistent.
The Central Limit Theorem holds, and so
$$\sqrt n(\hat \theta_{MLE}-\theta) \rightarrow_d \mathcal N(0, \frac 13\theta^2)$$
The Fisher information, computed from the second derivative of the log-likelihood, is
$$\mathcal I(\theta) = -E\left[\frac 3 {\theta^2}-2\frac X {\theta^3}\right] = -\frac 3 {\theta^2} + 2\frac {E(X)}{\theta^3} = -\frac 3 {\theta^2} + 2\frac {3\theta}{\theta^3} = \frac 3{\theta^2}$$
and so the MLE achieves the Cramer-Rao bound.
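To make this concrete, here is a quick Monte Carlo sketch (assuming `numpy` and illustrative parameter values): the simulated mean and variance of $\hat\theta_{MLE}=\bar X/3$ should match $\theta$ and $\theta^2/(3n)$, the Cramer-Rao bound derived above.

```python
import numpy as np

# Monte Carlo check: for Gamma(shape k=3, scale theta),
# the MLE of the scale is theta_hat = X_bar / 3.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 100_000  # illustrative values

samples = rng.gamma(shape=3.0, scale=theta, size=(reps, n))
theta_hat = samples.mean(axis=1) / 3.0

print("mean of theta_hat:", theta_hat.mean())      # ~ theta (unbiased)
print("var  of theta_hat:", theta_hat.var())       # ~ theta^2 / (3n)
print("CRLB = theta^2/(3n):", theta**2 / (3 * n))  # bound is attained
```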
As for the question about the minimum variance estimator, @cardinal's answer here
https://math.stackexchange.com/questions/28779/minimum-variance-unbiased-estimator-for-scale-parameter-of-a-certain-gamma-distr
derives the minimum variance unbiased estimator from a complete and sufficient statistic.
Let's settle this. For a Normal distribution with zero mean and variance $\sigma^2$, the complete sufficient statistic is indeed given by the sum of squares.
By the Rao-Blackwell Theorem, if we can find an unbiased function of this sufficient statistic, then we have an MVUE. Furthermore, since the family of this statistic is complete, by the Lehmann–Scheffé theorem, it is also the unique MVUE for $\sigma^2$. So let's find an unbiased function of $\displaystyle{\sum_{i=1}^n X_i ^2}$ for $X\sim N\left (0,\sigma^2 \right)$.
Recall that for any distribution for which these moments exist
$$E(X^2)=\sigma^2+\mu^2$$
which in turn suggests that for this random sample
$$E \left( \sum_{i=1}^n X_i^2 \right)=n\sigma^2$$
from which it immediately follows that the unbiased estimator in question is
$$\widehat{\sigma^2}=\frac{1}{n} \sum_{i=1}^n X_i^2$$
Now, let's find the variance of this estimator. We know that for $X\sim N\left (0,\sigma^2 \right)$,
$$\frac{\sum_{i=1}^n X_i^2}{\sigma^2}\sim \chi^2(n)$$
Be very mindful of the degrees of freedom of the $\chi^2$ distribution. What would happen if we didn't know the mean and used an estimate instead? By the properties of the $\chi^2$ distribution then,
$$\operatorname{Var}\left(\frac{\sum_{i=1}^n X_i^2}{\sigma^2} \right)=2n$$
and so $\operatorname{Var}\left( \sum_{i=1}^n X_i^2 \right)=2n\sigma^4$. Thus for our estimator $\operatorname{Var} \left( \frac{1}{n} \sum_{i=1}^n X_i^2\right)=\frac{2\sigma^4}{n}$, which equals the Cramer-Rao bound. This should be comforting, right?
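A short simulation (again just a sketch assuming `numpy`) confirms both the unbiasedness of $\frac1n\sum_{i=1}^n X_i^2$ and that its variance matches $2\sigma^4/n$:

```python
import numpy as np

# Known mean mu = 0: sigma2_hat = (1/n) * sum(X_i^2) should be
# unbiased with variance 2*sigma^4/n, equal to the Cramer-Rao bound.
rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 25, 200_000  # illustrative values

x = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
sigma2_hat = (x**2).mean(axis=1)

print("mean of estimator:", sigma2_hat.mean())  # ~ sigma2
print("var  of estimator:", sigma2_hat.var())   # ~ 2*sigma2^2/n
print("CRLB = 2*sigma^4/n:", 2 * sigma2**2 / n)
```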
As a final remark, I would like to point out that the Cramer-Rao bound is only attainable if the mean of the normal distribution is known, as in this situation. If that had not been the case, then we would have to settle for an estimator that does not achieve the lower bound of variance.
Hope this clears it up a bit.
There are several instances of (2), namely the case where the variance of a UMVU estimator exceeds the Cramer-Rao lower bound. Here are some common examples:
Consider the case $n=1$ separately. Here we are to estimate the parametric function $e^{-\theta}=\delta$ (say) based on $X\sim\mathsf{Poisson}(\theta) $.
Suppose $T(X)$ is unbiased for $\delta$.
Therefore, $$E_{\theta}[T(X)]=\delta\quad,\forall\,\theta$$
Or, $$\sum_{j=0}^\infty T(j)\frac{\delta(\ln (\frac{1}{\delta}))^j}{j!}=\delta\quad,\forall\,\theta$$
That is, $$T(0)\delta+T(1)\delta\cdot\ln\left(\frac{1}{\delta}\right)+\cdots=\delta\quad,\forall\,\theta$$
Dividing both sides by $\delta$ and comparing coefficients of the powers of $\ln\left(\frac{1}{\delta}\right)$ gives $T(0)=1$ and $T(j)=0$ for all $j\geq 1$. So we have the unique unbiased estimator (hence also UMVUE) of $\delta(\theta)$:
$$T(X)=\begin{cases}1&,\text{ if }X=0 \\ 0&,\text{ otherwise }\end{cases}$$
Clearly,
\begin{align} \operatorname{Var}_{\theta}(T(X))&=P_{\theta}(X=0)(1-P_{\theta}(X=0)) \\&=e^{-\theta}(1-e^{-\theta}) \end{align}
The Cramer-Rao bound for $\delta$ is $$\text{CRLB}(\delta)=\frac{\left(\frac{d}{d\theta}\delta(\theta)\right)^2}{I(\theta)}\,,$$
where $I(\theta)=E_{\theta}\left[\left(\frac{\partial}{\partial\theta}\ln f_{\theta}(X)\right)^2\right]=\frac1{\theta}$ is the Fisher information, $f_{\theta}$ being the pmf of $X$.
This eventually reduces to $$\text{CRLB}(\delta)=\theta e^{-2\theta}$$
Now take the ratio of variance of $T$ and the Cramer-Rao bound:
\begin{align} \frac{\operatorname{Var}_{\theta}(T(X))}{\text{CRLB}(\delta)}&=\frac{e^{-\theta}(1-e^{-\theta})}{\theta e^{-2\theta}} \\&=\frac{e^{\theta}-1}{\theta} \\&=\frac{1}{\theta}\left[\left(1+\theta+\frac{\theta^2}{2}+\cdots\right)-1\right] \\&=1+\frac{\theta}{2}+\cdots \\&>1 \end{align}
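A quick numerical check (a `numpy` sketch with an arbitrary $\theta$) illustrates the gap: $T(X)=\mathbf 1\{X=0\}$ is unbiased for $e^{-\theta}$, yet its variance sits strictly above $\theta e^{-2\theta}$.

```python
import numpy as np

# n = 1 Poisson example: T(X) = 1{X = 0} is unbiased for exp(-theta),
# but Var(T) > CRLB = theta * exp(-2*theta) for every theta > 0.
rng = np.random.default_rng(2)
theta, reps = 1.5, 1_000_000  # illustrative value

x = rng.poisson(lam=theta, size=reps)
T = (x == 0).astype(float)

print("mean of T:", T.mean(), " vs  delta =", np.exp(-theta))
print("var  of T:", T.var(), " vs  CRLB =", theta * np.exp(-2 * theta))
print("exact ratio (e^theta - 1)/theta:", (np.exp(theta) - 1) / theta)  # > 1
```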
An exactly analogous calculation shows that this conclusion continues to hold for a sample of $n$ observations with $n>1$. In that case the UMVUE of $\delta$ is $\left(1-\frac1n\right)^{\sum_{i=1}^n X_i}$, with variance $e^{-2\theta}(e^{\theta/n}-1)$.
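The $n>1$ claim can be checked the same way (sketch, `numpy` assumed):

```python
import numpy as np

# n > 1: the UMVUE of exp(-theta) is (1 - 1/n)**S with S = sum(X_i),
# and its variance should be exp(-2*theta) * (exp(theta/n) - 1).
rng = np.random.default_rng(3)
theta, n, reps = 1.5, 10, 500_000  # illustrative values

S = rng.poisson(lam=n * theta, size=reps)  # sum of n iid Poisson(theta)
T = (1 - 1 / n) ** S

print("mean:", T.mean(), " vs ", np.exp(-theta))
print("var :", T.var(), " vs ", np.exp(-2 * theta) * (np.exp(theta / n) - 1))
```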
For an i.i.d. sample $X_1,\dots,X_n$ from the exponential density $f_{\theta}(x)=\theta e^{-\theta x}$, $x>0$, the UMVUE of $\theta$ is $\hat\theta=\frac{n-1}{\sum_{i=1}^n X_i}$, as shown here.
Using the Gamma distribution of $\sum\limits_{i=1}^n X_i$, a straightforward calculation shows $$\operatorname{Var}_{\theta}(\hat\theta)=\frac{\theta^2}{n-2}>\frac{\theta^2}{n}=\text{CRLB}(\theta)\quad,\,n>2$$
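Here too a short simulation (`numpy` assumed, illustrative parameter values) reproduces the gap between $\frac{\theta^2}{n-2}$ and $\frac{\theta^2}{n}$:

```python
import numpy as np

# Exponential(rate theta): theta_hat = (n - 1) / sum(X_i) is unbiased,
# with variance theta^2/(n - 2), strictly above the bound theta^2/n.
rng = np.random.default_rng(4)
theta, n, reps = 2.0, 10, 500_000  # illustrative values

x = rng.exponential(scale=1 / theta, size=(reps, n))
theta_hat = (n - 1) / x.sum(axis=1)

print("mean:", theta_hat.mean(), " vs ", theta)
print("var :", theta_hat.var(), " vs  theta^2/(n-2) =", theta**2 / (n - 2))
print("CRLB = theta^2/n:", theta**2 / n)
```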
Since several distributions can be transformed to this exponential distribution, this example in fact generates many more examples.
For an i.i.d. sample from $N(\theta,1)$, the UMVUE of $\theta^2$ is $\overline X^2-\frac1n$, where $\overline X$ is the sample mean. Among other drawbacks, this estimator can be shown not to attain the lower bound. See page 4 of this note for details.
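For completeness, a small simulation sketch (assuming the $N(\theta,1)$ model and `numpy`); the exact variance works out to $\frac{4\theta^2}{n}+\frac{2}{n^2}$, which exceeds the bound $\frac{4\theta^2}{n}$:

```python
import numpy as np

# N(theta, 1): the UMVUE of theta^2 is Xbar^2 - 1/n; it is unbiased,
# with variance 4*theta^2/n + 2/n^2 > CRLB = 4*theta^2/n.
rng = np.random.default_rng(5)
theta, n, reps = 1.0, 20, 200_000  # illustrative values

x = rng.normal(loc=theta, scale=1.0, size=(reps, n))
est = x.mean(axis=1) ** 2 - 1 / n

print("mean:", est.mean(), " vs  theta^2 =", theta**2)
print("var :", est.var(), " vs ", 4 * theta**2 / n + 2 / n**2)
print("CRLB = 4*theta^2/n:", 4 * theta**2 / n)
```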