Let's settle this. For a Normal distribution with zero mean and variance $\sigma^2$, the complete sufficient statistic is indeed given by the sum of squares.
By the Rao-Blackwell theorem, conditioning any unbiased estimator on this sufficient statistic cannot increase its variance, and since the statistic is also complete, the Lehmann-Scheffe theorem tells us that any unbiased function of it is the unique MVUE for $\sigma^2$. So let's find an unbiased function of $\displaystyle{\sum_{i=1}^n X_i ^2}$ for $X_i\sim N\left (0,\sigma^2 \right)$.
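To see where sufficiency and completeness come from, a quick factorization: the joint density of the sample is
$$f(x_1,\dots,x_n;\sigma^2) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left(-\frac{x_i^2}{2\sigma^2}\right) = \left(2\pi\sigma^2\right)^{-n/2}\exp\left(-\frac{1}{2\sigma^2}\sum_{i=1}^n x_i^2\right),$$
which depends on the data only through $\sum_{i=1}^n x_i^2$, so the sum of squares is sufficient by the factorization theorem; and since this is a full-rank one-parameter exponential family, it is complete as well.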
Recall that for any distribution for which these moments exist
$$E(X^2)=\sigma^2+\mu^2$$
which, since $\mu=0$ here, implies that for this random sample
$$E \left( \sum_{i=1}^n X_i^2 \right)=n\sigma^2$$
from which it immediately follows that the unbiased estimator in question is
$$\widehat{\sigma^2}=\frac{1}{n} \sum_{i=1}^n X_i^2$$
Now, let's find the variance of this estimator. We know that for $X_i\sim N\left (0,\sigma^2 \right)$,
$$\frac{\sum_{i=1}^n X_i^2}{\sigma^2}\sim \chi^2(n)$$
Be very mindful of the degrees of freedom of the $\chi^2$ distribution. What would happen if we didn't know the mean and had to use the sample mean instead? (We would lose a degree of freedom and end up with $\chi^2(n-1)$.) By the properties of the $\chi^2$ distribution, then,
$$var\left(\frac{\sum_{i=1}^n X_i^2}{\sigma^2} \right)=2n$$
and so $var\left( \sum_{i=1}^n X_i^2 \right)=2n\sigma^4$. Thus, for our estimator, $var\left(\widehat{\sigma^2}\right) = var \left( \frac{1}{n} \sum_{i=1}^n X_i^2\right)=\frac{2\sigma^4}{n}$, which equals the Cramer-Rao lower bound. This should be comforting, right?
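In case it is not obvious where the bound $\frac{2\sigma^4}{n}$ comes from, here is the Fisher information worked out for this model, treating $\theta=\sigma^2$ as the parameter:
$$\log f(X_1,\dots,X_n;\theta) = -\frac{n}{2}\log(2\pi\theta) - \frac{1}{2\theta}\sum_{i=1}^n X_i^2,$$
$$i(\theta) = -E\left(\frac{\partial^2}{\partial\theta^2}\log f\right) = -E\left(\frac{n}{2\theta^2} - \frac{1}{\theta^3}\sum_{i=1}^n X_i^2\right) = -\frac{n}{2\theta^2} + \frac{n\theta}{\theta^3} = \frac{n}{2\sigma^4},$$
so the Cramer-Rao lower bound for any unbiased estimator of $\sigma^2$ is $\frac{1}{i(\theta)}=\frac{2\sigma^4}{n}$.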
As a final remark, I would like to point out that the Cramer-Rao bound is attainable here only because the mean of the normal distribution is known. Had that not been the case, we would have to settle for an unbiased estimator whose variance does not reach the bound.
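If you want to see this numerically, here is a minimal simulation sketch (assuming NumPy; the true variance, sample size, and number of replications below are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, n, reps = 2.0, 50, 200_000  # true variance, sample size, Monte Carlo replications

# reps independent samples of size n from N(0, sigma^2), one row per sample
samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))

# the estimator (1/n) * sum of squares, computed for each sample
sigma2_hat = (samples ** 2).mean(axis=1)

print("mean of estimator    :", sigma2_hat.mean())  # ~ sigma2 = 2.0 (unbiasedness)
print("variance of estimator:", sigma2_hat.var())   # ~ 2*sigma2**2/n = 0.16
print("Cramer-Rao bound     :", 2 * sigma2**2 / n)  # 0.16
```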
Hope this clears it up a bit.
Best Answer
While the CRLB is an inequality, and there is in general no reason for it to hold with equality, it is in fact possible to say something about when equality can occur. A good theoretical statistics book that does so is Young and Smith, Essentials of Statistical Inference. I will try to summarize here what they do (around page 125).
Let $W(X)$ be an unbiased estimator for the scalar parameter $\theta$. Then the CRLB is $\DeclareMathOperator{\V}{\mathbb{V}} \V W(X) \ge \frac1{i(\theta)}$, where $i(\theta)$ is the Fisher information. The proof of this uses the correlation inequality (a version of the Cauchy-Schwarz inequality) $$ \DeclareMathOperator{\C}{\mathbb{C}} \C (Y,Z)^2 \le \V(Y) \V(Z) $$ with $Y=W(X)$ and $Z=\frac{\partial}{\partial \theta} \log f(X; \theta)$, the score. Equality is only possible if $\DeclareMathOperator{\Cor}{\mathbb{Cor}} \Cor(Y,Z)=\pm 1$, which is only possible if $Y$ and $Z$ are linearly related (as functions of $X$, for each $\theta$).
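To fill in how the inequality turns into the bound (the usual regularity conditions that allow differentiating under the integral sign are assumed here): the score $Z$ has mean zero, and differentiating the unbiasedness condition $\mathbb{E}\, W(X)=\theta$ gives
$$\C(Y,Z) = \mathbb{E}\left[W(X)\,\frac{\partial}{\partial\theta}\log f(X;\theta)\right] = \frac{\partial}{\partial\theta}\,\mathbb{E}\, W(X) = 1, \qquad \V(Z)=i(\theta),$$
so the correlation inequality reads $1 \le \V W(X)\, i(\theta)$, which is exactly the CRLB.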
So it is necessary that $$ \frac{\partial}{\partial \theta} \log f(X; \theta) = a(\theta) \left( W(X)-\theta\right) $$ for some function $a(\theta)$ (the score has mean zero and $W$ has mean $\theta$, which fixes the intercept). Integrating with respect to $\theta$ gives $$ \log f(X;\theta) = A(\theta) W(X) + B(\theta) + C(X) $$ for some functions $A, B, C$ with $A'(\theta)=a(\theta)$ and $B'(\theta)=-\theta\, a(\theta)$. This says that $f(X;\theta)$ is an exponential family model.
Conclusion: for equality in the CRLB to be possible, the model must be an exponential family. Note that this is necessary, but not sufficient: the argument above gives not only an exponential family, but one that is also parametrized such that $\DeclareMathOperator{\E}{\mathbb{E}} \E W(X)=\theta$.
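As a concrete instance of this necessary form, take a zero-mean normal sample with $\theta=\sigma^2$ (the model from the other answer): there the score is
$$\frac{\partial}{\partial\theta}\log f(X;\theta) = -\frac{n}{2\theta} + \frac{1}{2\theta^2}\sum_{i=1}^n X_i^2 = \frac{n}{2\theta^2}\left(\frac{1}{n}\sum_{i=1}^n X_i^2 - \theta\right),$$
which is exactly $a(\theta)\left(W(X)-\theta\right)$ with $a(\theta)=\frac{n}{2\theta^2}$ and $W(X)=\frac{1}{n}\sum_{i=1}^n X_i^2$, the estimator that attains the bound.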