If one is familiar with the concepts of sufficiency and completeness, then this problem is not too difficult. Note that $f(x; \theta)$ is the density of a $\Gamma(2, \theta)$ random variable (rate parametrization). The gamma distribution belongs to the exponential family, a class for which there are strong, general results on constructing uniformly minimum variance unbiased estimators via sufficiency and completeness.
The joint density of a random sample of size $n$ from this distribution is
$$
g(x_1,\ldots,x_n; \theta) = \theta^{2n} \exp\Big(-\theta \sum_{i=1}^n x_i + \sum_{i=1}^n \log x_i\Big), \quad x_i > 0,
$$
which, again, conforms to the exponential family class.
From this we can conclude that $S_n = \sum_{i=1}^n X_i$ is a complete, sufficient statistic for $\theta$. Operationally, this means that if we can find some function $h(S_n)$ that is unbiased for $\theta$, then we know immediately via the Lehmann-Scheffé theorem that $h(S_n)$ is the unique uniformly minimum variance unbiased (UMVU) estimator.
Now, $S_n$ has distribution $\Gamma(2n, \theta)$ by standard properties of the gamma distribution. (This can be easily checked via the moment-generating function.)
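For completeness, the MGF check is one line: since $X_i \sim \Gamma(2, \theta)$ has MGF $\big(\theta/(\theta - t)\big)^2$ for $t < \theta$, independence gives
$$
M_{S_n}(t) = \prod_{i=1}^n M_{X_i}(t) = \left(\frac{\theta}{\theta - t}\right)^{2n}, \quad t < \theta,
$$
which is the MGF of a $\Gamma(2n, \theta)$ distribution.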
Furthermore, straightforward calculus shows that
$$
\mathbb{E} S_n^{-1} = \int_0^\infty s^{-1} \frac{\theta^{2n} s^{2n - 1}e^{-\theta s}}{\Gamma(2n)} \,\mathrm{d}s = \frac{\theta}{2n - 1} \>.
$$
Hence, $h(S_n) = \frac{2n-1}{S_n}$ is unbiased for $\theta$ and must, therefore, be the UMVU estimator.
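As a sanity check (not part of the proof), a quick Monte Carlo simulation should show $h(S_n)$ averaging out to $\theta$; the values $\theta = 2$, $n = 5$, and the replication count below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000  # arbitrary illustrative values

# Gamma(2, theta) in the rate parametrization of the question;
# NumPy's gamma uses the scale parametrization, so scale = 1/theta.
x = rng.gamma(shape=2.0, scale=1.0 / theta, size=(reps, n))
s = x.sum(axis=1)            # S_n, one value per simulated sample
h = (2 * n - 1) / s          # the UMVU estimator h(S_n)

print(h.mean())              # should be close to theta = 2.0
```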
Addendum: Using the fact that $\newcommand{\e}{\mathbb{E}}\e S_n^{-2} = \frac{\theta^2}{(2n-1)(2n-2)}$, we conclude that $\mathbb{V}ar(h(S_n)) = \frac{\theta^2}{2(n-1)}$. On the other hand, the information $I(\theta)$ from a sample of size one is readily computed to be $-\e \frac{\partial^2 \log f}{\partial \theta^2} = 2 \theta^{-2}$, and so the Cramér-Rao lower bound for a sample of size $n$ is
$$
\mathrm{CRLB}(\theta) = \frac{1}{n I(\theta)} = \frac{\theta^2}{2n} \> .
$$
Hence, $h(S_n)$ does not achieve the bound, though it comes close and, indeed, achieves it asymptotically.
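The gap is easy to quantify: the ratio of the estimator's variance to the bound is $\frac{\theta^2/(2(n-1))}{\theta^2/(2n)} = \frac{n}{n-1}$, independent of $\theta$. A trivial numerical sketch of the convergence:

```python
# Var(h(S_n)) / CRLB = [theta^2/(2(n-1))] / [theta^2/(2n)] = n/(n-1),
# which tends to 1 as n grows.
for n in (2, 5, 50, 500):
    print(n, n / (n - 1))
```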
However, if we reparametrize the density by taking $\beta = \theta^{-1}$ so that
$$
f(x;\beta) = \beta^{-2} x e^{-x/\beta},\quad x > 0,
$$
then the UMVU estimator for $\beta$ can be shown to be $\tilde{h}(S_n) = \frac{S_n}{2 n}$. (Just check that it's unbiased!) The variance of this estimator is $\mathbb{V}ar(\tilde{h}(S_n)) = \frac{\beta^2}{2n}$ and this coincides with the CRLB for $\beta$.
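Again as an illustrative check (with arbitrary values $\beta = 0.5$, $n = 5$), simulation should show $\tilde{h}(S_n)$ centering on $\beta$ with variance near $\beta^2/(2n)$:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, n, reps = 0.5, 5, 400_000  # arbitrary illustrative values

# f(x; beta) is Gamma(shape=2, scale=beta), which matches NumPy's
# scale parametrization directly.
x = rng.gamma(shape=2.0, scale=beta, size=(reps, n))
bhat = x.sum(axis=1) / (2 * n)   # the UMVU estimator for beta

print(bhat.mean())               # close to beta = 0.5
print(bhat.var())                # close to beta**2 / (2*n) = 0.025, the CRLB
```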
The point of the addendum is that the ability to achieve (or not) the CRLB depends on the particular parametrization used: even when two parametrizations are related by a one-to-one correspondence, an unbiased estimator for one may achieve the Cramér-Rao lower bound while an unbiased estimator for the other does not.
Equivalently, you want to prove that $S^2$ has mean $\nu$ and variance $2\nu$, where $\nu := n-1$ is the number of degrees of freedom, viz. $S^2 \sim \chi_\nu^2$. Since means are additive, and so are variances of uncorrelated variables, we only need to check the case $\nu = 1$. In that case we want to show that $S^2$ and $S^4$ have respective means $1$ and $1^2 + 2 = 3$, i.e. that these are the respective means of $Z^2$ and $Z^4$ for $Z \sim N(0,\,1)$. The first result follows from $Z$ having mean $0$ and variance $1$, so the hard part is proving $\mathbb{E}Z^4 = 3$. There are several ways to do this, but note that
$$
\int_{\Bbb R} e^{-\alpha x^2}\,dx = \sqrt{\pi}\,\alpha^{-1/2} \implies \int_{\Bbb R} x^4 e^{-\alpha x^2}\,dx = \frac{3}{4}\sqrt{\pi}\,\alpha^{-5/2}
$$
(by applying $\partial_\alpha^2$ to both sides), so
$$
\mathbb{E}Z^4 = \int_{\Bbb R} \frac{1}{\sqrt{2\pi}}\,x^4 e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \cdot \frac{3}{4}\sqrt{\pi}\,\Big(\tfrac12\Big)^{-5/2} = 3.
$$
You could also use characteristic or moment- or cumulant-generating functions.
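A quick numerical sanity check of $\mathbb{E}Z^2 = 1$ and $\mathbb{E}Z^4 = 3$ (the sample size below is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal(2_000_000)  # draws from N(0, 1)

print((z**2).mean())   # close to 1
print((z**4).mean())   # close to 3
```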
$$ E\left[\frac{(X-\mu)^2}{\sigma^2}\right]=\frac{1}{\sigma^2}E\left[(X-\mu)^2\right]=\frac{\sigma^2}{\sigma^2}=1 $$