The variance of the estimator in the course notes is based on maximum likelihood estimation, which typically results in biased estimators. The second variance calculation has a "correction" term that makes the estimator unbiased. You have likely seen this phenomenon with the unbiased estimator for the sample variance, i.e., dividing by $n-1$ instead of $n$.
There are many different ways to generate estimators, and the resulting estimators will have different properties. One property that many people like in their estimators is unbiasedness. However, people are sometimes willing to accept a little bias in exchange for reduced variance. If you are interested in this topic, you might want to look up the bias-variance tradeoff.
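As a quick illustration (a Monte Carlo sketch, not part of the course notes), here is how the biased maximum likelihood estimator of the variance (divide by $n$) compares with the unbiased estimator (divide by $n-1$); the biased estimator underestimates $\sigma^2$ on average but has the smaller variance:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0      # true variance
n = 10            # sample size
reps = 200_000    # number of simulated samples

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
mle = samples.var(axis=1, ddof=0)       # divide by n: the biased MLE
unbiased = samples.var(axis=1, ddof=1)  # divide by n-1: unbiased

print(mle.mean())       # ≈ (n-1)/n * sigma2 = 3.6, biased low
print(unbiased.mean())  # ≈ 4.0, unbiased
print(mle.var() < unbiased.var())  # True: bias bought a variance reduction
```

This is exactly the bias-variance tradeoff mentioned above: accepting a downward bias of $\sigma^2/n$ shrinks the estimator's variance by a factor of $\left(\frac{n-1}{n}\right)^2$.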
I noticed, several years after my original answer, that there is a small typo in your derivation which makes a difference: it should be $\frac{(N-1) \hat{\sigma}^2}{\sigma^2} \sim \chi_{N-1}^2$. You will still get a different answer from the one in the notes if you start with this instead of $\frac{N \hat{\sigma}^2}{\sigma^2}$. Starting with this, you should find that
$$
\textrm{var}\; (\hat{\sigma}^2) = \frac{2\sigma^4}{N-1}.
$$
Note the well known result that:
$$
\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}
$$
(chi-squared distribution with $n-1$ degrees of freedom), and further, the variance of $\chi^2_{n-1}$ is $2(n-1)$.
Now, using this information:
\begin{align*}
2(n-1) = Var \left ( \frac{(n-1)S^2}{\sigma^2} \right) & = \frac{(n-1)^2}{\sigma^4} Var(S^2)
\end{align*}
and so:
$$
Var(S^2) = \frac{2(n-1)\sigma^4}{(n-1)^2} = \frac{2\sigma^4}{n-1}
$$
So your answer is a bit off. Perhaps you could share your steps/assumptions so you can get some advice on where you might have gone wrong.
Proof of the result:
Note the following useful facts:
- For standard normal $Z \sim N(0,1)$, $Z^2 \sim \chi^2_{1}$
- $\bar{X} \bot S^2$ (independence)
- The moment generating function of a $\chi_p^2$ is $m_{\chi_p^2}(t)= (1-2t)^{-p/2}$
- If $X_1, \dots, X_n$ are independent and $X_i \sim \chi^2_{p_i}$, then:
$$
\sum_{i=1}^n X_i \sim \chi^2_{p_1 + p_2 + \dots + p_n}
$$
The trick for these types of proofs is to use this fact:
\begin{align*}
&\sum (X_i - \mu)^2 = (n-1)S^2 + n(\bar{X} - \mu)^2\\
\implies &\sum \left ( \frac{X_i - \mu}{\sigma} \right )^2 = \frac{(n-1)S^2}{\sigma^2} + n \left ( \frac{\bar{X} - \mu}{\sigma} \right ) ^2\\
\implies &\sum \left ( \frac{X_i - \mu}{\sigma} \right )^2 = \frac{(n-1)S^2}{\sigma^2} + \left ( \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \right ) ^2\\
\implies & A = B+C
\end{align*}
By facts number 1 and 4
$$
A = \sum \left ( \frac{X_i - \mu}{\sigma} \right )^2 \sim \chi_{n}^2
$$
and by fact number 1
$$
C = \left ( \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \right ) ^2 = Z^2 \sim \chi_1^2
$$
Using facts number 2 and number 3:
\begin{align*}
&m_A(t) = m_B(t)\, m_C(t)\\
\implies &(1-2t)^{-n/2} = m_B(t) (1-2t)^{-1/2}\\
\implies & m_{B}(t) = \frac{(1-2t)^{-n/2}}{(1-2t)^{-1/2}}\\
\implies & m_{B}(t) = (1-2t)^{-(n-1)/2}
\end{align*}
which you should note is the moment generating function of a chi-squared random variable with $n-1$ degrees of freedom. The variance follows easily from there.
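As a sanity check on the conclusion (a simulation sketch with arbitrary $n$ and $\sigma$), the quantity $B = \frac{(n-1)S^2}{\sigma^2}$ should have the mean $n-1$, the variance $2(n-1)$, and the MGF $(1-2t)^{-(n-1)/2}$ of a $\chi^2_{n-1}$ variable:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 2.0
n = 6
reps = 400_000

x = rng.normal(0.0, sigma, size=(reps, n))
b = (n - 1) * x.var(axis=1, ddof=1) / sigma**2   # B = (n-1)S^2 / sigma^2

print(b.mean())   # ≈ n-1 = 5, the mean of chi^2_{n-1}
print(b.var())    # ≈ 2(n-1) = 10, the variance of chi^2_{n-1}

t = 0.1           # any t < 1/2 works for the chi-squared MGF
emp_mgf = np.mean(np.exp(t * b))                 # empirical E[e^{tB}]
exact_mgf = (1 - 2 * t) ** (-(n - 1) / 2)        # (1-2t)^{-(n-1)/2}
print(emp_mgf, exact_mgf)
```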
Best Answer
Let $n$ be the sample size. Since $S^2$ is calculated from the sample generated by the $X_i$ we know that
$$X_i \sim N(\mu, \sigma^2)$$ $$\frac{(n-1)S^2}{\sigma^2}\sim \chi^2(n-1)$$
The distribution of the sum of a Gaussian rv and a Chi-Squared rv is an instance of the Generalized Chi-Squared Distribution. A variable $\xi$ with the Generalized Chi-Squared Distribution can be defined as follows:
$$\xi = x +\sum_1^n w_i y_i \text { where } x\sim N(m,s^2),\;\; w_i \in \mathbb{R},\;\;y_i \sim \chi'^2(k_i,\lambda_i)\text{ and the } y_i \text{ are independent}$$
Note that $\chi'^2(k_i,\lambda_i)$ is the non-central chi-squared distribution; it reduces to the (central) chi-squared as follows:
$$\chi^2(n) = \chi'^2(n,0)$$
In your case, we want the sum of the sample mean and a single chi-squared variable, where the chi-squared is built from the sample variance of the same Normal sample (note that $\bar{X}$ and $S^2$ are independent, as the definition requires):
$$\xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0)$$
What about the weight variable $w$? If $w=1$ then $\xi$ represents the following sum:
$$\bar{X} + \frac{(n-1)S^2}{\sigma^2}$$
But we want:
$$\bar{X} + S^2$$
Therefore, to get to this sum, we need to alter the weight applied to the chi-squared variable $y$:
$$w = \frac{\sigma^2}{n-1}$$
With this we have what we need:
$$\bar{X} + S^2 \sim \xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0),\;w = \frac{\sigma^2}{n-1}$$
In summary
$$\bar{X} + S^2 \sim \tilde{\chi}^2\left(\frac{\sigma^2}{n-1},\,n-1,\,0,\,\mu,\,\frac{\sigma}{\sqrt{n}}\right) $$
where the last two parameters are the mean and standard deviation of the normal term $x \sim N\left(\mu, \frac{\sigma^2}{n}\right)$.
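The distribution itself has no simple closed form, but its first two moments follow from the independence of $\bar{X}$ and $S^2$: the mean is $\mu + \sigma^2$ and the variance is $\frac{\sigma^2}{n} + \frac{2\sigma^4}{n-1}$. A short simulation sketch (arbitrary $\mu$, $\sigma$, $n$) checks these against samples of $\bar{X} + S^2$:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 1.0, 1.5
n = 10
reps = 500_000

x = rng.normal(mu, sigma, size=(reps, n))
xi = x.mean(axis=1) + x.var(axis=1, ddof=1)   # samples of Xbar + S^2

# By independence of Xbar and S^2, the moments add:
mean_theory = mu + sigma**2                        # 1 + 2.25 = 3.25
var_theory = sigma**2 / n + 2 * sigma**4 / (n - 1) # 0.225 + 1.125 = 1.35
print(xi.mean(), mean_theory)
print(xi.var(), var_theory)
```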