Equivalently, you want to prove that $\nu S^2/\sigma^2$ has mean $\nu$ and variance $2\nu$, where $\nu:=n-1$ is the number of degrees of freedom, since $\nu S^2/\sigma^2\sim\chi_\nu^2$. Since means are additive, and so are variances for uncorrelated variables, we only need to check the case $\nu=1$ with $\sigma=1$: there we want to show $S^2,\,S^4$ have respective means $1,\,1^2+2=3$, i.e. that these are the respective means of $Z^2,\,Z^4$ for $Z\sim N(0,\,1)$. The first result follows from $Z$ having mean $0$ and variance $1$, so the hard part is proving $\mathbb{E}Z^4=3$. There are several ways to do this, but note that$$\int_{\Bbb R}e^{-\alpha x^2}\,dx=\sqrt{\pi}\,\alpha^{-1/2}\implies\int_{\Bbb R}x^4e^{-\alpha x^2}\,dx=\frac{3}{4}\sqrt{\pi}\,\alpha^{-5/2}$$(by applying $\partial_\alpha^2$), so$$\mathbb{E}Z^4=\int_{\Bbb R}\frac{1}{\sqrt{2\pi}}x^4e^{-x^2/2}\,dx=\frac{1}{\sqrt{2\pi}}\cdot\frac{3}{4}\sqrt{\pi}\left(\frac12\right)^{-5/2}=3.$$You could also use characteristic or moment- or cumulant-generating functions.
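As a quick sanity check on $\mathbb{E}Z^4=3$, here is a small Monte Carlo sketch (the sample size and seed are arbitrary choices, not part of the argument above):

```python
import random

random.seed(0)

# Estimate E[Z^4] for Z ~ N(0, 1) by Monte Carlo.
N = 200_000
z4_mean = sum(random.gauss(0, 1) ** 4 for _ in range(N)) / N

print(z4_mean)  # ≈ 3 (within a few hundredths)
```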
Let $n$ be the sample size. Since $S^2$ is calculated from the sample generated by the $X_i$ we know that
$$X_i \sim N(\mu, \sigma^2)$$
$$\frac{(n-1)S^2}{\sigma^2}\sim \chi^2(n-1)$$
The distribution of the sum of a Gaussian random variable and a chi-squared random variable is an instance of the generalized chi-squared distribution. A variable $\xi$ following the generalized chi-squared distribution can be defined as follows:
$$\xi = x +\sum_1^n w_i y_i \text { where } x\sim N(m,s),\;\; w_i \in \mathbb{R},\;\;y_i \sim \chi'^2(k_i,\lambda_i)\text{ and } y_i \text { independent}$$
Note that $\chi'^2(k_i,\lambda_i)$ denotes the non-central chi-squared distribution, which is related to the (central) chi-squared distribution as follows:
$$\chi^2(n) = \chi'^2(n,0)$$
In your case, we want the sum of a single Chi-Squared variable and a Normal, where the Chi-Squared is based on the sample variance of the sample from the Normal random variable:
$$\xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0)$$
What about the weight variable $w$? If $w=1$ then $\xi$ represents the following sum:
$$\bar{X} + \frac{(n-1)S^2}{\sigma^2}$$
But we want:
$$\bar{X} + S^2$$
Therefore, to get to this sum, we need to alter the weight applied to the chi-squared variable $y$:
$$w = \frac{\sigma^2}{n-1}$$
With this we have what we need:
$$\bar{X} + S^2 \sim \xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0),\;w = \frac{\sigma^2}{n-1}$$
In summary, with the generalized chi-squared parameters ordered as (weight, degrees of freedom, non-centrality, normal mean, normal standard deviation), and noting that the normal term $x$ has standard deviation $\sigma/\sqrt{n}$:
$$\bar{X} + S^2 \sim \tilde{\chi}^2\left(\frac{\sigma^2}{n-1},\,n-1,\,0,\,\mu,\,\frac{\sigma}{\sqrt{n}}\right) $$
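As a sanity check on this distribution, one can at least verify its first two moments by simulation: since $\bar{X}$ and $S^2$ are independent for normal samples, $\mathbb{E}[\bar{X}+S^2]=\mu+\sigma^2$ and $\operatorname{Var}(\bar{X}+S^2)=\sigma^2/n+2\sigma^4/(n-1)$. A minimal sketch (the values of $\mu$, $\sigma$, $n$ and the replication count are arbitrary):

```python
import random
import statistics

random.seed(1)

mu, sigma, n = 2.0, 1.5, 10
reps = 100_000

vals = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    # xbar + s2, with s2 the usual (n-1)-denominator sample variance
    vals.append(statistics.fmean(sample) + statistics.variance(sample))

emp_mean = statistics.fmean(vals)
emp_var = statistics.variance(vals)

theo_mean = mu + sigma**2                         # 4.25
theo_var = sigma**2 / n + 2 * sigma**4 / (n - 1)  # 0.225 + 1.125 = 1.35
print(emp_mean, emp_var)  # ≈ 4.25, ≈ 1.35
```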
Best Answer
Note the well known result that:
$$ \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1} $$
(chi-squared distribution with $n-1$ degrees of freedom), and further, the variance of $\chi^2_{n-1}$ is $2(n-1)$.
Now, using this information:
\begin{align*} 2(n-1) = Var \left ( \frac{(n-1)S^2}{\sigma^2} \right) & = \frac{(n-1)^2}{\sigma^4} Var(S^2) \end{align*}
and so:
$$ Var(S^2) = \frac{2(n-1)\sigma^4}{(n-1)^2} = \frac{2\sigma^4}{n-1} $$
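This result is easy to confirm by simulation; a sketch with arbitrary $\sigma$ and $n$ (here $2\sigma^4/(n-1) = 2\cdot 16/4 = 8$):

```python
import random
import statistics

random.seed(2)

sigma, n = 2.0, 5
reps = 100_000

# Draw many samples of size n and record the sample variance of each.
s2_vals = [
    statistics.variance([random.gauss(0, sigma) for _ in range(n)])
    for _ in range(reps)
]

emp_var_s2 = statistics.variance(s2_vals)
theo_var_s2 = 2 * sigma**4 / (n - 1)  # = 8
print(emp_var_s2)  # ≈ 8
```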
So your answer is a bit off; perhaps you could share your steps/assumptions for some advice on where you might have gone wrong.
Proof of the result:
First, note the following useful facts:

1. If $Z \sim N(0,1)$, then $Z^2 \sim \chi^2_1$.
2. If $U$ and $V$ are independent, then the moment generating function of their sum factors: $m_{U+V}(t) = m_U(t)\, m_V(t)$.
3. If $X \sim \chi^2_p$, then $m_X(t) = (1-2t)^{-p/2}$.
4. If $X_1, \dots, X_n$ are independent with $X_i \sim \chi^2_{p_i}$, then
$$ \sum_{i=1}^n X_i \sim \chi^2_{p_1 + p_2 + \dots + p_n} $$
The trick for these types of proofs is to use this fact:
\begin{align*} &\sum (X_i - \mu)^2 = (n-1)S^2 + n(\bar{X} - \mu)^2\\ \implies &\sum \left ( \frac{X_i - \mu}{\sigma} \right )^2 = \frac{(n-1)S^2}{\sigma^2} + n \left ( \frac{\bar{X} - \mu}{\sigma} \right ) ^2\\ \implies &\sum \left ( \frac{X_i - \mu}{\sigma} \right )^2 = \frac{(n-1)S^2}{\sigma^2} + \left ( \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \right ) ^2\\ \implies & A = B+C \end{align*}
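The decomposition in the first line is an exact algebraic identity, so it can be checked numerically on any sample; a quick sketch with arbitrary values:

```python
import random

random.seed(3)

mu, sigma, n = 1.0, 2.0, 8
x = [random.gauss(mu, sigma) for _ in range(n)]

xbar = sum(x) / n
s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)  # sample variance

# sum (x_i - mu)^2  ==  (n-1) s^2 + n (xbar - mu)^2
lhs = sum((xi - mu) ** 2 for xi in x)
rhs = (n - 1) * s2 + n * (xbar - mu) ** 2
print(abs(lhs - rhs))  # ≈ 0 (up to floating-point rounding)
```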
By fact number 4 $$ A = \sum \left ( \frac{X_i - \mu}{\sigma} \right )^2 \sim \chi_{n}^2 $$
and by fact number 1
$$ C = \left ( \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} \right ) ^2 = Z^2 \sim \chi_1^2 $$
Using facts number 2 and number 3, together with the independence of $B$ and $C$ (for samples from a normal distribution, $\bar{X}$ and $S^2$ are independent):
\begin{align*} &m_A(t) = m_B(t)\, m_C(t)\\ \implies &(1-2t)^{-n/2} = m_B(t) (1-2t)^{-1/2}\\ \implies & m_{B}(t) = \frac{(1-2t)^{-n/2}}{(1-2t)^{-1/2}}\\ \implies & m_{B}(t) = (1-2t)^{-(n-1)/2} \end{align*}
which you should note is the moment generating function of a chi square distributed random variable with degrees of freedom $n-1$. The variance follows easily from there.
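One can also check $m_B(t)=(1-2t)^{-(n-1)/2}$ empirically by estimating $\mathbb{E}[e^{tB}]$ for $B=(n-1)S^2/\sigma^2$ at a fixed $t<1/2$; a sketch with arbitrary $n$, $t$ and replication count:

```python
import math
import random
import statistics

random.seed(4)

sigma, n = 1.0, 4  # B should be chi-squared with n - 1 = 3 degrees of freedom
reps = 100_000
t = 0.1            # need t < 1/2 for the chi-squared MGF to exist

mgf_samples = []
for _ in range(reps):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    b = (n - 1) * statistics.variance(sample) / sigma**2
    mgf_samples.append(math.exp(t * b))

emp_mgf = statistics.fmean(mgf_samples)
theo_mgf = (1 - 2 * t) ** (-(n - 1) / 2)  # 0.8 ** -1.5 ≈ 1.3975
print(emp_mgf, theo_mgf)
```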