Note that $X_1=\frac{X-\mu}{\sigma}$ follows the standard normal distribution, and so $X_1^2$ has the $\chi^2$ distribution with one degree of freedom.
Now, $X_1^2=(X^2-2\mu X+\mu^2)/\sigma^2$. Thus you can express the distribution of $X^2$ in terms of the distributions of $X_1^2$ and $X$, together with a constant.
I do not know, though, whether the distribution of $X^2$ has a standard name.
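In fact, $X^2/\sigma^2$ follows the non-central chi-squared distribution $\chi'^2\!\left(1,(\mu/\sigma)^2\right)$ (the same family used below), so $X^2$ is a scaled non-central chi-squared variable. A quick numerical sanity check, with illustrative values $\mu=2$, $\sigma=1.5$:

```python
import numpy as np
from scipy import stats

# Illustrative parameters
mu, sigma = 2.0, 1.5
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=200_000)

# X^2 / sigma^2 should follow the non-central chi-squared
# distribution with 1 degree of freedom and noncentrality (mu/sigma)^2
z = x**2 / sigma**2
dist = stats.ncx2(df=1, nc=(mu / sigma) ** 2)

print(z.mean(), dist.mean())  # both close to 1 + (mu/sigma)^2
print(z.var(), dist.var())    # both close to 2(1 + 2(mu/sigma)^2)
```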
Let $n$ be the sample size. Since $S^2$ is calculated from the sample generated by the $X_i$ we know that
$$X_i \sim N(\mu, \sigma^2)$$
$$\frac{(n-1)S^2}{\sigma^2}\sim \chi^2(n-1)$$
The distribution of the sum of a Gaussian random variable and a Chi-Squared random variable is an instance of the Generalized Chi-Squared distribution. A variable $\xi$ with the Generalized Chi-Squared distribution can be defined as follows:
$$\xi = x +\sum_1^n w_i y_i \text { where } x\sim N(m,s),\;\; w_i \in \mathbb{R},\;\;y_i \sim \chi'^2(k_i,\lambda_i)\text{ and } y_i \text { independent}$$
Note that $\chi'^2(k_i,\lambda_i)$ denotes the non-central Chi-squared distribution; it is related to the (central) Chi-squared as follows:
$$\chi^2(n) = \chi'^2(n,0)$$
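This identity is easy to confirm numerically with SciPy, which implements both distributions (`chi2` and `ncx2`); a minimal check comparing the two CDFs on a grid:

```python
import numpy as np
from scipy import stats

# chi2(n) should coincide with ncx2(n, 0): compare CDFs on a grid
n = 5
grid = np.linspace(0.1, 20.0, 200)
diff = np.max(np.abs(stats.chi2(n).cdf(grid) - stats.ncx2(n, 0).cdf(grid)))
print(diff)  # essentially zero
```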
In your case, we want the sum of a single Chi-Squared variable and a Normal one, where the Chi-Squared variable comes from the sample variance of the same Normal sample:
$$\xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0)$$
What about the weight variable $w$? If $w=1$ then $\xi$ represents the following sum:
$$\bar{X} + \frac{(n-1)S^2}{\sigma^2}$$
But we want:
$$\bar{X} + S^2$$
Therefore, to get to this sum, we need to alter the weight applied to the chi-squared variable $y$:
$$w = \frac{\sigma^2}{n-1}$$
With this we have what we need:
$$\bar{X} + S^2 \sim \xi = x + wy \;\;\text{ where } x\sim N\left(\mu,\frac{\sigma^2}{n}\right), \;y \sim \chi'^2(n-1,0),\;w = \frac{\sigma^2}{n-1}$$
In summary:
$$\bar{X} + S^2 \sim \tilde{\chi}^2\left(\frac{\sigma^2}{n-1},\,n-1,\,0,\,\mu,\,\frac{\sigma}{\sqrt{n}}\right) $$
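A Monte Carlo sketch of this representation, comparing direct simulation of $\bar{X} + S^2$ against the sum $x + wy$ above (illustrative values $\mu=1$, $\sigma=2$, $n=10$):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 1.0, 2.0, 10
trials = 200_000

# Direct simulation: sample mean plus sample variance of a normal sample
samples = rng.normal(mu, sigma, size=(trials, n))
direct = samples.mean(axis=1) + samples.var(axis=1, ddof=1)

# Simulation via the generalized chi-squared representation:
# x ~ N(mu, sigma^2/n), y ~ chi^2(n-1), weight w = sigma^2/(n-1)
x = rng.normal(mu, sigma / np.sqrt(n), size=trials)
y = rng.chisquare(n - 1, size=trials)
w = sigma**2 / (n - 1)
rep = x + w * y

print(direct.mean(), rep.mean())  # both close to mu + sigma^2
print(direct.var(), rep.var())    # close to sigma^2/n + 2 sigma^4/(n-1)
```

The two simulated distributions agree because, for a normal sample, $\bar{X}$ and $S^2$ are independent, matching the independence of $x$ and $y$ in the representation.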
Best Answer
We can avoid using the fact that $X^2\sim\sigma^2\chi_1^2$, where $\chi_1^2$ is the chi-squared distribution with $1$ degree of freedom, and calculate the expected value and the variance just using the definition. We have that $$ \operatorname E X^2=\operatorname{Var}X=\sigma^2 $$ since $\operatorname EX=0$ (see here).
Also, $$ \operatorname{Var}X^2=\operatorname EX^4-(\operatorname EX^2)^2. $$ The fourth moment $\operatorname EX^4$ is equal to $3\sigma^4$ (see here). Hence, $$ \operatorname{Var}X^2=3\sigma^4-\sigma^4=2\sigma^4. $$
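Both moments are easy to confirm by simulation for a mean-zero normal variable (illustrative $\sigma=1.3$):

```python
import numpy as np

sigma = 1.3
rng = np.random.default_rng(2)
x = rng.normal(0.0, sigma, size=1_000_000)

print(np.mean(x**2), sigma**2)      # E X^2  = sigma^2
print(np.mean(x**4), 3 * sigma**4)  # E X^4  = 3 sigma^4
print(np.var(x**2), 2 * sigma**4)   # Var X^2 = 2 sigma^4
```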