Let
$s^2=\frac{n}{n-1}\left(\overline{x^2}-\left(\overline x\right)^2\right)=\frac{1}{n-1}\sum_{i=1}^n \left(x_i-\overline x\right)^2$. Define the centered r.v.'s $y_i=x_i-\mathbb Ex_1$ and rewrite the sample variance in terms of these r.v.'s:
$$
s^2=\frac{1}{n-1}\sum_{i=1}^n \left(y_i-\overline y\right)^2 = \frac{n}{n-1}\left(\overline{y^2}-\left(\overline y\right)^2\right)=\overline{y^2}-\left(\overline y\right)^2+\frac{s^2}{n}.
$$
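To see the first and last equalities here: $x_i-\overline x = y_i-\overline y$, and since $s^2=\frac{n}{n-1}\left(\overline{y^2}-\left(\overline y\right)^2\right)$,
$$
\frac{n}{n-1}\left(\overline{y^2}-\left(\overline y\right)^2\right)=\overline{y^2}-\left(\overline y\right)^2+\underbrace{\frac{1}{n-1}\left(\overline{y^2}-\left(\overline y\right)^2\right)}_{=\,s^2/n}.
$$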
Note that $\sigma^2=\text{Var}(x_1)=\mathbb E[y_1^2]$.
Find the limiting distribution of $\sqrt{n}\left(s^2-\sigma^2\right)$:
$$\tag{1}\label{1}
\sqrt{n}\left(s^2-\sigma^2\right) = \sqrt{n}\left(\overline{y^2}-\left(\overline y\right)^2+\frac{s^2}{n} -\sigma^2 \right)=\sqrt{n}\left(\overline{y^2}-\sigma^2 \right) -\sqrt{n}\left(\overline y\right)^2+\sqrt{n}\frac{s^2}{n} .
$$
Next we prove that $\sqrt{n}\left(\overline y\right)^2 \xrightarrow{p} 0$ and $\sqrt{n}\,\dfrac{s^2}{n}=\dfrac{s^2}{\sqrt{n}}\xrightarrow{p} 0$ as $n\to\infty$. Indeed, by Slutsky's theorem,
$$\sqrt{n}\left(\overline y\right)^2 = \underbrace{\overline y}_{\xrightarrow{p}\ 0} \cdot \underbrace{\sqrt{n}\,\overline y}_{\xrightarrow{d}\ N(0,\sigma^2)}\ \xrightarrow{d}\ 0\cdot N(0,\sigma^2)=0.$$
Since the limit is a constant, convergence in distribution implies convergence in probability.
Next,
$$
\dfrac{s^2}{\sqrt{n}} = s^2\cdot \frac{1}{\sqrt{n}}\xrightarrow{p} \sigma^2\cdot 0=0.$$
We obtain that the second and third terms on the right-hand side of (\ref{1}) tend to zero in probability. Consider the first term: by the CLT applied to the i.i.d. r.v.'s $y_i^2$ (this requires $\mathbb E\left[y_1^4\right]<\infty$),
$$
\sqrt{n}\left(\overline{y^2}-\sigma^2 \right) = \sqrt{n}\left(\overline{y^2}-\mathbb E\left[y_1^2\right] \right) \xrightarrow{d} N(0,\text{Var}(y_1^2))=N(0,\mathbb E\left[y_1^4\right]-\sigma^4).
$$
By Slutsky's theorem,
$$\tag{2}\label{2}
\sqrt{n}(s^2-\sigma^2)\xrightarrow{d}N(0,\mathbb E\left[y_1^4\right]-\sigma^4)= N(0,{\mathbb E}\left[(x_1-\mathbb Ex_1)^4\right]-\sigma^4).
$$
So, you can say that $s^2$ is approximately normal with mean $\sigma^2$ and variance $$\dfrac{{\mathbb E}\left[(x_1-\mathbb Ex_1)^4\right]-\sigma^4}{n}.$$ But these words are extremely non-rigorous. The rigorous statement is (\ref{2}).
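As a quick sanity check, here is a minimal simulation sketch in Python (the choice of $\mathrm{Exp}(1)$ for the $x_i$, and the values of $n$ and the number of replications, are arbitrary assumptions): for $x_i\sim\mathrm{Exp}(1)$ we have $\sigma^2=1$ and $\mathbb E\left[(x_1-\mathbb Ex_1)^4\right]=9$, so (\ref{2}) predicts a limiting variance of $9-1=8$.

```python
import numpy as np

# Sanity check of (2) for x_i ~ Exp(1): sigma^2 = 1 and E[(x - Ex)^4] = 9,
# so sqrt(n) (s^2 - sigma^2) should have variance close to 9 - 1 = 8.
rng = np.random.default_rng(0)
n, reps = 1_000, 10_000
x = rng.exponential(scale=1.0, size=(reps, n))
s2 = x.var(axis=1, ddof=1)            # unbiased sample variance s^2, one per replication
stat = np.sqrt(n) * (s2 - 1.0)        # sqrt(n) (s^2 - sigma^2)

print("empirical variance:", stat.var())   # should be close to 8
```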
This counterexample is from the book *Limit Distributions for Sums of Independent Random Variables* by Gnedenko and Kolmogorov.
Let $X$ have density
$$f(x)=\begin{cases}
\dfrac{1}{2|x|\log^2(|x|)} &\text{if } 0<|x|< \frac 1e \\
0 &\text{otherwise.}
\end{cases}$$
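One can check that this is indeed a probability density: by symmetry, and since $-1/\log x$ is an antiderivative of $1/(x\log^2 x)$,
$$\int_{-1/e}^{1/e} \frac{dx}{2|x|\log^2(|x|)} = \int_0^{1/e}\frac{dx}{x\log^2 x} = \left[-\frac{1}{\log x}\right]_{0}^{1/e} = 1.$$
Note that $X$ has moments of all orders, since its support is bounded.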
The authors argue that $f_n$, the density of $\sum_{i=1}^n X_i$, satisfies $\displaystyle f_n(x) > \frac{c_n}{\left|x \log^{n+1}(|x|)\right|}$ for some positive constant $c_n$ in a neighborhood of $0$. So the density of $Z_n$ (a rescaling of $f_n$) is unbounded near $0$ for every $n$, and the uniform convergence in the theorems below fails.
They prove the following theorem:
Theorem: Suppose $X$ has density $f$ and $\mathbb E X=0$. If
- for some $m\geq 1$, $f_m$ (the density of $\sum_{i=1}^m X_i$) is in $L^r(\mathbb R)$ for some $r\in (1,2]$,
- $\int x^2 f(x)\,dx=\sigma^2 <\infty$ (i.e. $X$ has a second moment),

then $\displaystyle \sup_{x\in \mathbb R} \left|\sigma \sqrt n\, f_n(\sigma \sqrt n\, x) - \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \right| \xrightarrow[n\to \infty]{}0.$
In Petrov's *Sums of Independent Random Variables*, the following theorem is stated:
Theorem: Let $(X_n)$ be a sequence of i.i.d. r.v.'s with mean zero and variance $\sigma^2$, and let $f_n$ denote the density of $Z_n=\frac{1}{\sigma\sqrt n}\sum_{i=1}^n X_i$ (if it exists).
Then $\displaystyle \sup_{x\in \mathbb R} \left| f_n(x) - \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \right| \xrightarrow[n\to \infty]{}0$ if and only if $f_n$ is bounded for some $n$.
In Shiryaev's *Probability 2*, the following Local Central Limit Theorem is stated:
Theorem: Let $(X_n)$ be a sequence of i.i.d. r.v.'s with mean zero and variance $\sigma^2$. If $\int |\phi_{X_1}(t)|^r\, dt <\infty$ for some $r\geq 1$, then, for all sufficiently large $n$, $Z_n$ has a density $f_n$ such that
$\displaystyle \sup_{x\in \mathbb R} \left| f_n(x) - \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \right| \xrightarrow[n\to \infty]{}0$
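For example, for $X_1\sim U\left(-\frac12,\frac12\right)$ the characteristic function $\phi_{X_1}(t)=\frac{\sin(t/2)}{t/2}$ is square-integrable, so the theorem applies with $r=2$. Here is a minimal numerical sketch (Python; the grid step and the range of $n$ are arbitrary choices) illustrating the uniform convergence:

```python
import numpy as np

# Local CLT illustration for X_i ~ U(-1/2, 1/2): sigma^2 = 1/12, and
# phi(t) = sin(t/2)/(t/2) is in L^2, so the theorem applies with r = 2.
# The density of S_n = X_1 + ... + X_n is computed by grid convolution.
dx = 1e-3
f1 = np.ones(int(1 / dx))                # density of X_1: constant 1 on [-1/2, 1/2]
sigma = np.sqrt(1.0 / 12.0)

f = f1.copy()
for n in range(2, 13):
    f = np.convolve(f, f1) * dx          # density of S_n on a grid of spacing dx
    x = (np.arange(f.size) - (f.size - 1) / 2) * dx   # symmetric grid for S_n
    z = x / (sigma * np.sqrt(n))         # grid for Z_n = S_n / (sigma sqrt(n))
    fz = sigma * np.sqrt(n) * f          # density of Z_n by change of variables
    err = np.max(np.abs(fz - np.exp(-z ** 2 / 2) / np.sqrt(2 * np.pi)))
    print(n, err)                        # sup-norm error over the grid
```

The printed sup-norm errors decrease with $n$, consistent with the theorem.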
Regarding almost sure convergence, you should have a look at Rao's *A Limit Theorem for Densities*.
Best Answer
Hint: Delta Method
We know by the central limit theorem that $$\sqrt{n}(\bar{X}-\mu_{x})\xrightarrow{d} N(0,\sigma^2).$$
Expanding $h$ in a Taylor series around $\mu_x$ gives $$\sqrt{n}\left(h(\bar{X})-h(\mu_{x})\right)=\sqrt{n} (\bar{X}-\mu_{x})\,h^{\prime}(\mu_{x}) +\sqrt{n} (\bar{X}-\mu_{x})^2\frac{h^{\prime \prime}(\mu_{x})}{2!}+\cdots$$
If $h^{\prime}(\mu_{x})\neq 0$, the higher-order terms vanish in probability (by the same Slutsky argument as above), so
$$\sqrt{n}\left(h(\bar{X})-h(\mu_{x})\right)\cong \color{red}{\sqrt{n} (\bar{X}-\mu_{x})}\,h^{\prime}(\mu_{x})\xrightarrow{d} N\left(0,\sigma^2 \left(h^{\prime}(\mu_{x})\right)^2\right).$$
Now just take $h(t)=\frac{1}{t}$.
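A minimal simulation sketch of this hint, under assumed choices not in the original (Python, $x_i\sim\mathrm{Exp}$ with mean $2$): then $\mu_x=2$, $\sigma^2=4$, $h^{\prime}(\mu_x)=-1/\mu_x^2=-1/4$, so the limiting variance is $\sigma^2\left(h^{\prime}(\mu_x)\right)^2=1/4$.

```python
import numpy as np

# Delta-method sketch with h(t) = 1/t for x_i ~ Exp(mean 2) (an arbitrary choice):
# mu_x = 2, sigma^2 = 4, h'(mu_x) = -1/4, so the limiting variance is
# sigma^2 * h'(mu_x)^2 = 4 * (1/16) = 1/4.
rng = np.random.default_rng(1)
n, reps = 1_000, 10_000
xbar = rng.exponential(scale=2.0, size=(reps, n)).mean(axis=1)
stat = np.sqrt(n) * (1.0 / xbar - 1.0 / 2.0)   # sqrt(n) (h(xbar) - h(mu))

print("empirical variance:", stat.var())       # should be close to 1/4
```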