I didn't check that reference, but I guess they are assuming that the $Y_i$'s are independent with $E(Y_i)=\mu$ and $Var(Y_i)=\sigma^2$ for $i=1,2,\dots,n$, i.e. all the observations have the same (finite) mean $\mu$ and (finite) variance $\sigma^2$.

First note that $E(Y_i^2)=Var(Y_i)+E^2(Y_i)=\sigma^2+\mu^2$. Also, for $\bar{Y}=\dfrac{\sum_{i=1}^n Y_i}{n}$ we have
$$E(\bar{Y})=\dfrac{\sum_{i=1}^n E(Y_i)}{n}=\dfrac{n\mu}{n}=\mu.$$
In addition, using the independence of the $Y_i$'s,
$$Var(\bar{Y})=\dfrac{\sum_{i=1}^n Var(Y_i)}{n^2}=\dfrac{n\sigma^2}{n^2}=\dfrac{\sigma^2}{n}.$$
Now it is easy to find $E(\bar{Y}^2)=Var(\bar{Y})+E^2(\bar{Y})=\sigma^2/n+\mu^2$.

You should take the expectation of $\bar{Y}^2$ in the last line you wrote as well, i.e.
$$E(\hat{\sigma}^2)=\dfrac{1}{n}E\left(\sum_{i=1}^n Y_i^2\right)-E(\bar{Y}^2)=\dfrac{1}{n}\cdot n\cdot E(Y_i^2)-\sigma^2/n-\mu^2.$$
Now substitute $E(Y_i^2)=\sigma^2+\mu^2$ to get $E(\hat{\sigma}^2)=\sigma^2-\sigma^2/n$.
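A quick Monte Carlo check of this bias (an illustrative Python/NumPy sketch, not part of the original argument; the parameter values and the normal draws are my arbitrary choices, since the derivation needs only a common mean and variance):

```python
import numpy as np

# Monte Carlo check: the plug-in estimator
#   sigma_hat^2 = (1/n) * sum (Y_i - Ybar)^2
# has expectation sigma^2 - sigma^2/n = sigma^2 * (n-1)/n.
rng = np.random.default_rng(0)
n, reps = 10, 200_000
mu, sigma = 3.0, 2.0                  # arbitrary example values

y = rng.normal(mu, sigma, size=(reps, n))
sigma_hat_sq = y.var(axis=1)          # ddof=0 by default: divides by n

print(sigma_hat_sq.mean())            # close to sigma^2 * (n-1)/n = 3.6
print(sigma**2 * (n - 1) / n)         # 3.6
```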
To side-step dependencies arising when we consider the sample variance, we write
$$(n-1)s^2 = \sum_{i=1}^n\Big((X_i-\mu) -(\bar x-\mu)\Big)^2$$
$$=\sum_{i=1}^n\Big(X_i-\mu\Big)^2-2\sum_{i=1}^n\Big((X_i-\mu)(\bar x-\mu)\Big)+\sum_{i=1}^n\Big(\bar x-\mu\Big)^2$$
and after a little manipulation,
$$=\sum_{i=1}^n\Big(X_i-\mu\Big)^2 - n\Big(\bar x-\mu\Big)^2$$
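This identity can be sanity-checked numerically (a short Python/NumPy sketch; the data and the value of $\mu$ are arbitrary, since the identity is purely algebraic and holds for any fixed constant $\mu$):

```python
import numpy as np

# Check of the algebraic identity
#   (n-1) s^2 = sum_i (X_i - mu)^2 - n * (xbar - mu)^2
# for arbitrary data and an arbitrary constant mu.
rng = np.random.default_rng(4)
x = rng.normal(5.0, 2.0, size=50)     # any data will do
n, mu = x.size, 5.0                   # mu may be any fixed number

lhs = (n - 1) * x.var(ddof=1)
rhs = np.sum((x - mu) ** 2) - n * (x.mean() - mu) ** 2
print(abs(lhs - rhs))                 # ~ 0 up to floating-point error
```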
Therefore
$$\sqrt n(s^2 - \sigma^2) = \frac {\sqrt n}{n-1}\sum_{i=1}^n\Big(X_i-\mu\Big)^2 -\sqrt n \sigma^2- \frac {\sqrt n}{n-1}n\Big(\bar x-\mu\Big)^2 $$
Manipulating,
$$\sqrt n(s^2 - \sigma^2) = \frac {\sqrt n}{n-1}\sum_{i=1}^n\Big(X_i-\mu\Big)^2 -\sqrt n \frac {n-1}{n-1}\sigma^2- \frac {n}{n-1}\sqrt n\Big(\bar x-\mu\Big)^2 $$
$$=\frac {n\sqrt n}{n-1}\frac 1n\sum_{i=1}^n\Big(X_i-\mu\Big)^2 -\sqrt n \frac {n-1}{n-1}\sigma^2- \frac {n}{n-1}\sqrt n\Big(\bar x-\mu\Big)^2$$
$$=\frac {n}{n-1}\left[\sqrt n\left(\frac 1n\sum_{i=1}^n\Big(X_i-\mu\Big)^2 -\sigma^2\right)\right] + \frac {\sqrt n}{n-1}\sigma^2 -\frac {n}{n-1}\sqrt n\Big(\bar x-\mu\Big)^2$$
The term $n/(n-1)$ becomes unity asymptotically. The term $\frac {\sqrt n}{n-1}\sigma^2$ is deterministic and goes to zero as $n \rightarrow \infty$.
We also have $\sqrt n\Big(\bar x-\mu\Big)^2 = \left[\sqrt n\Big(\bar x-\mu\Big)\right]\cdot \Big(\bar x-\mu\Big)$. The first component converges in distribution to a Normal, the second converges in probability to zero. Then by Slutsky's theorem the product converges in probability to zero,
$$\sqrt n\Big(\bar x-\mu\Big)^2\xrightarrow{p} 0$$
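A small simulation sketch of this rate (Python/NumPy; the variable names and parameter values are mine): the expected size of $\sqrt n\big(\bar x-\mu\big)^2$ is exactly $\sigma^2/\sqrt n$, which shrinks as $n$ grows:

```python
import numpy as np

# The average size of sqrt(n) * (xbar - mu)^2 equals sigma^2 / sqrt(n),
# so it shrinks toward zero as n grows.
rng = np.random.default_rng(1)
mu, sigma, reps = 0.0, 1.0, 10_000

typical = {}
for n in (100, 2_500):
    xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    typical[n] = np.mean(np.sqrt(n) * (xbar - mu) ** 2)
    print(n, typical[n])   # about 0.1 for n=100, about 0.02 for n=2500
```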
We are left with the term
$$\left[\sqrt n\left(\frac 1n\sum_{i=1}^n\Big(X_i-\mu\Big)^2 -\sigma^2\right)\right]$$
Alerted by a lethal example offered by @whuber in a comment to this answer, we want to make certain that $(X_i-\mu)^2$ is not constant. Whuber pointed out that if $X_i$ is Bernoulli $(1/2)$, then $(X_i-\mu)^2 = 1/4$ is constant. So excluding variables for which this happens (any two-point distribution symmetric about its mean has the same problem), for the rest we have
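Whuber's degenerate case can be verified directly (a short Python/NumPy sketch; the constants are computed from the Bernoulli $(1/2)$ definition):

```python
import numpy as np

# whuber's case: X ~ Bernoulli(1/2), so mu = 1/2 and (X - mu)^2 = 1/4
# whether X is 0 or 1 -- a constant. Equivalently mu_4 - sigma^4 = 0,
# so the limiting normal would be degenerate.
values, p, mu = np.array([0.0, 1.0]), 0.5, 0.5

print(np.unique((values - mu) ** 2))  # [0.25]: a single constant value

sigma2 = p * (1 - p)                  # 0.25
mu4 = p * (0 - mu) ** 4 + (1 - p) * (1 - mu) ** 4
print(mu4 - sigma2 ** 2)              # 0.0
```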
$$\mathrm{E}\Big(X_i-\mu\Big)^2 = \sigma^2,\;\; \operatorname {Var}\left[\Big(X_i-\mu\Big)^2\right] = \mu_4 - \sigma^4$$
and so the term under investigation is a standard application of the classical Central Limit Theorem, and
$$\sqrt n(s^2 - \sigma^2) \xrightarrow{d} N\left(0,\mu_4 - \sigma^4\right)$$
Note: the above result of course also holds for normally distributed samples (where $\mu_4 = 3\sigma^4$, so the asymptotic variance is $2\sigma^4$), but in that case we also have available a finite-sample chi-square distributional result.
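A Monte Carlo sketch of the limit (Python/NumPy; the Exponential $(1)$ distribution is my choice of example, for which $\sigma^2 = 1$ and $\mu_4 = 9$, so the limiting variance should be $\mu_4 - \sigma^4 = 8$):

```python
import numpy as np

# Monte Carlo check of  sqrt(n) * (s^2 - sigma^2) -> N(0, mu_4 - sigma^4)
# using Exponential(1) draws: sigma^2 = 1, fourth central moment mu_4 = 9,
# so the limiting variance is mu_4 - sigma^4 = 8.
rng = np.random.default_rng(2)
n, reps = 1_000, 20_000
sigma2 = 1.0

x = rng.exponential(1.0, size=(reps, n))
s2 = x.var(axis=1, ddof=1)            # unbiased sample variance
z = np.sqrt(n) * (s2 - sigma2)

print(z.mean())                       # near 0
print(z.var())                       # near mu_4 - sigma^4 = 8
```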
Best Answer
Note that $(S^2)^2$ has terms involving $X_i^4$, and so $E[(S^2)^2]$ is a sum of terms involving $E[X_i^4]$. Thus, if the fourth moment is not finite, neither is $E[(S^2)^2]$ finite, nor is $\operatorname{var}(S^2)$ finite. (The expectation $E[S^2]=\sigma^2$ itself requires only a finite second moment.)

Some people say that various quantities such as expectations, variances, etc. must be said to be undefined when the corresponding integrals/sums diverge. Others reserve the term "undefined" for cases when the integrals/sums lead to indeterminate forms such as $\infty - \infty$. The latter group would say that for a Cauchy random variable $X$, $E[X]$ is undefined, $E[X^2]$ is defined (but infinite), and $\operatorname{var}(X)$ is undefined (since $E[X]$ is undefined and so $\operatorname{var}(X) = E[X^2] - (E[X])^2$ makes no sense). The former group would say that $E[X]$, $E[X^2]$, and $\operatorname{var}(X)$ are all undefined for a Cauchy random variable.
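A simulation sketch of the Cauchy pathology (Python/NumPy; illustrative, not from the answer above): the mean of $n$ standard Cauchy draws is again standard Cauchy, so averaging never concentrates, a symptom of $E[X]$ failing to exist:

```python
import numpy as np

# For Cauchy data the sample mean of n draws is again standard Cauchy,
# so averaging never concentrates: E[X] does not exist and the law of
# large numbers fails. The interquartile range of xbar stays near 2
# (the IQR of a standard Cauchy) no matter how large n is.
rng = np.random.default_rng(3)
reps = 20_000

iqr = {}
for n in (10, 1_000):
    xbar = rng.standard_cauchy(size=(reps, n)).mean(axis=1)
    q25, q75 = np.percentile(xbar, [25, 75])
    iqr[n] = q75 - q25
    print(n, iqr[n])                  # both near 2
```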