[Math] Consistency of sample variance $S^2$

statistics

Let $Y_1,\dots,Y_n$ be independent $N(\mu,\sigma^2)$ random variables. Their sample variance is:

$$ S^2=\sum_{i=1}^n \frac{(Y_i- \overline Y)^2}{(n-1)} $$

Treating $S^2$ as an estimator, is the estimator consistent?
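For reference, by "consistent" I mean that $S^2$ converges in probability to $\sigma^2$, i.e. for every $\varepsilon>0$,

$$ \lim_{n\to\infty} P\!\left(|S^2-\sigma^2|>\varepsilon\right)=0. $$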

Here is how I would do the problem and guidance would be greatly appreciated!

Approach #1: Can we not simply say that, since $S^2$ is divided by $n-1$, as $n$ approaches $\infty$ the sample variance gets closer and closer to zero, meaning the sample mean gets finer and finer, and thus the estimator is consistent?

Approach #2: We can also use the Weak Law of Large Numbers: since the $Y_i$ are independent and identically distributed, and both the mean and variance of $Y_i$ exist, the Weak Law of Large Numbers applies, which means the sample variance is consistent. I sketch the computation I have in mind below.
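Sketch for Approach #2 (assuming I may rewrite the sum of squares in the usual way):

$$ S^2=\frac{n}{n-1}\left(\frac{1}{n}\sum_{i=1}^n Y_i^2-\overline Y^{\,2}\right), $$

and by the Weak Law of Large Numbers $\frac{1}{n}\sum_{i=1}^n Y_i^2 \to \mu^2+\sigma^2$ and $\overline Y \to \mu$ in probability, so (by the continuous mapping theorem, and since $\frac{n}{n-1}\to 1$) $S^2 \to \sigma^2$ in probability.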

Is this a correct way of proving the consistency of the sample variance? Thanks so much!

Best Answer

First, note that the sample variance is an unbiased estimator of $\sigma^2$, hence $E[S^2]=\sigma^2$. Now, all that remains to be shown is that the variance of the estimator approaches zero as the sample size grows. This is shown to be the case in equation (25) of this link -- note that the numerator grows as $n^2$ while the denominator grows as $n^3$. So, as the sample size grows, the mean stays at $\sigma^2$ while the variance approaches zero.
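To make the final step explicit (this is only a sketch, using the standard formula for $\operatorname{Var}(S^2)$ under normality rather than the general expression from the link): for normal samples,

$$ \operatorname{Var}(S^2)=\frac{2\sigma^4}{n-1}\longrightarrow 0, $$

and by Chebyshev's inequality

$$ P\!\left(|S^2-\sigma^2|>\varepsilon\right)\le\frac{\operatorname{Var}(S^2)}{\varepsilon^2}\longrightarrow 0, $$

so $S^2$ converges in probability to $\sigma^2$, i.e. the estimator is consistent.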