Asymptotics – Exploring Asymptotic Normality and Consistency in Statistical Models

asymptotics, consistency

I have difficulties understanding the concept of asymptotic normality and consistency.

Take an estimator of a parameter that is consistent and asymptotically normally distributed. Because it is consistent, it converges to the true parameter. I don't understand how it can still be asymptotically normal when it converges to a constant.

Thank you very much in advance!!

Best Answer

Convergence to a constant does not mean that the estimator is exactly equal to this constant for any given sample. It just means that, given a big enough sample size, you can expect your estimator to be close to the true value of the parameter.
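Formally (this is the standard definition of convergence in probability; the question does not fix a particular estimator, so $\hat\theta_n$ here is a generic estimator of $\theta$), consistency means that for every $\varepsilon > 0$,

$$P\left(\left|\hat\theta_n - \theta\right| > \varepsilon\right) \to 0 \quad \text{as } n \to \infty.$$

Nothing in this statement pins down the *shape* of the distribution of $\hat\theta_n$ at any finite $n$.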

Asymptotic normality most often involves some sort of scaling (often by $\sqrt{n}$, where $n$ is the sample size). It means that we are not looking at the estimator per se, but at a scaled version of it, which does not converge to a constant.
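As a concrete illustration (using the sample mean as the estimator, which the question leaves unspecified): if $X_1, \dots, X_n$ are i.i.d. with mean $\mu$ and finite variance $\sigma^2$, then $\bar X_n$ is consistent for $\mu$, while the central limit theorem gives

$$\sqrt{n}\,\left(\bar X_n - \mu\right) \xrightarrow{d} N(0, \sigma^2).$$

The raw estimator $\bar X_n$ collapses onto the constant $\mu$; the $\sqrt{n}$-inflated error does not.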

You can look at it this way (this explanation is not very precise): without such scaling, when your sample size is big enough, the distribution of the estimator can be approximated by a normal distribution whose variance decreases with the sample size. The bigger the sample size, the smaller the variance $\Rightarrow$ in the limit, the variance vanishes completely and convergence to a constant is achieved.
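Here is a minimal simulation sketch of this picture (the specifics are assumptions, not from the question: NumPy, Exponential(1) data, and the sample mean as the estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 1.0   # mean and sd of the Exponential(1) distribution
reps = 2_000           # number of simulated samples per sample size

for n in [10, 100, 10_000]:
    # reps independent samples of size n; each row yields one sample mean
    xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

    # Consistency: the spread of the raw estimator shrinks toward 0,
    # so xbar piles up on the constant mu.
    # Asymptotic normality: the sqrt(n)-scaled error keeps a stable,
    # non-degenerate spread close to sigma (and is approximately normal).
    z = np.sqrt(n) * (xbar - mu)
    print(f"n={n:>6}:  sd(xbar) = {xbar.std():.4f}   "
          f"sd(sqrt(n)*(xbar - mu)) = {z.std():.4f}")
```

For each $n$, the first column shrinks toward $0$ (consistency), while the second stays near $\sigma = 1$, the non-degenerate spread of the limiting normal.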
