Solved – Intuitive explanation of desirable properties (Unbiasedness, Consistency, Efficiency) of statistical estimators

Tags: estimation, estimators, inference, mathematical-statistics, unbiased-estimator

From the literature I understand that the desirable properties of statistical estimators are:

  1. Unbiasedness – we want the estimator to give the correct parameter value $\theta$ on average, irrespective of the sample size – defined by

$$\mathbb{E}[\hat{\theta}] = \theta$$

  2. Consistency – we want larger sample sizes to give progressively better estimates, with the estimator $\hat{\theta}_n$ converging to the correct parameter value $\theta$ in probability – defined by

$$\lim_{n \to \infty} P\left(\left|\hat{\theta}_n - \theta\right| > \epsilon\right) = 0 \quad \text{for every } \epsilon > 0$$

  3. Efficiency – we want the unbiased estimator to have the lowest possible variance, as determined by the Cramér-Rao bound. Efficient estimators, however, need not exist in all situations. (All three properties are illustrated in the simulation sketch below.)
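
To make my understanding concrete, here is a small simulation sketch (assuming a hypothetical normal population with mean $\theta = 5$ and $\sigma = 2$, values chosen only for illustration) comparing the sample mean with the sample median:

```python
import numpy as np

rng = np.random.default_rng(42)
theta, sigma = 5.0, 2.0   # hypothetical N(theta, sigma^2) population
reps = 20_000             # repeated samples drawn at each sample size

for n in (10, 100, 1000):
    x = rng.normal(theta, sigma, size=(reps, n))
    mean_hat = x.mean(axis=1)        # sample mean of each sample
    med_hat = np.median(x, axis=1)   # sample median of each sample
    print(f"n={n:>4}:  mean: avg={mean_hat.mean():.3f}, var={mean_hat.var():.5f}"
          f" | median: avg={med_hat.mean():.3f}, var={med_hat.var():.5f}")

# Both estimators average close to theta at every n (unbiasedness), and
# both variances shrink toward 0 as n grows (consistency). The mean's
# variance is consistently smaller: for normal data the sample mean
# attains the Cramer-Rao bound sigma^2/n, while the median's variance
# is roughly pi*sigma^2/(2n), so the mean is the more efficient estimator.
```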

Still, I don't fully understand each of these properties or the differences between them. Please explain the intuitive meaning of each property, followed by the math behind it.

Reference link – https://www.cs.utah.edu/~suyash/Dissertation_html/node6.html

Best Answer

Unbiasedness means that, under the assumptions about the population distribution, the estimator equals the population parameter on average over repeated sampling. This is a nice property for the theory of minimum-variance unbiased estimators. However, I think unbiasedness is overemphasized. The mean squared error is a good measure of an estimator's accuracy: it equals the square of the estimator's bias plus its variance, $\text{MSE}(\hat{\theta}) = \text{Bias}(\hat{\theta})^2 + \text{Var}(\hat{\theta})$. Sometimes an estimator with a small bias has a smaller mean squared error than an unbiased estimator with a large variance.
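
As a sketch of that bias-variance trade-off (assuming a normal population, with hypothetical parameter values), compare the unbiased sample variance (divisor $n-1$) with the biased maximum-likelihood version (divisor $n$); the biased estimator comes out ahead on mean squared error here:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, n, reps = 0.0, 4.0, 10, 200_000   # hypothetical N(0, 4) population

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = samples.var(axis=1, ddof=1)   # divisor n-1, unbiased
s2_mle      = samples.var(axis=1, ddof=0)   # divisor n, biased

for name, est in [("unbiased (n-1)", s2_unbiased), ("MLE (n)", s2_mle)]:
    bias = est.mean() - sigma2
    mse  = np.mean((est - sigma2) ** 2)
    print(f"{name:>15}: bias = {bias:+.4f}, MSE = {mse:.4f}")

# The biased MLE typically shows the smaller MSE here: its squared bias
# is more than offset by its smaller variance.
```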

Biased estimators can be asymptotically unbiased, meaning the bias tends to 0 as the sample size grows. If an estimator is asymptotically unbiased and its variance goes to 0 as the sample size grows, then the estimator is consistent (in probability). Technically, in measure theory, there is a difference between convergence in probability and almost-sure convergence. The Cramér-Rao lower bound is a mathematical result showing that, within a particular parametric family of distributions, no unbiased estimator can have a variance less than the bound. So if you can show that your estimator achieves the Cramér-Rao lower bound, you have an efficient estimator.
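
As a sketch (assuming a normal population with known variance, for which the Cramér-Rao bound for estimating the mean is $\sigma^2/n$), one can check numerically that the sample mean attains the bound:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 1.0, 2.0   # hypothetical N(mu, sigma^2) population
reps = 20_000          # repeated samples at each sample size

for n in (10, 100, 1000):
    xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    crlb = sigma**2 / n   # Cramer-Rao bound for estimating mu
    print(f"n={n:>4}: Var(xbar) ~ {xbar.var():.5f}, CRLB = {crlb:.5f}")

# The simulated variance of the sample mean matches sigma^2/n at every n,
# i.e. the sample mean attains the Cramer-Rao bound and is efficient.
```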