Solved – Can bias of an estimate be decreased by increasing sample size

Tags: bias, consistency, estimation

I understand that for consistent estimators, the larger the sample size, the higher the probability that the estimate is close to the true value of the parameter. Now, using the sufficient condition for consistency, which includes asymptotic unbiasedness, can I say that the bias decreases as the sample size increases?

OR

Since unbiasedness is a finite-sample property, unlike consistency, is it the case that bias cannot be related to sample size at all?

Best Answer

The bias of an estimator $\hat \theta_n$ of a parameter $\theta^0$ is defined as

$$B(\hat \theta_n) = E(\hat \theta_n) - \theta^0,$$

where the $n$ subscript indicates that the estimator is a function of the sample size. It follows that the distribution of the estimator also depends on the sample size: in general, for each different $n$ the estimator has a different distribution (perhaps only slightly so), hence a different expected value and a different bias.
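A classical illustration (not from the original answer, but standard): for an i.i.d. sample from a distribution with variance $\sigma^2$, the maximum-likelihood-style variance estimator that divides by $n$ satisfies

$$\hat \sigma^2_n = \frac{1}{n}\sum_{i=1}^n (X_i - \bar X_n)^2, \qquad E(\hat \sigma^2_n) = \frac{n-1}{n}\,\sigma^2,$$

so its bias is

$$B(\hat \sigma^2_n) = -\frac{\sigma^2}{n},$$

which depends on $n$ and shrinks toward zero as $n$ grows: the estimator is biased for every finite $n$ but asymptotically unbiased.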

So the bias does, in general, depend on the sample size, and asymptotic unbiasedness guarantees only that it vanishes as $n \to \infty$. It may not be feasible to determine how the bias changes along the way (monotonically? non-monotonically?) for a given estimator.
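As a quick sanity check, one can estimate the bias by Monte Carlo. The sketch below (a hypothetical illustration, assuming normal data with true variance $\sigma^2 = 4$) simulates the divide-by-$n$ variance estimator at several sample sizes and compares the simulated bias with the theoretical value $-\sigma^2/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0      # true variance (assumed for this illustration)
reps = 200_000    # Monte Carlo replications

for n in (5, 20, 100):
    # reps independent samples of size n
    x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
    # ddof=0 divides by n, giving the biased (MLE-style) variance estimator
    mle = x.var(axis=1, ddof=0)
    est_bias = mle.mean() - sigma2
    print(f"n={n:4d}  simulated bias={est_bias:+.4f}  theoretical={-sigma2/n:+.4f}")
```

The simulated bias should track $-\sigma^2/n$ closely and shrink in magnitude as $n$ increases, illustrating an estimator whose bias decreases with sample size even though it is never exactly zero at any finite $n$.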
