Regression – Do Confidence and Prediction Intervals Shrink to a Point with Large Sample Sizes?

Tags: confidence-interval, prediction-interval, regression

My question applies to regression estimates. The formulae for confidence interval:
$$
\hat y \pm t_{\alpha/2,\, n-2} \sqrt{MSE} \sqrt{\frac{1}{n} + \frac{(x-\bar x)^2}{\sum (x_i - \bar x)^2}}
$$

and prediction interval:
$$
\hat y \pm t_{\alpha/2,\, n-2} \sqrt{MSE} \sqrt{1 + \frac{1}{n} + \frac{(x-\bar x)^2}{\sum (x_i - \bar x)^2}}
$$

both show widths that decrease with increasing $n$, yet neither appears to tend to 0 (see the quick numerical check below). But I vaguely remember someone telling me that the CI shrinks to a point.

It would be great if someone could clarify this.
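For concreteness, here is a minimal sketch in Python that evaluates both half-widths on simulated data (the model, sample size, noise level, and the point x0 are arbitrary choices made up for illustration, not part of the original question):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 3.0 * x + rng.normal(0, 1.5, n)   # true sigma = 1.5 (assumed)

# Least-squares fit of the simple linear model y = b0 + b1 * x
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
mse = np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2)   # MSE on n - 2 df

x0 = 5.0                                  # new x at which to form intervals
t = stats.t.ppf(0.975, df=n - 2)          # t_{alpha/2, n-2} for 95% intervals
lev = 1 / n + (x0 - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2)

ci_half = t * np.sqrt(mse) * np.sqrt(lev)       # CI for the mean response
pi_half = t * np.sqrt(mse) * np.sqrt(1 + lev)   # PI for a new observation
print(f"CI half-width: {ci_half:.3f}   PI half-width: {pi_half:.3f}")
```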

Best Answer

Recall that consistency means that the estimator converges in probability to the parameter: as the sample size grows, the distribution of the estimator becomes arbitrarily concentrated around the true parameter value.

If you construct a confidence interval based on a consistent estimator, it will therefore shrink to a point as the sample size grows: in the formula above, both $1/n$ and $(x-\bar x)^2/\sum (x_i-\bar x)^2$ go to 0 (provided $\sum (x_i-\bar x)^2 \to \infty$), so the CI half-width tends to 0. The prediction interval does not shrink to a point: the leading $1$ under the square root reflects the irreducible variance of a single new observation, so its half-width tends to $z_{\alpha/2}\,\sigma$ rather than 0.
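A quick way to see the difference is to evaluate the two half-widths for growing $n$. In this sketch (same made-up simulated setup as in the question's check, with sigma = 1.5 and intervals taken at x0 = 5), the CI half-width heads to 0 while the PI half-width settles near $z_{\alpha/2}\,\sigma \approx 1.96 \times 1.5 \approx 2.94$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sigma, x0 = 1.5, 5.0   # assumed noise level and evaluation point

for n in [20, 200, 2000, 20000]:
    x = rng.uniform(0, 10, n)
    y = 2.0 + 3.0 * x + rng.normal(0, sigma, n)

    # Least-squares fit and MSE, as in the question's formulas
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    mse = np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2)

    t = stats.t.ppf(0.975, df=n - 2)
    lev = 1 / n + (x0 - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2)
    print(f"n={n:6d}  CI half-width={t * np.sqrt(mse * lev):.4f}  "
          f"PI half-width={t * np.sqrt(mse * (1 + lev)):.4f}")
```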
