[Math] How to prove that $\hat \sigma^2$ is consistent for $\sigma^2$

regression, statistics

Consider a regression model $Y_n=X_n\beta +\varepsilon$, where $X_n$ is an $n \times p_n$ design matrix and $\varepsilon=(\varepsilon_1,\ldots,\varepsilon_n)'$ consists of independent and identically distributed errors with $E(\varepsilon_1)=0$ and $\operatorname{Var}(\varepsilon_1)=\sigma^2$. Define the estimator $\hat \sigma ^2$ of $\sigma^2$ by

$$\hat \sigma ^2=\frac{\left\|Y_n-X_n\hat\beta\right\|^2}{n-p_n}.$$

How can one prove that $\hat \sigma^2$ is consistent for $\sigma^2$?
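As a numeric sanity check of the definition, the estimator can be computed directly from OLS residuals. A minimal sketch (the design, coefficients, and $\sigma^2 = 4$ are arbitrary illustrative choices; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma2 = 500, 3, 4.0  # sample size, number of regressors, true error variance

X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)  # ||Y_n - X_n beta_hat||^2 / (n - p)
print(sigma2_hat)  # typically close to sigma2 = 4.0 at this sample size
```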

Best Answer

If you cannot assume normality, write the residuals as $\hat{e}_i = y_i - x_i'\hat{\beta} = e_i+x_i'\beta - x_i'\hat{\beta}=e_i+x_i'(\beta - \hat{\beta})$, and recall that $\hat{\beta}_n \xrightarrow{p}\beta$. Then \begin{align} \frac{1}{n-p}\|Y-X\hat{\beta}\|^2 &= \frac{1}{n-p}\sum_{i=1}^n\hat{e}_i^2\\ &= \frac{1}{n-p}\left(\sum_{i=1}^n e^2_i + 2\sum_{i=1}^n e_ix_i'(\beta-\hat{\beta}) + (\beta-\hat{\beta})'\sum_{i=1}^n (x_ix_i')(\beta-\hat{\beta})\right), \end{align} where the second and third terms converge in probability to $0$ (since $\beta - \hat{\beta} \xrightarrow{p} 0$ while $\frac{1}{n}\sum_{i=1}^n e_ix_i'$ and $\frac{1}{n}\sum_{i=1}^n x_ix_i'$ are stochastically bounded under standard moment conditions), and the first term converges to $\mathbb{E}[e_i^2]=\sigma^2$ by the law of large numbers, noting that $\frac{n}{n-p}\to 1$.
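The argument above can be checked by simulation with deliberately non-normal errors. A Monte Carlo sketch (uniform errors scaled to have variance $\sigma^2$; the sample sizes, dimension, and replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, p = 4.0, 3
a = np.sqrt(3 * sigma2)  # Uniform(-a, a) has variance a^2/3 = sigma2

def sigma2_hat(n):
    """One draw of the variance estimator for sample size n, non-normal errors."""
    X = rng.normal(size=(n, p))
    beta = np.ones(p)
    e = rng.uniform(-a, a, size=n)  # i.i.d., mean 0, variance sigma2, not normal
    y = X @ beta + e
    b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b_hat
    return r @ r / (n - p)

# Mean stays near sigma2 and the spread shrinks as n grows: consistency.
results = {}
for n in (50, 500, 5000):
    draws = np.array([sigma2_hat(n) for _ in range(200)])
    results[n] = (draws.mean(), draws.std())
    print(n, results[n])
```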

[1] Such a proof can be found in Econometrics by Bruce E. Hansen. If you can assume normality of the error term, then you can instead use the $\chi ^2$ distribution to prove consistency, which is slightly easier.


If you can assume normality, the proof becomes much easier. Since $$ \mathbb{E}\hat{\sigma}^2 = \sigma^2, $$ the estimator is unbiased and its MSE equals its variance. Under normality, $\|Y-X\hat{\beta}\|^2/\sigma^2 \sim \chi^2_{n-p}$, whose variance is $2(n-p)$, so \begin{align} \lim_{n\to \infty} \operatorname{MSE}(\hat{\sigma}^2) &= \lim_{n\to \infty} \operatorname{Var}\left(\frac{\sigma^2}{n-p} \cdot \frac{\|Y-X\hat{\beta}\|^2}{\sigma^2}\right)\\ & =\lim_{n\to \infty}\frac{\sigma^4}{(n-p)^2} \operatorname{Var}\left( \frac{\|Y-X\hat{\beta}\|^2}{\sigma^2}\right)\\ & = \lim_{n\to \infty}\frac{2(n-p)\sigma^4}{(n-p)^2}\\ &= \lim_{n\to \infty}\frac{2\sigma^4}{n-p}=0. \end{align} Convergence in mean square implies convergence in probability, so $\hat{\sigma}^2 \xrightarrow{p} \sigma^2$.
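The key distributional fact used here, $\|Y-X\hat{\beta}\|^2/\sigma^2 \sim \chi^2_{n-p}$ with mean $n-p$ and variance $2(n-p)$, can be verified by simulation. A sketch with a fixed hypothetical design (all dimensions and the replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma2 = 40, 5, 2.0
reps = 20000

X = rng.normal(size=(n, p))  # design held fixed across replications
beta = np.zeros(p)
# Residual-maker matrix M = I - X (X'X)^{-1} X', so RSS = y' M y
H = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - H

stats = np.empty(reps)
for r in range(reps):
    y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)
    rss = y @ M @ y           # ||Y - X beta_hat||^2
    stats[r] = rss / sigma2   # should be chi^2 with n - p degrees of freedom

# Sample mean should be near n - p = 35, sample variance near 2(n - p) = 70.
print(stats.mean(), stats.var())
```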
