Proving almost sure convergence of linear regression coefficients

Tags: convergence-divergence, linear regression, statistics

In the context of simple linear regression, suppose that $\epsilon_i$, $i=1,\dots,n$, are i.i.d., that $n^{-1}\sum_{i=1}^{n}x_{i} \rightarrow \mu$ with $|\mu| < \infty$ as $n \rightarrow \infty$, and that the empirical variance satisfies $n^{-1}\sum_{i=1}^{n}(x_i-\overline{x}_n)^2\rightarrow \alpha$ for some $\alpha > 0$.

Under this assumption, how can we prove that:

i) $\hat{\beta}_1\overset{a.s.}{\to} {\beta}_1$ and $\hat{\beta}_2\overset{a.s.}{\to} {\beta}_2$?

ii) $\widehat{\sigma^2} \overset{a.s.}{\to} \sigma^2$ when $n \to \infty$?

Here, symbols with a hat on top refer to the least squares estimators of the coefficients in $y_i=\beta_1+\beta_2x_i+\epsilon_i$, where the $\epsilon_i$ are not assumed to be normal. The errors are assumed to have zero mean, common variance $\sigma^2$ (homoskedasticity), and to be uncorrelated.
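
For concreteness, these are the usual estimators written out below; the variance estimator is shown with the $1/(n-2)$ normalization, which is an assumption on my part since the question does not say which normalization is used (the argument is the same with $1/n$):

$$ \hat{\beta}_2=\frac{\sum_{i=1}^n(x_i-\overline{x}_n)(y_i-\overline{y}_n)}{\sum_{i=1}^n(x_i-\overline{x}_n)^2},\qquad \hat{\beta}_1=\overline{y}_n-\hat{\beta}_2\,\overline{x}_n,\qquad \widehat{\sigma^2}=\frac{1}{n-2}\sum_{i=1}^n\bigl(y_i-\hat{\beta}_1-\hat{\beta}_2x_i\bigr)^2 . $$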

Best Answer

Observe that $$ \hat{\beta}_2-\beta_2=\frac{1}{\sum_{i=1}^n(x_i-\overline{x}_n)^2}\sum_{j=1}^n (x_j-\overline{x}_n)\left(\varepsilon_j-\frac 1n\sum_{i=1}^n\varepsilon_i\right). $$ Since $\sum_{j=1}^n (x_j-\overline{x}_n)=0$, the term involving $\frac 1n\sum_{i=1}^n\varepsilon_i$ drops out, and since $n^{-1}\sum_{i=1}^{n}(x_i-\overline{x}_n)^2\rightarrow \alpha>0$, it suffices to show that $\frac 1n\sum_{j=1}^n(x_j-\overline{x}_n)\varepsilon_j\to 0$ a.s. Because $\overline{x}_n\to\mu$ and $\frac 1n\sum_{i=1}^n\varepsilon_i\to 0$ a.s. by the strong law of large numbers, this reduces to proving that $$ \frac 1n\sum_{j=1}^nx_j \varepsilon_j\to 0\mbox{ a.s.} $$ This can be done, for example, by showing that $$ \sum_{N\geqslant 1}\mathbb E\left[2^{-2N}\max_{1\leqslant n\leqslant 2^N} \Bigl\lvert \sum_{j=1}^n x_j \varepsilon_j\Bigr\rvert^2\right] $$ is finite.
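
One way to bound that last series, sketched under the assumption that the $x_j$ are deterministic (otherwise condition on them) and using only that the errors are i.i.d. with mean zero and variance $\sigma^2$: the partial sums $S_n=\sum_{j=1}^n x_j\varepsilon_j$ form a martingale, so Doob's $L^2$ maximal inequality gives

$$ \mathbb E\Bigl[\max_{1\leqslant n\leqslant 2^N}S_n^2\Bigr]\leqslant 4\,\mathbb E\bigl[S_{2^N}^2\bigr]=4\sigma^2\sum_{j=1}^{2^N}x_j^2\leqslant C\,2^N \quad\text{for all large }N, $$

for some constant $C$, since $n^{-1}\sum_{j=1}^n x_j^2=n^{-1}\sum_{j=1}^n(x_j-\overline{x}_n)^2+\overline{x}_n^2\to\alpha+\mu^2<\infty$. The series is therefore dominated by $\sum_{N\geqslant 1}C\,2^{-N}<\infty$, and Markov's inequality together with Borel–Cantelli yields $2^{-N}\max_{1\leqslant n\leqslant 2^N}\lvert S_n\rvert\to 0$ a.s., which in turn implies $n^{-1}S_n\to 0$ a.s.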
