Regression – How Does Autocorrelation Affect OLS Coefficient Standard Errors?

autocorrelation · regression · standard-error

It seems that autocorrelation in the OLS residuals is not always an issue, depending on the problem at hand. But why would residual autocorrelation affect the coefficient standard errors? From the Wikipedia article on autocorrelation:

While it does not bias the OLS coefficient estimates, the standard
errors tend to be underestimated (and the t-scores overestimated) when
the autocorrelations of the errors at low lags are positive.
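To see the quoted effect concretely, here is a minimal Monte Carlo sketch in Python; the AR(1) setup, coefficient of 0.8, and sample size are just illustrative assumptions, not anything specific to the question:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_sim, rho = 200, 2000, 0.8          # sample size, replications, AR(1) coefficient

# a fixed, positively autocorrelated regressor (the distortion needs both the
# regressor and the errors to be autocorrelated)
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.normal()
X = np.column_stack([np.ones(T), x])

slopes, naive_ses = [], []
for _ in range(n_sim):
    # AR(1) errors: eps_t = rho * eps_{t-1} + u_t, with positive rho
    eps = np.zeros(T)
    for t in range(1, T):
        eps[t] = rho * eps[t - 1] + rng.normal()
    y = 1.0 + 2.0 * x + eps              # true beta = (1, 2)

    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat
    sigma2 = resid @ resid / (T - 2)
    cov_naive = sigma2 * np.linalg.inv(X.T @ X)   # the "naive" OLS covariance
    slopes.append(beta_hat[1])
    naive_ses.append(np.sqrt(cov_naive[1, 1]))

print("empirical sd of slope estimates :", np.std(slopes))
print("average naive OLS standard error:", np.mean(naive_ses))
# With rho > 0 the first number is noticeably larger than the second,
# i.e. the naive standard errors are too small and the t-scores too large.
```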

Best Answer

Suppose your OLS regression is well specified and contains all the right explanatory variables, but the residuals have an unspecified correlation structure:
$$ y_t = x_t' \beta + \epsilon_t, \qquad \mathbb{V}[\mathbf{\epsilon}] = \Omega. $$
The OLS estimates are
$$ \hat\beta = (X'X)^{-1} X'Y = \beta + (X'X)^{-1} X'\mathbf{\epsilon}, $$
and their variance is
$$ \mathbb{V}[\hat\beta] = \mathbb{E} \bigl[ (X'X)^{-1} X'\mathbf{\epsilon}\mathbf{\epsilon}'X (X'X)^{-1} \bigr]. $$
Typically, at this stage, we'd have to assume something like the existence of the probability limit $\frac1T X'X \to \Sigma$, so that
$$ T\,\mathbb{V}[\hat\beta] \to \Sigma^{-1} \, {\rm plim} \Bigl[ \frac1T X'\mathbf{\epsilon}\mathbf{\epsilon}'X \Bigr] \Sigma^{-1} = \Sigma^{-1} \, {\rm plim} \Bigl[ \frac1T X'\Omega X \Bigr] \Sigma^{-1}. $$
This expression is different from what the naive OLS standard errors produce, which are based on $\hat\sigma^2 (X'X)^{-1}$ and thus implicitly assume $\Omega = \sigma^2 I$, so in general the OLS standard errors are wrong.
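In practice the middle term is estimated rather than known; one common choice is a HAC (Newey-West) covariance estimator. A minimal sketch with statsmodels, where the AR(1) data-generating process and the lag truncation of 10 are arbitrary assumptions, showing how the naive and robust standard errors diverge:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T, rho = 500, 0.8

# autocorrelated regressor and AR(1) errors, so the two covariances differ
x = np.zeros(T)
eps = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.normal()
    eps[t] = rho * eps[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + eps

X = sm.add_constant(x)
fit_naive = sm.OLS(y, X).fit()                                        # classical SEs
fit_hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 10})  # Newey-West SEs

print("naive SEs:", fit_naive.bse)   # typically too small here
print("HAC SEs:  ", fit_hac.bse)     # account for the error autocorrelation
```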

Of course, if $X$ can be considered fixed, then there is no need for asymptotic approximations, and $X$ can be carried through the expectations, so that
$$ \mathbb{V}[\hat\beta] = (X'X)^{-1} X'\Omega X (X'X)^{-1}, $$
to the same effect.
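A short numerical sketch of this fixed-$X$ formula, assuming AR(1) errors with known $\rho$ and unit innovation variance (so $\Omega_{ts} = \rho^{|t-s|}/(1-\rho^2)$), comparing the exact sandwich covariance with the naive $\sigma^2 (X'X)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(2)
T, rho = 200, 0.8

# fixed, positively autocorrelated regressor plus an intercept
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.normal()
X = np.column_stack([np.ones(T), x])

# AR(1) error covariance: Omega[t, s] = rho^{|t-s|} / (1 - rho^2)
idx = np.arange(T)
Omega = rho ** np.abs(idx[:, None] - idx[None, :]) / (1.0 - rho ** 2)

XtX_inv = np.linalg.inv(X.T @ X)
cov_sandwich = XtX_inv @ X.T @ Omega @ X @ XtX_inv   # correct covariance of beta-hat
cov_naive = Omega[0, 0] * XtX_inv                    # sigma^2 (X'X)^{-1}

print("correct slope SE:", np.sqrt(cov_sandwich[1, 1]))
print("naive slope SE:  ", np.sqrt(cov_naive[1, 1]))  # smaller when rho > 0
```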