I am looking at some time series regressions. When I use OLS, the standard error on one of the coefficients is 0.002777409. When I use the Newey-West correction, the standard errors are smaller:
OLS: 0.002777409
1 Lag: 0.001995232
3 Lags: 0.001902868
5 Lags: 0.001859962
7 Lags: 0.001834708
9 Lags: 0.001777297
Why is this possible, and why do they decrease with lag length?
Best Answer
One possible explanation is that the scores corresponding to your OLS estimates exhibit some negative autocorrelation. This would cause the standard errors to be smaller than under independence.
Consider the following very basic artificial example in R using the sandwich package for Newey-West standard errors. Under independence, the conventional standard error and the Newey-West standard error are very close. However, if we create a time series with positive autocorrelation (here AR(1) with coefficient 0.8), the conventional standard error is too small and the Newey-West standard error is much larger.
Conversely, if we create a series with negative autocorrelation (here AR(1) with coefficient -0.8), then the conventional standard error is too large and the Newey-West standard error is much smaller.
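The original R code is not reproduced here, so below is a self-contained Python sketch of the same phenomenon. It treats the sample mean as the coefficient of an intercept-only regression and computes both the conventional standard error and a Newey-West (Bartlett-weighted) standard error by hand; the helper names `ar1` and `se_mean`, the sample size, and the lag choice are my own illustrative assumptions.

```python
import numpy as np

def ar1(n, phi, rng):
    """Simulate an AR(1) series y[t] = phi * y[t-1] + e[t] with N(0,1) shocks."""
    e = rng.standard_normal(n)
    y = np.empty(n)
    y[0] = e[0]
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    return y

def se_mean(y, lags):
    """Conventional and Newey-West standard errors of the sample mean.

    The sample mean is the OLS coefficient of an intercept-only
    regression, so the scores are simply the centered observations.
    """
    n = len(y)
    u = y - y.mean()  # OLS residuals (= scores up to scaling)
    # Sample autocovariances gamma_0, ..., gamma_lags
    gamma = [u[l:] @ u[:n - l] / n for l in range(lags + 1)]
    conv = np.sqrt(gamma[0] / (n - 1))  # classical s / sqrt(n)
    # Bartlett-weighted long-run variance: gamma_0 + 2 * sum_l w_l * gamma_l
    lrv = gamma[0] + 2 * sum((1 - l / (lags + 1)) * gamma[l]
                             for l in range(1, lags + 1))
    nw = np.sqrt(max(lrv, 0.0) / n)
    return conv, nw

rng = np.random.default_rng(0)
for phi in (0.0, 0.8, -0.8):
    conv, nw = se_mean(ar1(4000, phi, rng), lags=10)
    print(f"phi = {phi:+.1f}: conventional = {conv:.4f}, Newey-West = {nw:.4f}")
```

With phi = 0 the two standard errors are close; with phi = 0.8 the Newey-West standard error is clearly larger; with phi = -0.8 it is much smaller, which is the pattern in the question above.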
In a more complex regression model the situation may not be that simple, and the effects need not be the same for all regression coefficients. Still, it is not at all unusual to get Newey-West standard errors that are smaller than the conventional standard errors.