Solved – Can I trust a regression if the variables are autocorrelated?

autocorrelation, regression, time series

Both variables (dependent and independent) show autocorrelation. The data are time series and stationary.

When I run the regression, the residuals appear not to be autocorrelated.
My Durbin-Watson statistic is greater than the upper critical value, so there is no evidence that the error terms are positively correlated. The ACF of the residuals also shows no apparent correlation, and the Ljung-Box statistic is smaller than its critical value.
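
For concreteness, here is a minimal R sketch of how diagnostics like these might be run; the data, model, and lag choice below are purely illustrative, and the Durbin-Watson test is taken from the lmtest package:

```r
# Illustrative data: an autocorrelated regressor, a response driven by it,
# and i.i.d. errors (all names and values here are hypothetical)
set.seed(1)
x <- arima.sim(model = list(ar = 0.7), n = 200)
y <- 2 + 0.5 * x + rnorm(200)
fit <- lm(y ~ x)

# Durbin-Watson test for first-order autocorrelation in the residuals
library(lmtest)
dwtest(fit)

# ACF of the residuals
acf(residuals(fit), main = "ACF of residuals")

# Ljung-Box test on the residuals (the lag choice is a judgment call)
Box.test(residuals(fit), lag = 10, type = "Ljung-Box")
```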

Can I trust my regression output? Are the t-statistics reliable?

Best Answer

The t-statistics are reliable in the absence of autocorrelation in the errors. The fact that the residuals don't display significant autocorrelation indicates, in a not terribly rigorous way, that the autocorrelation in your dependent variable is due to the autocorrelation in your independent variable. However, it's also important to remember that the difference between statistical significance and insignificance is often not itself statistically significant; e.g., a t-statistic of 1.8 vs. a t-statistic of 2.8 is a difference of only 1.0, hence the lack of rigor in the statement above.
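
As an informal check of the claim that error (rather than variable) autocorrelation is what matters, one could simulate a strongly autocorrelated regressor with i.i.d. errors and see whether the t-test keeps roughly its nominal 5% size; everything below (seed, sample size, AR coefficient) is an illustrative assumption:

```r
# Size check: strongly autocorrelated x, i.i.d. errors, true slope = 0.
# If the t-statistics are reliable, about 5% of replications should
# reject at the 5% level.
set.seed(2)
reject <- replicate(2000, {
  x <- arima.sim(model = list(ar = 0.9), n = 100)
  y <- rnorm(100)  # unrelated to x, so the errors are i.i.d.
  summary(lm(y ~ x))$coefficients["x", "Pr(>|t|)"] < 0.05
})
mean(reject)  # should come out near 0.05
```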

An alternative approach would be to model the data using time series analysis techniques, which, for R, are very briefly described in the CRAN Task View: Time Series Analysis. These techniques can give you sharper parameter estimates by explicitly modeling the cross-time correlation structure; if you don't model it explicitly, you are implicitly assuming that the only such structure in the data is the one induced by the independent variable.
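
As a rough sketch of what modeling the correlation explicitly might look like in R, here is a regression with AR(1) errors fit two ways, via nlme::gls and via arima() with an external regressor; the simulated data and parameter values are hypothetical:

```r
library(nlme)

# Illustrative data with AR(1) errors (names and values are hypothetical)
set.seed(3)
x <- arima.sim(model = list(ar = 0.7), n = 200)
e <- arima.sim(model = list(ar = 0.5), n = 200)
y <- 2 + 0.5 * x + e
dat <- data.frame(y = as.numeric(y), x = as.numeric(x))

# Regression with AR(1) errors via generalized least squares
fit_gls <- gls(y ~ x, data = dat, correlation = corAR1())
summary(fit_gls)  # standard errors account for the error autocorrelation

# The same idea via arima(): AR(1) errors with x as an external regressor
fit_arima <- arima(dat$y, order = c(1, 0, 0), xreg = dat$x)
fit_arima
```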