Solved – How to remove cyclical trend in residuals for time series regression

autocorrelation, regression, residuals, seasonality, time series

I have modeled a stationary time series with another related stationary time series. I'm having a problem with cyclicality in the residuals and I don't know how to fix it.

Here is my model:

$\text{TS}_1(t) = \beta_0 + \beta_1\cdot \text{TS}_2(t) + \epsilon$

Here is a chart of the residuals:

[Figure: residuals of the simple regression model]

There is clearly a strong trend. I tried to remove it by adding a 15-day lagged variable. The new residuals look a lot better, but there still appears to be some kind of trend or abnormality (they don't look random to me).

Here is the model with the Lag:

$\text{TS}_1(t) = \beta_0 + \beta_1\cdot \text{TS}_2(t) + \beta_2\cdot(\text{TS}_2(t-15)) + \epsilon$

[Figure: residuals of the lagged model]
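For concreteness, here is a minimal sketch of how a model like this could be fit in Python with pandas and statsmodels. The column names `ts1` and `ts2` and the daily frequency are placeholders I've assumed, not part of the original setup:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical daily data; ts1 and ts2 stand in for TS_1 and TS_2.
# df = pd.DataFrame({"ts1": ..., "ts2": ...}, index=pd.date_range(...))

def fit_lagged_model(df, lag=15):
    """Regress ts1 on ts2 and a 15-day lag of ts2, dropping rows lost to the shift."""
    X = pd.DataFrame({
        "ts2": df["ts2"],
        "ts2_lag": df["ts2"].shift(lag),  # TS_2(t - 15)
    }).dropna()
    X = sm.add_constant(X)                # intercept beta_0
    y = df["ts1"].loc[X.index]            # align ts1 with the rows kept in X
    return sm.OLS(y, X).fit()

# Residuals for inspection: fit_lagged_model(df).resid
```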

I haven't done anything like this before. I know that adding a lagged variable to an AR model can remove seasonality, but I don't know whether that applies to the errors of a regression of one time series on another.

Is adding a lagged variable to the model the appropriate way to remove trends in the residuals? What tests can I run (other than just looking at the chart) to decide whether the trend is still an issue? I ran the Durbin-Watson test (both models failed), but I don't know whether that test applies when modeling one time series from another.
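On the testing question, a sketch of the usual residual checks, assuming the residuals from either fit above are available as `resid`: Durbin-Watson only targets lag-1 autocorrelation, whereas a Ljung-Box test looks at several lags jointly, and a sample ACF plot makes leftover cycles easy to see.

```python
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_ljungbox

def residual_checks(resid, max_lag=30):
    """Return the Durbin-Watson statistic and a Ljung-Box test up to max_lag lags."""
    dw = durbin_watson(resid)                  # close to 2 => little lag-1 autocorrelation
    lb = acorr_ljungbox(resid, lags=[max_lag]) # small p-value => autocorrelation remains
    return dw, lb

# Visual check: a sample ACF plot of the residuals
# sm.graphics.tsa.plot_acf(resid, lags=30)
```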

Best Answer

Try adding a moving average MA(q) term; it looks like a spike upward in your errors is followed by a sharp drop. An MA(1) would add the term $\theta_1\epsilon_{t-1}$, which factors in the error from the day before. This might smooth out the large movements in your residuals.

AR(1), MA(q): $\text{TS}_{t} = \beta_0 + \beta_1\cdot\text{TS}_{t-1} + \beta_2\cdot\text{TS}_{t-15} + \theta_1\epsilon_{t-1} + \dots + \theta_q\epsilon_{t-q} + \epsilon_t$
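As a sketch of one way to implement this suggestion, statsmodels' SARIMAX fits a regression with ARMA errors when given exogenous regressors. Here I keep TS_2 and its 15-day lag as the regressors (rather than lags of TS_1 itself), and reuse the hypothetical `df` with columns `ts1`/`ts2` from the earlier sketch; both are my assumptions:

```python
import pandas as pd
import statsmodels.api as sm

# Exogenous regressors: TS_2(t) and TS_2(t-15); dropna removes rows lost to the shift.
exog = pd.concat([df["ts2"], df["ts2"].shift(15).rename("ts2_lag15")], axis=1).dropna()
endog = df["ts1"].loc[exog.index]

# order=(0, 0, 1) adds an MA(1) error term (theta_1 * epsilon_{t-1}); trend="c" keeps an intercept.
res = sm.tsa.SARIMAX(endog, exog=exog, order=(0, 0, 1), trend="c").fit(disp=False)
print(res.summary())

# Re-run the residual diagnostics above on res.resid; if autocorrelation remains,
# increase the MA order, e.g. order=(0, 0, q).
```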
