I am fitting an error-correction model to two monthly price time series. In Stata I am using the `varsoc` command to determine an appropriate number of lags:

    varsoc variable1 variable2

If I run `varsoc` with the default of 4 maximum lags, the suggested lag length according to AIC/FPE is 3. However, if I run `varsoc` with the `maxlag(12)` option, 12 lags are suggested. If I use `maxlag(20)`, 13 lags are suggested.
- Why is `varsoc` so sensitive to the value of `maxlag`?
- If this is the case, how should I decide which `maxlag` to use?

Running `varsoc` on each time series separately yields 8 lags (if a larger `maxlag` is chosen).
Best Answer
The `varsoc` command considers full (unrestricted) VAR or VECM models with lags 1 through `maxlag`. It may happen that the user sets `maxlag` to a relatively low number, so that the best model in the AIC or FPE sense is excluded from the search; this seems to be happening for `maxlag(4)` and `maxlag(12)`. However, I suspect the AIC- or FPE-suggested lag order would not grow indefinitely. It may simply be that a VAR(13) fits the data relatively better than a VAR(3) or a VAR(12), even after penalizing the number of parameters in the model. You should also not be surprised that a VAR(3) is selected rather than a VAR(4) when you set `maxlag(4)`: AIC and FPE need not decrease monotonically up to the "optimal" lag and increase monotonically after it.

If you think a VAR(13) is nonsensical and the lag order is too high, then perhaps you have excluded a relevant variable, neglected seasonality or a structural break, or the model suffers from some other misspecification.
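The search described above can be mimicked outside Stata. Here is a minimal sketch in Python/NumPy (not `varsoc` itself; the simulated data and helper names are hypothetical) that fits VAR(p) models by OLS for p = 1, ..., maxlag on a common estimation sample and picks the AIC-minimizing order, so you can see how widening the search window can change the selected lag:

```python
import numpy as np

def var_aic(y, p, maxlag):
    """AIC of a VAR(p) fitted by OLS. The first `maxlag` observations
    are dropped for every p, so all candidate models share the same
    estimation sample and their criteria are comparable."""
    T, K = y.shape
    Y = y[maxlag:]                       # common estimation sample
    n = Y.shape[0]
    # Regressors: a constant plus p lags of every series.
    X = np.hstack([np.ones((n, 1))] +
                  [y[maxlag - j:T - j] for j in range(1, p + 1)])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    U = Y - X @ B                        # residuals
    sigma = (U.T @ U) / n                # residual covariance matrix
    k = B.size                           # number of estimated parameters
    return np.log(np.linalg.det(sigma)) + 2 * k / n

def select_lag(y, maxlag):
    """Return the AIC-minimizing lag order among 1..maxlag."""
    aics = [var_aic(y, p, maxlag) for p in range(1, maxlag + 1)]
    return 1 + int(np.argmin(aics))

# Hypothetical bivariate data with a long (lag-13) dependence,
# loosely mimicking seasonality in monthly series.
rng = np.random.default_rng(1)
T, K = 400, 2
y = np.zeros((T, K))
for t in range(13, T):
    y[t] = 0.3 * y[t - 1] + 0.25 * y[t - 13] + rng.normal(size=K)

for m in (4, 12, 20):
    print("maxlag =", m, "-> selected lag:", select_lag(y, m))
```

If the true dynamics involve a lag beyond the search window, the criterion can only pick the best of the models it is allowed to compare, which is exactly why a small `maxlag` can "hide" the preferred order.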