R – Optimal Lag Length in VECM Using Vars R Package

Tags: r, vector-autoregression, vector-error-correction-model

I have some series that are cointegrated, so I know I should fit a vector error correction model (VECM). However, I have found no guidance on choosing the optimal lag length, say lagLength.

I am using the "vars" package in R. To check for cointegration I used ca.jo(..., K = cointegrationLength), and then used cajorls(..., K = lagLength) to fit the VECM.

I do not understand:

  1. the right interaction between K in both functions;
  2. an "optimal" criterion for choosing the lag length.

Best Answer

For VEC models, you should select the number of lags based on information criteria computed for the VAR model on the levels of your time series. For that you can use the function VARselect from the same vars package.
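A minimal sketch of that step, assuming your level series are the columns of a matrix y (the simulated data here are purely illustrative):

```r
library(vars)

# Illustrative data: two cointegrated random walks sharing a stochastic trend
set.seed(1)
common <- cumsum(rnorm(200))
y <- cbind(y1 = common + rnorm(200),
           y2 = 0.5 * common + rnorm(200))

# Lag order for the VAR in levels, chosen by information criteria
sel <- VARselect(y, lag.max = 10, type = "const")
sel$selection   # lag orders picked by AIC, HQ, SC (BIC) and FPE
```

The criteria often disagree; SC (BIC) tends to pick the most parsimonious lag order.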

The function cajorls does not have an argument $K$. It does, however, have an argument $r$, which denotes the cointegration rank. The argument $K$ of the function ca.jo controls the number of lags of the VEC model.

The usual workflow for estimating a VEC model is the following (rough outline). Suppose your time series are stored in the matrix y.

  1. Find the number of lags using VARselect(y)

  2. Determine the cointegration rank using the function ca.jo, passing the number of lags found in the first step as the argument K.

  3. Fit the VEC model using the cointegration rank determined in the second step. This is done with the function cajorls, to which you pass the result of ca.jo and the number of cointegrating vectors r.
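The three steps above can be sketched end to end. This assumes the vars package is installed and that y holds the level series; the simulated data and the choice of the SC (BIC) criterion and r = 1 are illustrative assumptions, not part of the answer:

```r
library(vars)  # also loads urca, which provides ca.jo and cajorls

# Illustrative data: two cointegrated random walks
set.seed(1)
common <- cumsum(rnorm(200))
y <- cbind(y1 = common + rnorm(200),
           y2 = 0.5 * common + rnorm(200))

# Step 1: lag order for the VAR in levels, here taken from SC (BIC)
p <- VARselect(y, lag.max = 8, type = "const")$selection["SC(n)"]

# Step 2: Johansen trace test; K is the lag order from step 1 (ca.jo needs K >= 2)
vecm <- ca.jo(y, type = "trace", ecdet = "const", K = max(p, 2))
summary(vecm)   # compare test statistics to critical values to read off r

# Step 3: estimate the VECM with the chosen cointegration rank (here r = 1)
fit <- cajorls(vecm, r = 1)
fit$beta        # estimated cointegrating vector(s)
```

In practice the rank r in step 3 should come from inspecting the summary(vecm) output, not be hard-coded as it is in this sketch.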