Regression Analysis – Regression with I(0) Dependent Variable and I(1) Independent Variable

regression | stationarity | time series

I am running time-series regressions in which the dependent variable is stationary, $I(0)$, and the independent variable is integrated of order one, $I(1)$. I do not want to run regressions with a differenced independent variable, because differencing does not properly address my research question. How can I adjust the regression so that I can keep the independent variable in levels, i.e. as $I(1)$?

I saw some information suggesting that an ARDL model can fix this, but the resources I found did not make much sense to me.

Here is a general layout of the regression models I seek to use:

a) $y_t = \alpha + \beta_1 x_t + \epsilon_t$
b) $y_t = \alpha + \beta_1 x_t + \beta_2 \text{Dummy}_t + \epsilon_t$
c) $y_t = \alpha + \beta_1 x_t + \beta_2 \text{Dummy}_t + \beta_3 (\text{Dummy}_t \times x_t) + \epsilon_t$

Thank you.

Best Answer

If $y_t\sim I(0)$ and $x_t\sim I(1)$, then you have to difference $x_t$. If you do not, the regression is unbalanced: the right-hand side of the model equation has a variance that grows without bound over time, while the left-hand side stays bounded in distribution, so the two sides diverge from each other.
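A small simulation illustrates the imbalance. The data-generating process below is hypothetical (a white-noise-plus-$\Delta x$ dependent variable and a random-walk regressor, with arbitrary coefficients); it is a sketch of the general point, not the asker's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5_000

dx = rng.normal(size=T)             # stationary innovations
x = np.cumsum(dx)                   # I(1) regressor (random walk)
y = 0.5 * dx + rng.normal(size=T)   # I(0): depends on changes in x, not its level

def ols_slope(y, x):
    """OLS slope of y on a constant and x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Levels regression y_t = a + b*x_t + e_t: the sides have incompatible
# orders of integration, and the slope estimate is pushed toward zero.
b_levels = ols_slope(y, x)

# Differencing the regressor balances the equation (both sides I(0)),
# and the slope recovers the true coefficient 0.5.
b_diff = ols_slope(y[1:], np.diff(x))

print(b_levels, b_diff)
```

The levels slope shrinks toward zero as the sample grows, while the differenced regression recovers the coefficient on $\Delta x_t$.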

Using a distributed lag of $x_t$ instead is technically possible but less transparent. The only combinations of lags of $x_t$ that are valid right-hand-side variables in your model are stationary combinations, and these can be expressed more transparently in terms of $\Delta x_t$ and possibly its lags.
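To see why only certain lag combinations work, note that a distributed lag $b_0 x_t + b_1 x_{t-1}$ of a random walk is stationary only when the weights sum to zero, in which case it is just $b_0 \Delta x_t$. A quick check (with a simulated random walk; the weights are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100_000
x = np.cumsum(rng.normal(size=T))    # I(1): a random walk

# Weights (1, -1) sum to zero: the combination collapses to Δx_t, which is I(0).
balanced = x[1:] - x[:-1]

# Weights (1, -0.5) sum to 0.5: the combination still contains a unit root.
unbalanced = x[1:] - 0.5 * x[:-1]

print(np.var(balanced))    # close to the innovation variance of 1
print(np.var(unbalanced))  # far larger, and growing with the sample size
```

So any stationary distributed lag of $x_t$ can be rewritten in terms of $\Delta x_t$ and its lags, which is the more transparent parameterization.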
