Time Series GARCH Model – Convergence of Conditional Variance to Unconditional Variance

Tags: garch, time-series

Suppose I have a monthly, stationary time series. The series seems to exhibit some ARCH effects, so I model its variance as a GARCH process. I obtain the following output for a GARCH(1,1) model:

  • alpha ($\alpha$): 0.07
  • beta ($\beta$): 0.70
  • intercept ($c$): 0.00008

I obtain the unconditional variance of the GARCH(1,1) model by taking the expectation of the GARCH equation:

$$ E(V_t) = E(c + \alpha\epsilon^2_{t-1}+\beta V_{t-1}) $$
$$ V = c + \alpha V + \beta V $$
$$ V = \frac{c}{1-\alpha-\beta} $$
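For concreteness, plugging in my estimates gives
$$ V = \frac{0.00008}{1-0.07-0.70} = \frac{0.00008}{0.23} \approx 0.000348, $$
which (if the series is in decimal returns) corresponds to an unconditional volatility of about $\sqrt{0.000348}\approx 1.9\%$ per month.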

But now I would like to assess how many months it would take for a typical shock (e.g., one standard deviation) to dissipate, i.e., for the conditional variance to converge to its unconditional variance. How would one do this?

Best Answer

Due to the nature of the autoregressive component in GARCH, shocks dissipate at an exponential rate but never fully.* So to answer your question, for a shock to dissipate fully would take an infinitely long time, and the long-run variance would never be reached, only approached arbitrarily closely.
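To see the rate of decay, iterate the variance forecast of the GARCH(1,1) (using your notation, with $V$ the unconditional variance): for a horizon of $h$ periods,
$$ E_t(V_{t+h}) - V = (\alpha+\beta)^{\,h-1}\,\bigl(V_{t+1} - V\bigr), $$
so the gap between the conditional and the unconditional variance shrinks by a factor $\alpha+\beta$ (the persistence) every period.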

Therefore, people study the half-life instead: $$ \ell:=\frac{\ln(0.5)}{\ln\left(\sum_{i=1}^s \alpha_i+\sum_{j=1}^r \beta_j\right)} $$ where the $\alpha$s and $\beta$s are the GARCH model coefficients. In the case of a GARCH(1,1), $$ \ell=\frac{\ln(0.5)}{\ln(\alpha_1+\beta_1)}. $$ Half-life tells you after how many periods half of the shock has dissipated.

In your case, $\ell=\frac{\ln(0.5)}{\ln(0.07+0.70)}\approx 2.65$.
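Here is a minimal Python sketch of these calculations with the estimates from your question; the size of the initial shock is an arbitrary illustration (a conditional variance of twice the long-run level):

```python
import numpy as np

# GARCH(1,1) estimates from the question
alpha, beta, c = 0.07, 0.70, 0.00008

persistence = alpha + beta          # rate at which variance shocks decay
V = c / (1 - persistence)           # unconditional (long-run) variance

# Half-life: number of periods until half of a shock has dissipated
half_life = np.log(0.5) / np.log(persistence)
print(f"unconditional variance: {V:.6f}")
print(f"half-life: {half_life:.2f} months")

# Decay of the gap between conditional and unconditional variance,
# starting from a hypothetical conditional variance of twice the long-run level
gap0 = 2 * V - V
gap = gap0
for h in range(1, 13):
    gap *= persistence
    print(f"month {h:2d}: conditional variance ~ {V + gap:.6f} "
          f"({100 * gap / gap0:5.1f}% of the shock remaining)")
```

Consistent with the half-life of about 2.65, roughly half of the shock is gone after 2 to 3 months, and since $(\alpha+\beta)^9 \approx 0.095$, over 90% has dissipated after about nine months.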

*This is in contrast to an ARCH(s) model where shocks completely dissipate after $s$ periods.
