Solved – How to derive the conditional likelihood for an AR-GARCH model

garch, maximum likelihood, time series

Let us start from the AR(1)-GARCH(1,1) model,

$r_t=\phi_0+\phi_1r_{t-1}+a_t$

$a_t=\epsilon_t\sigma_t$

$\sigma_t^2=\alpha_0+\alpha_1a_{t-1}^2+\beta_1\sigma_{t-1}^2$

where $\{\epsilon_t\}$ is a Gaussian white noise series with mean 0 and variance 1.

Assume that we observe the returns $r_1, r_2, \dots, r_T$. How do we derive the conditional likelihood? Many books give no details about this step.

The first question:
I think the parameters we need to estimate are $\phi_0,\phi_1,\alpha_0,\alpha_1,\beta_1$. The conditional distribution of $r_t$ given $r_1,r_2,\dots,r_{t-1}$ is $N(\phi_0+\phi_1 r_{t-1},\sigma_t^2)$. By multiplying these conditional densities together we can estimate $\phi_0,\phi_1$ and then compute all the $a_t$ and $\sigma_t$. But how do we go a step further and estimate $\alpha_0,\alpha_1,\beta_1$ by MLE? Given the observations up to time $t-1$, $\sigma_t$ is already measurable, i.e. it involves no randomness.
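
To be concrete, the product of the conditional densities I have in mind is (conditioning on $r_1$ and on some starting value for $\sigma_2^2$, for example the sample variance of the returns)

$L(\phi_0,\phi_1,\alpha_0,\alpha_1,\beta_1)=\prod_{t=2}^{T}\frac{1}{\sqrt{2\pi\sigma_t^2}}\exp\left(-\frac{(r_t-\phi_0-\phi_1 r_{t-1})^2}{2\sigma_t^2}\right),$

where each $\sigma_t^2$ is built recursively from $a_{t-1}$ and $\sigma_{t-1}^2$.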

The second question:
Is there a method that derives the "joint" likelihood of all five parameters in one step, rather than the two-step procedure above?

Best Answer

You can follow the steps in this question. The only thing you need to remember is that

$a_t = r_t - \phi_0 - \phi_1 r_{t-1}$, and hence $\epsilon_t = a_t/\sigma_t$.

You are very close to the answer: using the fact that

$r_t \vert r_{t-1},...,r_{1} \sim N(\phi_0 + \phi_1 r_{t-1}, \sigma_t^2)$

you are basically done.
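
Spelling it out: conditional on $r_1$ (and on a starting value for $\sigma_2^2$), the log-likelihood is

$\ell(\phi_0,\phi_1,\alpha_0,\alpha_1,\beta_1)=-\frac{1}{2}\sum_{t=2}^{T}\left[\log(2\pi)+\log\sigma_t^2+\frac{a_t^2}{\sigma_t^2}\right],$

with $a_t=r_t-\phi_0-\phi_1 r_{t-1}$ and $\sigma_t^2=\alpha_0+\alpha_1 a_{t-1}^2+\beta_1\sigma_{t-1}^2$ computed recursively. Every summand depends on all five parameters through this recursion, so maximizing $\ell$ numerically over $(\phi_0,\phi_1,\alpha_0,\alpha_1,\beta_1)$ at once is exactly the one-step joint estimation asked about in the second question; no separate step for $\alpha_0,\alpha_1,\beta_1$ is needed.

As a rough illustration of the joint maximization, here is a minimal numerical sketch in Python. It is not taken from any particular package: the function name `neg_loglik`, the simulated placeholder returns, the starting values, the bounds, and the convention of initializing the variance recursion at the sample variance of the residuals are all just illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, r):
    """Negative conditional log-likelihood of an AR(1)-GARCH(1,1) model.

    params = (phi0, phi1, alpha0, alpha1, beta1); r is the observed return series.
    The likelihood is conditional on r[0] and on a starting value for the variance.
    """
    phi0, phi1, alpha0, alpha1, beta1 = params
    a = r[1:] - phi0 - phi1 * r[:-1]        # residuals a_t for t = 2, ..., T
    n = len(a)
    sigma2 = np.empty(n)
    sigma2[0] = np.var(a)                   # illustrative starting value for the variance recursion
    for t in range(1, n):
        sigma2[t] = alpha0 + alpha1 * a[t - 1] ** 2 + beta1 * sigma2[t - 1]
    if np.any(sigma2 <= 0):
        return np.inf                       # guard against nonpositive variances
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(sigma2) + a ** 2 / sigma2)

# Joint estimation of all five parameters in one numerical optimization.
rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(1000)        # placeholder data; replace with observed returns

x0 = np.array([0.0, 0.1, 1e-5, 0.05, 0.9])  # rough starting values
bounds = [(None, None), (-0.999, 0.999), (1e-12, None), (0.0, 1.0), (0.0, 1.0)]
res = minimize(neg_loglik, x0, args=(r,), method="L-BFGS-B", bounds=bounds)
phi0_hat, phi1_hat, alpha0_hat, alpha1_hat, beta1_hat = res.x
```

In practice one would also check the stationarity condition $\alpha_1+\beta_1<1$ and compute standard errors (for example from the numerical Hessian), but the sketch already shows the main point: there is a single objective function in all five parameters.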
