Time Series Analysis – How to Measure the Goodness of Fit of a GARCH Model

Tags: garch, goodness-of-fit, time-series

In linear regression, we have $R^2$ to measure the goodness of fit of the model.

Here is the problem: is there a similar statistical measure to assess the goodness of fit of a GARCH model to the raw data?

Best Answer

A GARCH model assumes a perfect fit for the conditional variance equation. This feature is due to the definition/construction of the GARCH model (note that there is no error term in the conditional variance equation of a GARCH model).
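For concreteness, in a standard GARCH(1,1) specification the conditional variance equation reads

$$
\varepsilon_t = \sigma_t z_t, \qquad z_t \sim \text{i.i.d.}(0,1), \qquad
\sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2 ,
$$

so once the parameters $(\omega, \alpha, \beta)$ and the past data are given, $\sigma_t^2$ is determined exactly; there is no innovation in the variance equation itself.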

A class of conditional variance models that allows for an imperfect fit is that of stochastic volatility models.
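For comparison, a canonical (log-normal) stochastic volatility model does include an innovation in the variance equation itself:

$$
\varepsilon_t = \sigma_t z_t, \qquad
\log \sigma_t^2 = \mu + \phi\,(\log \sigma_{t-1}^2 - \mu) + \eta_t, \qquad \eta_t \sim N(0, \sigma_\eta^2),
$$

so the latent variance is not pinned down exactly by past observations.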

In any case, measuring the goodness of fit of conditional variance models is problematic because the conditional variance is unobserved. Conventional techniques, such as regressing the dependent variable (the conditional variance) on the regressors (e.g. lagged conditional variances and lagged squared error terms) and taking the $R^2$ as a measure of fit, therefore do not work.

You can assess how "good" a GARCH model is by looking at the standardized residuals and checking how well they match the model assumptions. Normally you assume them to be i.i.d. and to follow a certain distribution, e.g. a normal distribution; this assumption is used in constructing the likelihood function, which is then maximized to fit the model (maximum likelihood estimation, MLE). Thus you can look at the standardized residuals and see how close they are to being i.i.d. and to following the assumed distribution.
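As an illustration, here is a minimal sketch of such diagnostics in Python, assuming the `arch`, `statsmodels` and `scipy` packages are available; the simulated `returns` series, the GARCH(1,1) specification and the lag length are just placeholders for your own data and model choices:

```python
import numpy as np
from arch import arch_model
from statsmodels.stats.diagnostic import acorr_ljungbox
from scipy import stats

# Placeholder data -- replace `returns` with your own (percent) return series.
rng = np.random.default_rng(0)
returns = rng.standard_normal(1000)

# Fit a GARCH(1,1) with normal innovations by maximum likelihood.
res = arch_model(returns, vol="GARCH", p=1, q=1, dist="normal").fit(disp="off")

# Standardized residuals: model residuals divided by the fitted conditional volatility.
z = res.std_resid

# i.i.d. check: Ljung-Box tests on the standardized residuals and on their squares
# (autocorrelation in the squares would indicate remaining ARCH effects).
print(acorr_ljungbox(z, lags=[10], return_df=True))
print(acorr_ljungbox(z**2, lags=[10], return_df=True))

# Distributional check: Jarque-Bera test of normality of the standardized residuals.
print(stats.jarque_bera(z))
```

QQ-plots of the standardized residuals against the assumed distribution are a useful visual complement to these tests.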

Aksakal suggested looking at the AIC or BIC, which give you the model likelihood adjusted for the number of parameters (so as to penalize overfitting). Looking at the model likelihood itself can also be meaningful, but then you have to keep in mind that richer models normally yield higher likelihoods.
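Continuing the sketch above (same assumptions about the `arch` package and the `returns` series), the information criteria are available directly on the fitted result, so competing specifications can be compared along these lines:

```python
from arch import arch_model

# Compare a few candidate GARCH orders by AIC/BIC (lower is better).
# Assumes `returns` is defined as in the previous snippet.
for p, q in [(1, 1), (1, 2), (2, 1), (2, 2)]:
    res = arch_model(returns, vol="GARCH", p=p, q=q, dist="normal").fit(disp="off")
    print(f"GARCH({p},{q}): loglik={res.loglikelihood:.1f}  "
          f"AIC={res.aic:.1f}  BIC={res.bic:.1f}")
```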
