Solved – Negative binomial – likelihood ratio and likelihood ratio chi-squared

Tags: goodness-of-fit, likelihood-ratio, negative-binomial-distribution

I ran a hierarchical negative binomial regression analysis, and the output reports the log likelihood and a likelihood ratio chi-squared. I have the following questions about goodness of fit:

  1. Which model is preferred? I heard it is the one whose log likelihood is closest to zero in absolute value; is this correct?
  2. Is the likelihood ratio chi-squared the same as what some call the Wald chi-squared, i.e. a test of the overall model against the null model?
  3. Are there any "typical" values for the log likelihood and the likelihood ratio chi-squared? I got these results for my 3 models:

$$ \begin{array}{|c|c|c|c|} \hline & \mbox{Model 1} & \mbox{Model 2} & \mbox{Model 3} \\ \hline \mbox{Log likelihood} & -3517.05 & -3278.52 & -3275.0 \\ \mbox{Likelihood ratio chi-squared} & 3727.3 & 4331.07 & 4338.3 \\ \hline \end{array}
$$

Best Answer

  1. In general, a higher value of the likelihood (or log-likelihood) is preferred; how close it is to zero is irrelevant. However, there is an important caveat: if you add more parameters (predictors) to a model, the maximized likelihood can only go up, so picking the model with the highest likelihood would always favor the model with the most variables. The usual way around this is an information criterion that penalizes the model for its number of parameters; look up the Akaike Information Criterion (AIC) or the Bayesian Information Criterion (BIC). (A numerical sketch using your log-likelihoods appears after this list.)
  2. I don't know exactly what the "likelihood ratio chi-squared" in your output is, but it could be the value of the likelihood-ratio test statistic against the null model (not against the other competing models). It is unlikely to be the Wald or the score test statistic. However, if the competing models are nested (one is a special case of the other), you can use the log-likelihood values to compute the likelihood-ratio test statistic of one model versus the other (see the second sketch after this list): $$ \mathrm{LRT} = -2\left(\log L(M_1) - \log L(M_2)\right) \sim \chi^2_{p_2 - p_1} $$ under $H_0$, where $p_1$ and $p_2$ are the numbers of parameters in models $M_1$ and $M_2$, respectively.
  3. There are no "typical" values for the value of the log-likelihood, just as there are no typical values for the sum of all the y values in your data. They are only meaningful when comparing different fits to the same set of values for the outcome.
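To make point 1 concrete, here is a minimal Python sketch that turns the log-likelihoods from the question into AIC and BIC values. The parameter counts and the sample size are not given in the question, so the numbers below are purely hypothetical placeholders; substitute the actual values from your own fitted models.

```python
# Hypothetical sketch: comparing models by AIC/BIC instead of the raw log-likelihood.
# The parameter counts and sample size n are placeholders, not taken from the question.
import numpy as np

log_liks = {"Model 1": -3517.05, "Model 2": -3278.52, "Model 3": -3275.0}
n_params = {"Model 1": 5, "Model 2": 10, "Model 3": 12}   # hypothetical counts
n = 1000                                                  # hypothetical sample size

for name, ll in log_liks.items():
    k = n_params[name]
    aic = -2 * ll + 2 * k          # Akaike Information Criterion: penalty 2k
    bic = -2 * ll + k * np.log(n)  # Bayesian Information Criterion: penalty k*log(n)
    print(f"{name}: AIC = {aic:.1f}, BIC = {bic:.1f}")
```

Lower AIC/BIC is better; the penalty terms are what keep the largest model from winning automatically.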
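And for point 2, a sketch of the likelihood-ratio test between two nested models, using the log-likelihoods of Models 2 and 3 from the question. Again, the parameter counts are hypothetical, and the test is only valid if Model 2 is actually nested in Model 3.

```python
# Hypothetical sketch: likelihood-ratio test of a larger model (M3) against a
# nested smaller model (M2). Parameter counts p2 and p3 are made up for illustration.
from scipy.stats import chi2

logL_m2 = -3278.52   # log likelihood of the smaller (nested) model
logL_m3 = -3275.0    # log likelihood of the larger model
p2, p3 = 10, 12      # hypothetical numbers of estimated parameters

lrt = -2 * (logL_m2 - logL_m3)   # test statistic: -2 * (logL_small - logL_large)
df = p3 - p2                     # degrees of freedom = difference in parameters
p_value = chi2.sf(lrt, df)       # upper-tail chi-squared probability

print(f"LRT = {lrt:.2f}, df = {df}, p = {p_value:.4f}")
```

A small p-value indicates that the extra parameters in the larger model improve the fit by more than chance alone would explain.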