Solved – Expectation of log likelihood ratio

asymptotics, expected value, integral, likelihood-ratio

Given that $X_{1},\dots,X_{n}$ are i.i.d. random variables with joint density $f(x\mid \theta)$, where $\theta$ is a one-dimensional parameter, let $\hat\theta$ be the maximum likelihood estimator of $\theta$.

By Wilks' theorem, under the null hypothesis $H_{0}\colon \theta=\theta_{0}$, one has $$-2\log\frac{f(x\mid \theta_{0})}{f(x\mid \hat\theta)}\ \xrightarrow{d}\ \chi_{1}^{2} \quad \text{as } n\rightarrow\infty.$$
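For intuition, here is the standard heuristic behind this (a sketch, assuming the usual regularity conditions and that the Fisher information per observation $I(\theta_{0})$ is finite and positive): a second-order Taylor expansion of the joint log-likelihood around $\hat\theta$ gives

$$-2\log\frac{f(x\mid \theta_{0})}{f(x\mid \hat\theta)} \approx n\,I(\theta_{0})\,(\hat\theta-\theta_{0})^{2},$$

and since $\sqrt{n}\,(\hat\theta-\theta_{0})\xrightarrow{d} N\bigl(0,\,I(\theta_{0})^{-1}\bigr)$, the right-hand side converges in distribution to $\chi^{2}_{1}$.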

Since we know that $E\chi_{1}^{2}=1$, is there any way I can find the value of the following integral (the expectation of the log likelihood ratio), or at least its asymptotics?

$$-2\int f(x\mid \theta_{0})\log\frac{f(x\mid \theta_{0})}{f(x\mid \hat\theta)}\,dx=?$$

My guess is that the integral should be around 1. Can anyone please share some ideas or references concerning the above integral?
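As a sanity check on that guess, here is a minimal Monte Carlo sketch (chosen purely for illustration, not part of the general question): it assumes a $N(\theta,1)$ model with known variance, where the MLE is the sample mean and the Wilks statistic equals $n(\bar x-\theta_0)^2$ exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

theta0 = 0.0      # true parameter under H0
n = 200           # sample size per replication
reps = 20_000     # Monte Carlo replications

# Assumed model (illustration only): X_i ~ N(theta, 1).  Then the MLE is
# the sample mean and the Wilks statistic is exactly
#   -2 log( f(x | theta0) / f(x | thetahat) ) = n * (xbar - theta0)**2.
x = rng.normal(theta0, 1.0, size=(reps, n))
xbar = x.mean(axis=1)
stat = n * (xbar - theta0) ** 2

print(stat.mean())  # Monte Carlo estimate of the expectation; close to 1
```

In this toy model the statistic is exactly $\chi^{2}_{1}$-distributed for every $n$, so the printed mean is close to $1$ up to Monte Carlo error.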

Best Answer

Up to the factor of $-2$, the integral in question is the Kullback–Leibler divergence between $f(\cdot\mid\theta_0)$ and $f(\cdot\mid\hat{\theta})$ (with $\hat{\theta}$ treated as a fixed parameter value). You cannot generally say that it converges to anything, especially considering that your parametric assumption may be wrong. However, for certain distribution families good estimates are available.
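To see why such estimates are plausible for smooth families, here is a standard expansion (again under regularity conditions, and not specific to any one reference): for the single-observation densities,

$$\operatorname{KL}\bigl(f(\cdot\mid\theta_{0})\,\big\|\,f(\cdot\mid\theta)\bigr) = \tfrac{1}{2}\,I(\theta_{0})\,(\theta-\theta_{0})^{2} + o\bigl((\theta-\theta_{0})^{2}\bigr),$$

so the divergence between the joint densities is approximately $\tfrac{n}{2}I(\theta_{0})(\theta-\theta_{0})^{2}$. Evaluating at $\theta=\hat\theta$ and multiplying by $2$ recovers $n\,I(\theta_{0})(\hat\theta-\theta_{0})^{2}$, which is asymptotically $\chi^{2}_{1}$, consistent with the asker's guess that the value should be around $1$.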

You might want to look at Vladimir Spokoiny's publications; he has spent quite a lot of time working on this sort of problem. For instance, you may find his 2005 article on exponential family distributions useful.
