Kullback–Leibler “variance”: does that divergence have a name?

fa.functional-analysis, pr.probability, reference-request, st.statistics

If you consider two probability distributions $p$ and $q$, one way to measure the discrepancy between them is the Kullback–Leibler divergence:

$$KL(p,q)=\int p \log (p/q) = E_p(\log p/q)$$

and this has many good properties.

I'm currently writing an article in which I want to use what I call the KL variance:

$$KL_{var}(p,q) = var_p(\log p/q) = \int p \log^2 (p/q) - KL(p,q)^2$$

which also has many good properties.
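For concreteness, here is a quick numerical sketch of both quantities; the two Gaussian densities are arbitrary illustrative choices, not taken from the question.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Example densities (arbitrary choices): p = N(0, 1), q = N(0.5, 1.2)
p = stats.norm(loc=0.0, scale=1.0)
q = stats.norm(loc=0.5, scale=1.2)

def log_ratio(x):
    # log(p(x)/q(x)), computed from log-pdfs for numerical stability
    return p.logpdf(x) - q.logpdf(x)

# KL(p, q) = E_p[log(p/q)]
kl, _ = quad(lambda x: p.pdf(x) * log_ratio(x), -np.inf, np.inf)

# Second moment E_p[log^2(p/q)], then KL_var = second moment - KL^2
m2, _ = quad(lambda x: p.pdf(x) * log_ratio(x) ** 2, -np.inf, np.inf)
kl_var = m2 - kl ** 2

print(f"KL(p, q)     = {kl:.6f}")
print(f"KL_var(p, q) = {kl_var:.6f}")
```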

I have searched around quite a bit for references to this divergence, and I haven't found anything. Does anybody have an existing reference to this divergence? Are there any names that would be slightly catchier than KL-variance?

Best Answer

Too late to help you, I guess, but I'll leave this here for future reference. This quantity shows up a lot in Bayesian nonparametrics when proving (frequentist) posterior contraction rates, so "posterior contraction rates" is a useful search term. I don't know of any snappy names for it, but Theorem 8.3 (and the comment after it) in Ghosal, Ghosh, and van der Vaart, Convergence rates of posterior distributions, 2000, gives the following bound:

$$var_p \log(p/q) \leq 4h^2(p,q) \lVert p/q \rVert_\infty,$$ where $h$ is the Hellinger distance, $h(p,q)^2=\int (\sqrt{p}-\sqrt{q})^2$:

https://projecteuclid.org/euclid.aos/1016218228
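As a sanity check, here is a small numerical sketch of that bound on random discrete distributions. The Dirichlet draws are arbitrary, and the Hellinger distance follows the convention above (no $1/2$ factor).

```python
import numpy as np

rng = np.random.default_rng(0)

# Two random discrete distributions p, q on a small support (illustration only)
k = 10
p = rng.dirichlet(np.ones(k))
q = rng.dirichlet(np.ones(k))

log_ratio = np.log(p / q)

kl = np.sum(p * log_ratio)                 # KL(p, q)
kl_var = np.sum(p * log_ratio**2) - kl**2  # var_p log(p/q)

h2 = np.sum((np.sqrt(p) - np.sqrt(q))**2)  # squared Hellinger distance
sup_ratio = np.max(p / q)                  # ||p/q||_inf on the support

lhs, rhs = kl_var, 4 * h2 * sup_ratio
print(f"var_p log(p/q) = {lhs:.4f} <= 4 h^2 ||p/q||_inf = {rhs:.4f}: {lhs <= rhs}")
```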