Solved – Symmetric Kullback-Leibler divergence OR Mutual Information as a metric of distance between two distributions

kullback-leibler, mutual information

I need some metric of divergence between two distributions.
(They are complex and don't fit the exponential family, normal, log-normal, or power-law distributions. Maybe some mixture of those, but I don't feel like working that out right now.)

I'm deciding between Kullback-Leibler divergence and Mutual Information. I don't have a reference distribution, so I don't like that $D_{KL}$ is asymmetric. (Of course I can symmetrize it, but I don't feel confident about that.)
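
For concreteness, the usual symmetric variants would be the Jeffreys divergence and the Jensen–Shannon divergence:

$$D_{\mathrm{J}}(p, q) = D_{KL}(p \,\|\, q) + D_{KL}(q \,\|\, p)$$

$$\mathrm{JSD}(p, q) = \tfrac{1}{2} D_{KL}(p \,\|\, m) + \tfrac{1}{2} D_{KL}(q \,\|\, m), \qquad m = \tfrac{1}{2}(p + q)$$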

Which one would you choose?

Any other ideas are also welcome!

Best Answer

As far as I can understand, you are solving the following problem: there are two analytical distributions $p(x)$ and $q(x)$, and you want to calculate the distance between them, $D(p, q)$.

There are plenty of measures of distance between two distributions, for example:

- Kullback–Leibler divergence (and its symmetrized version, the Jeffreys divergence)
- Jensen–Shannon divergence
- Hellinger distance
- Total variation distance
- Bhattacharyya distance
- Wasserstein (earth mover's) distance

I suggest you try a few of the measures from the list above, as all of them are rather easy to implement. In most applications, numerical experiments are what give you the key to success: run a few and then select the one that suits you best. (Since I haven't found any concrete requirements for this distance in your question, I can't suggest anything more specific.)
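
As an illustration of that kind of numerical experiment, here is a minimal sketch, assuming you can evaluate both densities on a common 1-D grid. The two Gaussians, the grid bounds, and SciPy's `entropy` and `jensenshannon` helpers are placeholder/convenience choices, not part of your actual setup:

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import jensenshannon

# Evaluate both densities on a common grid (1-D example; grid bounds are arbitrary).
x = np.linspace(-10, 10, 2001)
p = stats.norm.pdf(x, loc=0.0, scale=1.0)   # placeholder for your p(x)
q = stats.norm.pdf(x, loc=1.0, scale=2.0)   # placeholder for your q(x)

# Normalize to discrete probability vectors so the measures below are comparable.
p /= p.sum()
q /= q.sum()

# Symmetrized KL (Jeffreys divergence): D(p||q) + D(q||p).
d_sym_kl = stats.entropy(p, q) + stats.entropy(q, p)

# Jensen-Shannon distance (square root of the JS divergence; symmetric and bounded).
d_js = jensenshannon(p, q)

# Hellinger distance for discrete probability vectors.
d_hellinger = np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# Total variation distance.
d_tv = 0.5 * np.sum(np.abs(p - q))

print(f"symmetrized KL:          {d_sym_kl:.4f}")
print(f"Jensen-Shannon distance: {d_js:.4f}")
print(f"Hellinger:               {d_hellinger:.4f}")
print(f"total variation:         {d_tv:.4f}")
```

The Jensen–Shannon distance may be the closest to what you asked for, since it is a symmetric, bounded relative of KL that requires no reference distribution.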

As $p(x)$ and $q(x)$ are complex, it is unlikely that an analytical expression exists for any of these $D(p, q)$, so you will need to compute the distances numerically. Note that calculating almost all of these distances involves numerical integration, so they will be rather imprecise if the dimension of $x$ is high.
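
If the dimension of $x$ is too high for grid integration, a common workaround is to estimate the KL terms by Monte Carlo instead, assuming you can sample from each distribution and evaluate both log-densities. A sketch under those assumptions, with multivariate Gaussians standing in for your $p(x)$ and $q(x)$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder 3-D Gaussians standing in for your p(x) and q(x);
# all that is really needed is sampling plus both log-densities.
p_dist = stats.multivariate_normal(mean=np.zeros(3), cov=np.eye(3))
q_dist = stats.multivariate_normal(mean=np.ones(3), cov=2.0 * np.eye(3))

n = 100_000

# D_KL(p || q) ~= E_p[log p(X) - log q(X)], estimated from samples of p.
xp = p_dist.rvs(size=n, random_state=rng)
kl_pq = np.mean(p_dist.logpdf(xp) - q_dist.logpdf(xp))

# D_KL(q || p) estimated from samples of q, for the symmetrized version.
xq = q_dist.rvs(size=n, random_state=rng)
kl_qp = np.mean(q_dist.logpdf(xq) - p_dist.logpdf(xq))

print(f"Monte Carlo symmetrized KL: {kl_pq + kl_qp:.4f}")
```

The estimate's error shrinks with the number of samples rather than with the grid resolution, which is what makes this approach usable in higher dimensions.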