Kullback–Leibler vs Kolmogorov–Smirnov Distance – Comparing Distance Functions

distance-functions, distributions, kolmogorov-smirnov test, kullback-leibler

I can see that there are many formal differences between the Kullback–Leibler and Kolmogorov–Smirnov distance measures.
However, both are used to measure the distance between distributions.

  • Is there a typical situation where one should be used instead of the other?
  • What is the rationale to do so?

Best Answer

The KL divergence is typically used in information-theoretic (or even Bayesian) settings to measure the change in information between distributions, for example before and after applying some inference. It is not a distance in the usual (metric) sense, because it is neither symmetric nor does it satisfy the triangle inequality, so it is used in places where the directionality is meaningful.
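For reference (this definition is standard, though not stated in the original answer): for distributions $P$ and $Q$ with densities $p$ and $q$,

$$D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx,$$

and in general $D_{\mathrm{KL}}(P \,\|\, Q) \neq D_{\mathrm{KL}}(Q \,\|\, P)$, which is exactly the asymmetry mentioned above.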

The KS distance is typically used in the context of a non-parametric test. In fact, I've rarely seen it used as a generic "distance between distributions"; the $\ell_1$ distance, the Jensen–Shannon distance, and other distances are more common for that purpose.
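For reference, the KS distance between two CDFs $F_P$ and $F_Q$ is $D_{\mathrm{KS}}(P, Q) = \sup_x \lvert F_P(x) - F_Q(x)\rvert$. Below is a minimal sketch of both quantities in practice, assuming NumPy and SciPy are available; the samples, bin grid, and smoothing constant are illustrative choices, not part of the original answer.

```python
# Contrast the two measures: the KS statistic compares empirical CDFs of two
# samples, while the KL divergence compares two (discretized) distributions
# and depends on the direction of comparison.
import numpy as np
from scipy import stats
from scipy.special import rel_entr

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=1000)  # sample from N(0, 1)
y = rng.normal(loc=0.5, scale=1.0, size=1000)  # sample from N(0.5, 1)

# Kolmogorov-Smirnov: sup-distance between the two empirical CDFs,
# together with a p-value for the two-sample test.
ks = stats.ks_2samp(x, y)
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3g}")

# KL divergence needs densities; here we discretize both samples onto a
# common bin grid and add a small constant to avoid log(0) / division by zero.
bins = np.linspace(-5, 5, 51)
p, _ = np.histogram(x, bins=bins)
q, _ = np.histogram(y, bins=bins)
p = p.astype(float) + 1e-12
q = q.astype(float) + 1e-12
p /= p.sum()
q /= q.sum()

kl_pq = rel_entr(p, q).sum()  # D_KL(P || Q)
kl_qp = rel_entr(q, p).sum()  # D_KL(Q || P)
print(f"KL(P||Q) = {kl_pq:.3f}, KL(Q||P) = {kl_qp:.3f}  (generally unequal)")
```

Note the practical difference this illustrates: the KS statistic comes with a p-value for the two-sample test and needs no density estimate, while the KL divergence requires one (here a crude histogram) and changes value when its arguments are swapped.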