Solved – Kullback-Leibler divergence and probability distribution function in MATLAB

Tags: kullback-leibler, MATLAB

I want to compute the Kullback-Leibler (KL) divergence of two Gaussians, the first with mean 1 and the second with mean -1, where both have the same variance, say 1.

In MATLAB, the distributions are:

Apdf = normpdf([-10:0.1:10], -1, 1)
Bpdf = normpdf([-10:0.1:10],  1, 1)

The code I used to compute the KL is:

KL = sum(Apdf .* (log2(Apdf) - log2(Bpdf)))

Are these the right inputs for the KL divergence? The result I get is about 28; shouldn't it be 2?

Best Answer

There are two reasons why you did not get the answer 2.

1) The KL divergence being 2 is based on use of the natural log, which in MATLAB is log.

2) If you used log instead of log2 in your code, you would get the result 20 rather than 2. The reason is that in approximating the integral by a sum, you neglected to multiply by the discretization increment between points, which in your calculation was 0.1.
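For reference, the expected value 2 follows from the closed-form KL divergence between two Gaussians with equal variance (a standard identity, sketched here for this specific case):

```latex
D_{\mathrm{KL}}\!\left(\mathcal{N}(\mu_0,\sigma^2)\,\|\,\mathcal{N}(\mu_1,\sigma^2)\right)
  = \frac{(\mu_0-\mu_1)^2}{2\sigma^2}
  = \frac{(-1-1)^2}{2\cdot 1}
  = 2 \ \text{nats}
```

Note the result is in nats because the definition uses the natural log; dividing by ln 2 converts to bits.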

Here is a correct solution:

>> Apdf = normpdf([-10:0.1:10], -1, 1);
>> Bpdf = normpdf([-10:0.1:10],  1, 1);
>> KL = 0.1 * sum(Apdf .* (log(Apdf)-log(Bpdf)))
KL =
   2.000000000000000
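As a variation, MATLAB's built-in `trapz` performs trapezoidal integration and accounts for the grid spacing for you, so the 0.1 factor never has to be written by hand. A minimal sketch on the same grid as above:

```
% Approximate KL(A || B) by trapezoidal integration over the grid x.
x = -10:0.1:10;
Apdf = normpdf(x, -1, 1);
Bpdf = normpdf(x,  1, 1);
KL = trapz(x, Apdf .* (log(Apdf) - log(Bpdf)))   % approximately 2
```

Passing `x` as the first argument lets `trapz` infer the spacing, so the same code works for non-uniform grids as well.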