Solved – Understanding KL divergence between two univariate Gaussian distributions

kullback-leibler, normal-distribution

I'm trying to understand KL divergence from this post on SE. I am following @ocram's answer, and I understand the following:

$\int \left[\log(p(x)) - \log(q(x)) \right] p(x) dx$

$=\int \left[ -\frac{1}{2} \log(2\pi) - \log(\sigma_1) - \frac{1}{2} \left(\frac{x-\mu_1}{\sigma_1}\right)^2 + \frac{1}{2}\log(2\pi) + \log(\sigma_2) + \frac{1}{2} \left(\frac{x-\mu_2}{\sigma_2}\right)^2 \right] \times \frac{1}{\sqrt{2\pi}\sigma_1} \exp\left[-\frac{1}{2}\left(\frac{x-\mu_1}{\sigma_1}\right)^2\right] dx$

$=\int \left\{\log\left(\frac{\sigma_2}{\sigma_1}\right) + \frac{1}{2} \left[ \left(\frac{x-\mu_2}{\sigma_2}\right)^2 - \left(\frac{x-\mu_1}{\sigma_1}\right)^2 \right] \right\} \times \frac{1}{\sqrt{2\pi}\sigma_1} \exp\left[-\frac{1}{2}\left(\frac{x-\mu_1}{\sigma_1}\right)^2\right] dx$

But not the following:

$=E_{1} \left\{\log\left(\frac{\sigma_2}{\sigma_1}\right) + \frac{1}{2} \left[ \left(\frac{X-\mu_2}{\sigma_2}\right)^2 - \left(\frac{X-\mu_1}{\sigma_1}\right)^2 \right]\right\}$

$=\log\left(\frac{\sigma_2}{\sigma_1}\right) + \frac{1}{2\sigma_2^2} E_1 \left\{(X-\mu_2)^2\right\} - \frac{1}{2\sigma_1^2} E_1 \left\{(X-\mu_1)^2\right\}$

$=\log\left(\frac{\sigma_2}{\sigma_1}\right) + \frac{1}{2\sigma_2^2} E_1 \left\{(X-\mu_2)^2\right\} - \frac{1}{2}$

Now noting:
$(X - \mu_2)^2 = (X-\mu_1+\mu_1-\mu_2)^2 = (X-\mu_1)^2 + 2(X-\mu_1)(\mu_1-\mu_2) + (\mu_1-\mu_2)^2$

$=\log\left(\frac{\sigma_2}{\sigma_1}\right) + \frac{1}{2\sigma_2^2} \left[E_1\left\{(X-\mu_1)^2\right\} + 2(\mu_1-\mu_2)E_1\left\{X-\mu_1\right\} + (\mu_1-\mu_2)^2\right] - \frac{1}{2}$

$=\log\left(\frac{\sigma_2}{\sigma_1}\right) + \frac{\sigma_1^2 + (\mu_1-\mu_2)^2}{2\sigma_2^2} - \frac{1}{2}$

First off, what is $E_1$?

Best Answer

$E_1$ is the expectation with respect to the first distribution $p(x)$. Denoting it with $E_p$ would be better, I think. – Monotros
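
In symbols, the comment says that for any function $f$,

$E_1\{f(X)\} = \int f(x)\, p(x)\, dx,$

which is exactly the step that turns the integral above into an expectation. The derivation then only needs two standard moments of $X \sim N(\mu_1, \sigma_1^2)$:

$E_1\{X-\mu_1\} = 0 \qquad \text{and} \qquad E_1\{(X-\mu_1)^2\} = \sigma_1^2.$

The second identity is what reduces $\frac{1}{2\sigma_1^2} E_1\left\{(X-\mu_1)^2\right\}$ to $\frac{1}{2}$, and together they collapse the expansion of $E_1\left\{(X-\mu_2)^2\right\}$ to $\sigma_1^2 + (\mu_1-\mu_2)^2$.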


I've created this answer from a comment so that this question is answered. Better to have a short answer than no answer at all.
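
If a numerical check helps, here is a minimal Python sketch (the parameter values and variable names are my own illustrative choices, not part of @ocram's answer): it estimates $E_1\left\{(X-\mu_2)^2\right\}$ by averaging over draws from the first Gaussian, and compares the closed-form KL divergence to a direct numerical integration of the defining integral.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# Illustrative parameters (arbitrary choices, not from the derivation above)
mu1, s1 = 0.0, 1.0   # p(x) = N(mu1, s1^2)
mu2, s2 = 1.0, 2.0   # q(x) = N(mu2, s2^2)

rng = np.random.default_rng(0)
x = rng.normal(mu1, s1, size=1_000_000)  # samples from the *first* distribution

# E_1 is just an average over samples from p:
# E_1{(X - mu2)^2} should equal sigma1^2 + (mu1 - mu2)^2
print(np.mean((x - mu2) ** 2), s1**2 + (mu1 - mu2) ** 2)

# Closed form from the derivation
kl_closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

# Direct numerical integration of int [log p(x) - log q(x)] p(x) dx
integrand = lambda t: norm.pdf(t, mu1, s1) * (
    norm.logpdf(t, mu1, s1) - norm.logpdf(t, mu2, s2)
)
kl_numeric, _ = integrate.quad(integrand, -50, 50)

print(kl_closed, kl_numeric)  # the two values should agree closely
```

Both printed pairs should agree to a few decimal places, which makes concrete that $E_1$ is nothing more than an average over samples drawn from $p$.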