[Math] Kullback-Leibler divergence

probability

The Kullback-Leibler divergence between two distributions with pdfs $f(x)$ and $g(x)$ is defined
by
$$\mathrm{KL}(F;G) = \int_{-\infty}^{\infty} \ln \left(\frac{f(x)}{g(x)}\right)f(x)\,dx$$

Compute the Kullback-Leibler divergence when $F$ is the standard normal distribution and $G$
is the normal distribution with mean $\mu$ and variance $1$. For what value of $\mu$ is the divergence
minimized?

I was never taught this kind of divergence, so I am a bit lost on how to evaluate this integral. I see that I can simplify the ratio of the two normal pdfs inside the natural log, but my guess is that I should wait until after taking the integral. Any help is appreciated.

Best Answer

I cannot comment (not enough reputation).

Vincent: you have the wrong pdf for $g(x)$; you have written a normal distribution with mean $1$ and variance $1$, not mean $\mu$.

Hint: you don't need to solve any integrals. You should be able to rewrite the integrand in terms of the pdfs and their expected values, so you never need to integrate directly.

Outline: Firstly, $\log\left(\frac{f(x)}{g(x)}\right) = -\frac{1}{2}\left(x^2 - (x-\mu)^2\right)$. Expand and simplify; you don't even need to write out $f(x)$ explicitly. See where that takes you.
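Following the outline, the expansion gives $\mathrm{KL}(F;G) = \frac{1}{2}\,\mathbb{E}_F\!\left[(x-\mu)^2 - x^2\right] = \frac{\mu^2}{2}$, which is minimized at $\mu = 0$. Here is a quick numerical sanity check (a sketch, assuming NumPy is available; the function name and grid choice are mine, not from the question):

```python
import numpy as np

def kl_standard_normal_vs_shifted(mu):
    """Numerically integrate KL(F; G) for F = N(0, 1) and G = N(mu, 1)."""
    # A wide, fine grid; the normal tails are negligible beyond |x| = 10.
    x = np.linspace(-10, 10, 200001)
    f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)          # pdf of N(0, 1)
    g = np.exp(-(x - mu)**2 / 2) / np.sqrt(2 * np.pi)   # pdf of N(mu, 1)
    # Trapezoidal approximation of the KL integral.
    return np.trapz(f * np.log(f / g), x)

for mu in [0.0, 0.5, 1.0, 2.0]:
    # The closed form derived above is mu**2 / 2.
    print(mu, kl_standard_normal_vs_shifted(mu), mu**2 / 2)
```

The numerical values should match $\mu^2/2$ to high precision, confirming that the divergence is minimized (and equal to zero) at $\mu = 0$.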

Related Question