[Math] How is the entropy of the normal distribution derived


Wikipedia says the entropy of the normal distribution is $\frac{1}{2}\ln(2\pi e\sigma^2)$.

I could not find a proof for that, though. I did find some proofs showing that the maximum entropy equals $\frac{1}{2}+\ln(\sqrt{2\pi}\sigma)$, and while I see that this can be rewritten as $\frac{1}{2}\ln(e\sigma\sqrt{2\pi})$, I do not see how to get rid of the square root or how the extra $\sigma$ can be moved inside the $\ln$. It is clear that an additional summand $\frac{1}{2}\ln(\sigma\sqrt{2\pi})$ would help, but where would that come from? I am probably just thinking about this the wrong way…

So, what is the proof of this entropy formula for the normal distribution?

Best Answer

Notice that $\ln(\color{blue}{\sqrt{\color{black}{x}}}) = \ln(x^{\color{blue}{\frac{1}{2}}}) = \color{blue}{\frac{1}{2}}\ln(x)$ and that $\ln(y) \color{red}{+} \ln(z) = \ln(y \color{red}{\cdot} z)$ for all $x, y, z > 0$. Using these identities, let us rewrite the maximum entropy, $\frac{1}{2} + \ln(\sqrt{2\pi}\sigma)$, as follows:

$$
\begin{align}
\frac{1}{2} + \ln(\sqrt{2\pi}\sigma) &= \frac{1}{2} + \ln(\color{blue}{\sqrt{\color{black}{2\pi\sigma^2}}}) \\
&= \frac{1}{2} + \color{blue}{\frac{1}{2}}\ln(2\pi\sigma^2) \\
&= \frac{1}{2}\left(1 + \ln(2\pi\sigma^2)\right) \\
&= \frac{1}{2}\left(\ln(\mathrm{e}) \color{red}{+} \ln(2\pi\sigma^2)\right) \\
&= \frac{1}{2}\ln(\mathrm{e} \color{red}{\cdot} 2\pi\sigma^2)
\end{align}
$$

So the entropy reported in Wikipedia is correct.
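
For completeness, here is a short sketch of where the expression $\frac{1}{2} + \ln(\sqrt{2\pi}\sigma)$ itself comes from. Plug the normal density $f(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$ into the definition of differential entropy and use $\ln f(x) = -\ln(\sigma\sqrt{2\pi}) - \frac{(x-\mu)^2}{2\sigma^2}$:

$$
\begin{align}
H(X) &= -\int_{-\infty}^{\infty} f(x)\ln f(x)\,\mathrm{d}x \\
&= \ln(\sigma\sqrt{2\pi})\underbrace{\int_{-\infty}^{\infty} f(x)\,\mathrm{d}x}_{=\,1} + \frac{1}{2\sigma^2}\underbrace{\int_{-\infty}^{\infty}(x-\mu)^2 f(x)\,\mathrm{d}x}_{=\,\sigma^2} \\
&= \ln(\sqrt{2\pi}\sigma) + \frac{1}{2}
\end{align}
$$

The two underbraced integrals are just the normalization of $f$ and the definition of the variance, which is what makes the computation so short.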
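
If you want a numerical sanity check, here is a minimal sketch in Python, assuming NumPy and SciPy are available ($\sigma = 2.5$ is an arbitrary choice). It compares the closed form $\frac{1}{2}\ln(2\pi e\sigma^2)$ with SciPy's built-in entropy and with a direct quadrature of $-f(x)\ln f(x)$:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

sigma = 2.5  # arbitrary standard deviation for the check

# Closed form from the answer above: (1/2) ln(2*pi*e*sigma^2)
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# SciPy's built-in differential entropy of a frozen normal distribution
scipy_value = norm(loc=0.0, scale=sigma).entropy()

# Direct quadrature of -f(x) ln f(x); the tails beyond 12*sigma contribute
# nothing at double precision, and the finite range avoids log(0) from
# underflow of the density at extreme x
f = norm(loc=0.0, scale=sigma).pdf
quad_value, _ = quad(lambda x: -f(x) * np.log(f(x)), -12 * sigma, 12 * sigma)

print(closed_form, scipy_value, quad_value)  # all three agree (≈ 2.3352)
```

All three values coincide to many decimal places, which is a quick way to convince yourself that the algebra above did not drop a factor.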
