Two closed-form analytical solutions for multivariate Gaussian entropy: how are they equal?

covariance, entropy, gaussian, information-theory, statistics

The entropy of a multivariate normal distribution is well known to be

$$H(\boldsymbol{X}) = \frac{N}{2}\ln\left(2\pi e\left(\prod_{i=1}^{N}\sigma_i^2\right)^{\frac{1}{N}}\right)$$

but here, it is also shown to equal

$$H(\boldsymbol{X}) =\frac{N}{2}\ln\left(2\pi e\right)+\frac{1}{2}\ln\det(\Sigma)$$
where $\Sigma$ is the covariance matrix.

How can one show that the two formulas are equal, and which one is the more standard simplification of the multivariate Gaussian entropy?

Best Answer

I assume that $X=AZ$, where $A$ is a constant matrix and $Z\sim\mathcal{N}(0,I)$ is standard normal. In this case the covariance of $X$ is $\Sigma=AA^T$. Writing $\sigma_i(A)$ for the $i$-th singular value of $A$, it follows that $$\det\Sigma=\sigma_1^2(A)\,\sigma_2^2(A)\cdots\sigma_N^2(A)=\det(AA^T),$$ so the $\sigma_i^2$ appearing in the first formula are the eigenvalues of $\Sigma$.
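As a quick numerical sanity check of this identity (a sketch with NumPy, not part of the original answer; the matrix $A$ and variable names are illustrative), one can draw a random $A$ and compare $\prod_i \sigma_i^2(A)$, computed from the singular values of $A$, against $\det(AA^T)$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
A = rng.standard_normal((N, N))  # arbitrary mixing matrix for X = A Z

sv = np.linalg.svd(A, compute_uv=False)  # singular values sigma_i(A)
prod_sq_sv = np.prod(sv**2)              # prod_i sigma_i^2(A)
det_cov = np.linalg.det(A @ A.T)         # det(Sigma) with Sigma = A A^T

print(prod_sq_sv, det_cov)               # the two values agree
assert np.isclose(prod_sq_sv, det_cov)
```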

The derivation is as follows:

\begin{align}
h(X) & = \frac{N}{2}\log\left(2\pi e\left(\prod_{i=1}^{N}\sigma_i^2\right)^{\frac{1}{N}}\right)\\
& = \frac{N}{2}\log\left(2\pi e\right) + \frac{N}{2}\log\left(\left(\prod_{i=1}^{N}\sigma_i^2\right)^{\frac{1}{N}}\right)\\
& = \frac{N}{2}\log\left(2\pi e\right) + \frac{1}{2}\log\left(\prod_{i=1}^{N}\sigma_i^2\right)\\
& = \frac{N}{2}\log\left(2\pi e\right) + \frac{1}{2}\log\det\Sigma.
\end{align}
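The derivation itself can also be verified numerically. The sketch below (assuming, as above, that the $\sigma_i^2$ are the squared singular values of $A$, i.e. the eigenvalues of $\Sigma$) evaluates both closed forms and confirms they agree:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
A = rng.standard_normal((N, N))
Sigma = A @ A.T                                     # covariance of X = A Z

sigma_sq = np.linalg.svd(A, compute_uv=False) ** 2  # sigma_i^2

# First form: (N/2) ln(2 pi e * geometric mean of the sigma_i^2)
h1 = N / 2 * np.log(2 * np.pi * np.e * np.prod(sigma_sq) ** (1 / N))

# Second form: (N/2) ln(2 pi e) + (1/2) ln det(Sigma);
# slogdet is preferred over det + log for numerical stability
sign, logdet = np.linalg.slogdet(Sigma)
h2 = N / 2 * np.log(2 * np.pi * np.e) + 0.5 * logdet

print(h1, h2)
assert np.isclose(h1, h2)
```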

The second form is the more common way to write it. More common still is to absorb the first term into the determinant: $$ h(X) = \frac{1}{2}\log\det(2\pi e\Sigma). $$
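As a final cross-check (again only a sketch), the compact form matches the differential entropy in nats reported by `scipy.stats.multivariate_normal`:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
N = 4
A = rng.standard_normal((N, N))
Sigma = A @ A.T

# Compact form: (1/2) ln det(2 pi e Sigma)
sign, logdet = np.linalg.slogdet(2 * np.pi * np.e * Sigma)
h = 0.5 * logdet

# SciPy's frozen multivariate normal exposes its entropy directly
h_scipy = multivariate_normal(mean=np.zeros(N), cov=Sigma).entropy()

print(h, h_scipy)
assert np.isclose(h, h_scipy)
```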
