[Math] Proof that the determinant of a covariance matrix equals the determinant of the corresponding correlation matrix times the product of the variances

determinant, hadamard-product, linear-algebra, matrices, statistics

If $\Sigma \in \mathbb{R}^{n \times n}$ is a positive-definite covariance matrix with corresponding vector of variances $v = \operatorname{diag}(\Sigma)$ and standard deviations $s = \sqrt{v}$ (taken element-wise), then the corresponding correlation matrix is $$R = \Sigma / (s s^T),$$ where the division is element-wise.
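For concreteness, here is a minimal NumPy sketch of that construction; the particular matrix and seed are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T + 4 * np.eye(4)   # positive definite by construction
v = np.diag(Sigma)                # vector of variances
s = np.sqrt(v)                    # standard deviations
R = Sigma / np.outer(s, s)        # element-wise division by s s^T
print(np.allclose(np.diag(R), 1.0))   # True: a correlation matrix has unit diagonal
```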

I've verified through numerical simulation that $$\det(\Sigma) = \det(R)\cdot\prod_i v_i,$$ but I'm having trouble proving it. The element-wise division relating $\Sigma$ and $R$ doesn't play nicely with diagonalization, which is how I'd normally analyze the determinant as the product of eigenvalues.
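A quick check along the lines of that simulation, on random positive-definite matrices (all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
for n in (2, 5, 10):
    A = rng.standard_normal((n, n))
    Sigma = A @ A.T + n * np.eye(n)   # random positive-definite covariance
    v = np.diag(Sigma)
    s = np.sqrt(v)
    R = Sigma / np.outer(s, s)        # correlation matrix
    lhs = np.linalg.det(Sigma)
    rhs = np.linalg.det(R) * np.prod(v)
    print(n, np.isclose(lhs, rhs))    # expect True for every n
```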

Any thoughts?

Best Answer

I think I figured out a way to prove it.

Let $S = \operatorname{Diag}(s)$, the diagonal matrix with the standard deviations on its diagonal. Then it's not too hard to see that $\Sigma = S R S$: multiplying $R$ by $S$ on both sides scales entry $(i,j)$ by $s_i s_j$, which exactly undoes the element-wise division. Taking determinants on both sides, we get \begin{align} \det(\Sigma) &= \det(S R S) \\ &= \det(S) \cdot \det(R) \cdot \det(S) \\ &= \det(R) \prod_i s_i^2 \\ &= \det(R) \prod_i v_i. \end{align}

Here, we used the facts that the determinant of a product is the product of the determinants, and that the determinant of a diagonal matrix is the product of its diagonal entries, so $\det(S) = \prod_i s_i$.
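As a sanity check, a small NumPy sketch (again with a made-up random matrix) confirms both the factorization $\Sigma = S R S$ and the identity $\det(S)^2 = \prod_i v_i$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
Sigma = A @ A.T + 5 * np.eye(5)       # positive-definite covariance
s = np.sqrt(np.diag(Sigma))           # standard deviations
S = np.diag(s)                        # S = Diag(s)
R = Sigma / np.outer(s, s)            # correlation matrix
print(np.allclose(Sigma, S @ R @ S))                       # True: Sigma = S R S
print(np.isclose(np.linalg.det(S) ** 2, np.prod(s ** 2)))  # True: det(S)^2 = prod(v)
```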