[Math] Using Cholesky decomposition to compute covariance matrix determinant

Tags: covariance, determinant, matrix decomposition

I am reading through this paper to try to code the model myself. The specifics of the paper don't matter; however, in the author's MATLAB code I noticed they use a Cholesky decomposition instead of computing the determinant of the covariance matrix directly.

Specifically, the author has

$$\log\det(\Sigma) = 2 \sum_i \log \left[ \operatorname{diag}(L)_i \right]$$

where $L$ is the lower triangular matrix produced by a Cholesky decomposition of the covariance $\Sigma$.

I tested this out myself using various covariance matrices and found that the relation above always holds to within about 14 decimal places (presumably just a machine-precision issue). I believe the author uses a Cholesky decomposition because it is slightly faster than computing the determinant directly (at least when I timed it on my machine).
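A quick way to reproduce that check (a sketch in Python/NumPy rather than the author's MATLAB; the matrix here is an arbitrary positive-definite example, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive-definite (covariance-like) matrix.
A = rng.standard_normal((5, 5))
sigma = A @ A.T + 5 * np.eye(5)  # adding to the diagonal keeps it well-conditioned

# Direct route: determinant first, then log.
logdet_direct = np.log(np.linalg.det(sigma))

# Cholesky route: 2 * sum of the logs of the diagonal of L.
L = np.linalg.cholesky(sigma)  # sigma = L @ L.T, with L lower triangular
logdet_chol = 2.0 * np.sum(np.log(np.diag(L)))

print(abs(logdet_direct - logdet_chol))  # agrees to roughly machine precision
```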

My question is, why does this relation hold true? I couldn't find any references in the paper / code, or any material online.

Best Answer

First, a note on cost. Computing the determinant naively by cofactor (Laplace) expansion is $\mathcal{O}(n!)$, which is hopeless for any sizable $\Sigma$. In practice, `det` is computed via an LU decomposition, which is $\mathcal{O}(n^{3})$. A Cholesky decomposition is also $\mathcal{O}(n^{3})$, but it exploits the symmetry of $\Sigma$ and needs roughly half the floating-point operations of LU, which is consistent with the speedup you timed.

The reason the relation holds is the multiplicative property of determinants:

$$ \Sigma = LL^{T} \implies \det(\Sigma) = \det(LL^{T}) = \det(L)\det(L^{T}) $$

For a triangular matrix, the determinant is the product of the diagonal entries:

$$ \det(L) = \prod_{i=1}^{n} L_{ii} $$
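This is easy to confirm numerically (a small sketch with an arbitrary lower-triangular matrix, not from the paper):

```python
import numpy as np

# An arbitrary lower-triangular matrix.
T = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 0.5]])

# The determinant of a triangular matrix is the product of its diagonal.
print(np.linalg.det(T))     # ≈ 3.0  (= 2 * 3 * 0.5)
print(np.prod(np.diag(T)))  # ≈ 3.0
```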

which means that taking the $\log$ turns the product into a sum:

$$ \log(\prod_{i=1}^{n} L_{ii}) = \sum_{i=1}^{n} \log(L_{ii}) $$

This follows from a basic property of logarithms. The factor of $2$ comes from another property of determinants: the determinant of the transpose of a matrix equals the determinant of the matrix itself,

$$ \det(A^{T}) = \det(A) $$

so $\det(L)\det(L^{T}) = \det(L)^{2}$, and when we take the log the power of $2$ comes down:

$$ \det(\Sigma) = \det(LL^{T}) = \det(L)\det(L^{T}) = \det(L)^{2} $$

$$ \implies \log\det(\Sigma) = \log\!\left(\Big(\prod_{i=1}^{n} L_{ii}\Big)^{2}\right) = 2 \log \prod_{i=1}^{n} L_{ii} = 2 \sum_{i=1}^{n} \log(L_{ii}) $$
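Beyond speed, there is a practical reason this form is common (an observation of mine, not from the paper): summing logs of the diagonal never forms the determinant itself, so it avoids the overflow/underflow that hits `det` for large matrices. A minimal sketch, with a hypothetical helper `logdet_spd`:

```python
import numpy as np

def logdet_spd(sigma):
    """log det of a symmetric positive-definite matrix via Cholesky.

    Only logs of the diagonal of L are ever summed, so the (possibly
    huge or tiny) determinant is never formed explicitly.
    """
    L = np.linalg.cholesky(sigma)
    return 2.0 * np.sum(np.log(np.diag(L)))

# A 500x500 covariance whose determinant (10^500) overflows float64,
# but whose log-determinant is perfectly representable.
sigma = 10.0 * np.eye(500)
print(np.linalg.det(sigma))  # inf -- the raw determinant overflows
print(logdet_spd(sigma))     # 500 * log(10) ≈ 1151.29
```

NumPy also ships `np.linalg.slogdet` for exactly this reason, though it uses LU rather than Cholesky and so works for non-symmetric matrices too.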