Can the inverse of a covariance matrix be computed faster than the inverse of an arbitrary matrix?

covariance, linear algebra, matrices, matrix decomposition

In my computer program I need to find the inverse of a covariance matrix. For that I use NumPy (np.linalg.inv), but these calculations are too slow for my purposes.

I wonder if the computation of the inverse can be sped up by using special properties of the matrix.

First, we know that the matrix we are trying to invert is a covariance matrix, so it is always symmetric positive semi-definite.

Second, in my case, it is built from smaller matrices:

C = np.dot(np.transpose(B), B) + np.diag(G)

That is, the covariance matrix is given by the product of a smaller matrix with its transpose, plus a diagonal matrix.
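In matrix notation: $C = B^T B + \operatorname{diag}(G)$.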

Can I use these properties to compute the inverse more quickly?

Best Answer

I suspect that using the Cholesky decomposition would make things more efficient. Note that if $C = LL^T$, then $C^{-1} = L^{-T}L^{-1}$. So I recommend you calculate the inverse as follows:

import numpy as np

P = np.linalg.inv(np.linalg.cholesky(C))  # P = L^{-1}, where C = L @ L.T
C_inv = P.T @ P                           # C^{-1} = L^{-T} @ L^{-1}

Actually, I'm not sure the built-in inverse algorithm takes advantage of the triangular structure, so you might get a further speedup by using scipy.linalg.solve_triangular to invert the Cholesky factor instead.
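For example, a minimal sketch of that route (the random B and G here are just placeholder data so the snippet is self-contained):

import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(0)
B = rng.standard_normal((50, 200))                   # placeholder data
C = B.T @ B + np.diag(rng.uniform(1.0, 2.0, 200))    # symmetric positive definite

L = np.linalg.cholesky(C)                                # C = L @ L.T
P = solve_triangular(L, np.eye(L.shape[0]), lower=True)  # solves L @ P = I, so P = L^{-1}
C_inv = P.T @ P                                          # C^{-1} = L^{-T} @ L^{-1}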

You might also find that

import scipy.linalg
C_inv = scipy.linalg.pinvh(C)

performs well. Since pinvh works from an eigendecomposition of the symmetric matrix, it also copes with the positive semi-definite case, where a plain inverse may not exist.
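Finally, the low-rank-plus-diagonal structure from the question suggests the Woodbury identity,

$(D + B^T B)^{-1} = D^{-1} - D^{-1} B^T (I + B D^{-1} B^T)^{-1} B D^{-1}$, with $D = \operatorname{diag}(G)$,

which only ever solves a small $k \times k$ system. A minimal sketch, assuming B has shape $(k, n)$ with $k \ll n$ and every entry of G is strictly positive (the function name is just illustrative):

import numpy as np

def inv_lowrank_plus_diag(B, G):
    # Invert C = B.T @ B + np.diag(G) via the Woodbury identity.
    k = B.shape[0]
    D_inv = 1.0 / G                   # inverse of the diagonal part
    BD = B * D_inv                    # equals B @ np.diag(D_inv)
    S = np.eye(k) + BD @ B.T          # small k x k capacitance matrix
    return np.diag(D_inv) - BD.T @ np.linalg.solve(S, BD)

The dominant cost for forming the dense inverse drops from $O(n^3)$ to $O(n^2 k)$, so this pays off exactly when $B$ is much wider than it is tall.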
