Solved – Inverse of Covariance Matrix

I'm working on some code to find the inverse of a covariance matrix of arbitrary dimension. Obviously, the matrix will always be symmetric, and I think it's that fact that I need to take advantage of to make this easier, but am failing. Any advice other than brute force elementary operations and checks?

Thanks

Best Answer

A symmetric matrix $A$ raised to the power $\alpha$ can be computed via its spectral decomposition, i.e. $A^{\alpha} = \Gamma \Lambda^{\alpha} \Gamma'$, where $\Gamma$ is the orthogonal matrix whose columns are the normalized eigenvectors of $A$ and $\Lambda$ is the diagonal matrix of the corresponding eigenvalues. Hence the inverse of $A$ is obtained by setting $\alpha = -1$. The inverse of the diagonal matrix $\Lambda$ is found by simply inverting each diagonal element, i.e. $\Lambda^{-1} = \operatorname{diag}(1/\lambda_1,\dots,1/\lambda_p)$; this requires all eigenvalues to be nonzero, i.e. the covariance matrix must be nonsingular.
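To see why $\alpha = -1$ really gives the inverse, use the fact that $\Gamma$ is orthogonal, so $\Gamma'\Gamma = \Gamma\Gamma' = I$:

$$A\,(\Gamma \Lambda^{-1} \Gamma') = \Gamma \Lambda \Gamma'\,\Gamma \Lambda^{-1} \Gamma' = \Gamma \Lambda \Lambda^{-1} \Gamma' = \Gamma\Gamma' = I.$$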

An example in R code looks like this:

A <- cov(X)                    # covariance matrix of the data X
E <- eigen(A)                  # eigenvectors and eigenvalues of A
Lambda <- diag(1 / E$values)   # diagonal matrix with the inverted eigenvalues
Gamma <- E$vectors             # eigenvectors (as columns)
A_Inverse <- Gamma %*% Lambda %*% t(Gamma)  # compute the inverse
round(A_Inverse %*% A, 5)      # check: should be the identity matrix
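
As a sanity check, here is a minimal sketch that compares the spectral-decomposition inverse with R's built-in solve(), and reuses the same decomposition for another power ($\alpha = -1/2$). The data matrix X below is simulated purely for illustration; in practice you would use your own data.

set.seed(1)                               # hypothetical example data, for illustration only
X <- matrix(rnorm(200 * 4), ncol = 4)     # n = 200 observations, p = 4 variables
A <- cov(X)
E <- eigen(A, symmetric = TRUE)           # symmetric = TRUE uses the symmetric eigen-solver
A_inv_spec <- E$vectors %*% diag(1 / E$values) %*% t(E$vectors)
A_inv_solve <- solve(A)                   # R's general-purpose inverse, for comparison
max(abs(A_inv_spec - A_inv_solve))        # should be numerically tiny

# The same idea gives other powers, e.g. the inverse square root (alpha = -1/2):
A_inv_sqrt <- E$vectors %*% diag(1 / sqrt(E$values)) %*% t(E$vectors)
round(A_inv_sqrt %*% A %*% A_inv_sqrt, 5) # check: should be the identity matrix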