Solved – How to interpret an inverse covariance or precision matrix

covariance-matrix, interpretation

I was wondering whether anyone could point me to some references that discuss the interpretation of the elements of the inverse covariance matrix, also known as the concentration matrix or the precision matrix.

I have access to Cox and Wermuth's Multivariate Dependencies, but what I'm looking for is an interpretation of each element in the inverse matrix. Wikipedia states: "The elements of the precision matrix have an interpretation in terms of partial correlations and partial variances," which leads me to this page. Is there an interpretation that does not use linear regression, i.e., one in terms of covariances or geometry?

Best Answer

There are basically two things to be said. The first is that the density of the multivariate normal distribution (with mean 0 here) is proportional to $$\exp\left(-\frac{1}{2}x^T P x\right)$$ where $P = \Sigma^{-1}$ is the inverse of the covariance matrix, also called the precision matrix. This matrix is positive definite and defines via $$(x,y) \mapsto x^T P y$$ an inner product on $\mathbb{R}^p$. The resulting geometry, which gives specific meaning to the concept of orthogonality and defines a norm related to the normal distribution, is important; to understand the geometric content of LDA, for instance, you need to view things in the light of the geometry given by $P$.
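As a minimal sketch of this point (using NumPy, with an arbitrary example covariance matrix chosen purely for illustration), the inner product and norm defined by $P$ can be computed directly:

```python
import numpy as np

# Hypothetical covariance matrix, assumed for illustration; any
# symmetric positive definite matrix will do.
Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.5],
                  [0.3, 0.5, 1.0]])

P = np.linalg.inv(Sigma)  # the precision (concentration) matrix

def precision_inner(x, y, P=P):
    """The inner product (x, y) -> x^T P y induced by the precision matrix."""
    return x @ P @ y

x = np.array([1.0, 0.0, -1.0])
y = np.array([0.5, 2.0, 1.0])

print(precision_inner(x, y))           # inner product of x and y
print(np.sqrt(precision_inner(x, x)))  # the induced norm of x

# All eigenvalues of P are positive, confirming it is positive
# definite and hence defines a valid inner product.
print(np.linalg.eigvalsh(P))
```

The induced norm $\sqrt{x^T P x}$ is exactly the Mahalanobis distance of $x$ from the mean, which is what appears in the exponent of the normal density above.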

The other thing to be said is that the partial correlations can be read off directly from $P$, see here. The same Wikipedia page notes that the partial correlations, and thus the entries of $P$, have a geometrical interpretation as the cosine of an angle. What is, perhaps, more important in the context of partial correlations is that the partial correlation between $X_i$ and $X_j$ is 0 if and only if entry $i,j$ in $P$ is zero. For the normal distribution, the variables $X_i$ and $X_j$ are then conditionally independent given all other variables. Conditional independence and graphical models are what Steffen's book, which I referred to in the comment above, is all about. It has a fairly complete treatment of the normal distribution, but it may not be that easy to follow.
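A short sketch of this second point, again with NumPy and the same hypothetical covariance matrix: the partial correlation between $X_i$ and $X_j$ given the remaining variables is $-P_{ij}/\sqrt{P_{ii}P_{jj}}$, so zero entries of $P$ correspond exactly to zero partial correlations:

```python
import numpy as np

Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.5],
                  [0.3, 0.5, 1.0]])
P = np.linalg.inv(Sigma)

# Partial correlation of X_i and X_j given all other variables:
# rho_{ij | rest} = -P_ij / sqrt(P_ii * P_jj)
d = np.sqrt(np.diag(P))
partial_corr = -P / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

print(np.round(partial_corr, 3))
# For a multivariate normal, partial_corr[i, j] == 0 (equivalently
# P[i, j] == 0) means X_i and X_j are conditionally independent
# given the remaining variables.
```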
