Covariance Matrices – Measures of Similarity or Distance

covariance-matrix, distributions, hypothesis-testing, information-theory, kullback-leibler

Are there any measures of similarity or distance between two symmetric covariance matrices (both having the same dimensions)?

I am thinking here of analogues to the KL divergence between two probability distributions, or the Euclidean distance between two vectors, except applied to matrices. I imagine there are quite a few such measures.

Ideally I would also like to test the null hypothesis that two covariance matrices are identical.

Best Answer

You can use any of the norms $\| A-B \|_p $ (see Wikipedia for a variety of matrix norms). Note that the square root of the sum of squared entries, $\sqrt{\sum_{i,j} (a_{ij}-b_{ij})^2}$, is called the Frobenius norm, and is different from the $L_2$ (spectral) norm, which is the square root of the largest eigenvalue of $(A-B)^2$, although of course they generate the same topology. The K-L divergence between two normal distributions with the same mean (say zero) and covariance matrices $B$ and $A$, respectively, is also available in Wikipedia: $\frac12 \left[ \operatorname{tr} (A^{-1}B) - d - \ln\!\left( |B|/|A| \right) \right]$, where $d$ is the dimension.
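As a quick illustration, here is a minimal NumPy sketch of the three quantities above (the function names are my own, not standard API):

```python
import numpy as np

def frobenius_distance(A, B):
    """Frobenius norm of the difference: sqrt of the sum of squared entries."""
    return np.linalg.norm(A - B, ord="fro")

def spectral_distance(A, B):
    """Spectral (L2 / operator) norm: largest singular value of A - B."""
    return np.linalg.norm(A - B, ord=2)

def kl_gaussians(A, B):
    """KL divergence KL(N(0, B) || N(0, A)) for covariance matrices A, B."""
    d = A.shape[0]
    A_inv_B = np.linalg.solve(A, B)              # A^{-1} B
    _, logdet = np.linalg.slogdet(A_inv_B)       # ln(|B| / |A|)
    return 0.5 * (np.trace(A_inv_B) - d - logdet)

# Example: two small symmetric positive-definite matrices
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.5, 0.1], [0.1, 1.2]])
print(frobenius_distance(A, B), spectral_distance(A, B), kl_gaussians(A, B))
```

Note that `kl_gaussians` is not symmetric in its arguments, and all three quantities are zero exactly when $A = B$.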

Edit: if one of the matrices is a model-implied matrix and the other is the sample covariance matrix, then of course you can form a likelihood ratio test between the two. My personal favorite collection of such tests for simple structures is given in Rencher (2002), Methods of Multivariate Analysis. More advanced cases are covered in covariance structure modeling, for which a reasonable starting point is Bollen (1989), Structural Equations with Latent Variables.
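As one concrete instance, a sketch of the one-sample likelihood-ratio test of $H_0: \Sigma = \Sigma_0$ against a sample covariance matrix $S$ from $n$ observations, using the asymptotic $\chi^2$ approximation with $p(p+1)/2$ degrees of freedom (the exact multiplier, $n$ versus $n-1$, and finite-sample corrections vary by reference; the function name is my own):

```python
import numpy as np
from scipy.stats import chi2

def lrt_covariance(S, Sigma0, n):
    """Asymptotic likelihood-ratio test of H0: Sigma = Sigma0,
    given the sample covariance S computed from n observations."""
    p = S.shape[0]
    M = np.linalg.solve(Sigma0, S)            # Sigma0^{-1} S
    _, logdet = np.linalg.slogdet(M)          # ln |Sigma0^{-1} S|
    stat = n * (np.trace(M) - logdet - p)     # -2 log likelihood ratio
    df = p * (p + 1) // 2
    return stat, chi2.sf(stat, df)            # statistic, p-value

# Simulated example: data actually drawn under H0
rng = np.random.default_rng(0)
Sigma0 = np.eye(3)
X = rng.multivariate_normal(np.zeros(3), Sigma0, size=200)
S = np.cov(X, rowvar=False)
stat, pval = lrt_covariance(S, Sigma0, len(X))
```

The statistic is nonnegative because $x - \ln x \ge 1$ for each eigenvalue $x$ of $\Sigma_0^{-1}S$, with equality only when $S = \Sigma_0$.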
