Solved – Covariance matrix: Meaning of the number of eigenvalues and vectors

Tags: covariance, covariance-matrix, random-variable

Let $X$ be a random variable with an $n\times n$ covariance matrix $A$.

Then $A$ is symmetric and positive semi-definite, and it has real and non-negative eigenvalues.

Also, if none of the components of $X$ is a linear combination of the other components, then $A$ is positive definite. Equivalently, it has only strictly positive eigenvalues. Equivalently, it has a positive determinant and is invertible.
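For illustration, here is a minimal NumPy sketch (the variable names and the three-component example are my own, not from the original post) showing that when one component is an exact linear combination of the others, the covariance matrix acquires a zero eigenvalue and is singular:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent components, plus a third that is an exact linear combination of them.
x1 = rng.normal(size=10_000)
x2 = rng.normal(size=10_000)
x3 = 2.0 * x1 - x2                  # linearly dependent component

X = np.vstack([x1, x2, x3])         # rows are the components of the random vector
A = np.cov(X)                       # 3 x 3 sample covariance matrix

eigvals = np.linalg.eigvalsh(A)     # symmetric matrix, so eigenvalues are real
print(eigvals)                      # smallest eigenvalue is numerically zero
print(np.linalg.matrix_rank(A))     # rank 2: A is singular, hence not positive definite
```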

My question is: What does the number of eigenvalues/eigenvectors of $A$ tell us about the covariance matrix $A$? What does it tell us about the components of the random variable $X$ that has covariance matrix $A$?

Best Answer

The number of eigenvalues by itself tells us nothing we did not (probably) already know, namely the dimension of the original matrix $A$ (or the dimension of the system of equations represented by $A$, depending on how we see things).

Other counts and properties are informative, though:

- The number of unique eigenvalues can be informative (e.g. the number of unique solutions to the system of equations defined by $A$).
- The number of non-zero eigenvalues can be informative (e.g. the rank of $A$).
- The number of positive or negative eigenvalues can be informative (e.g. whether $A$ is positive semi-definite, negative semi-definite, etc.).
- Whether the eigenvalues are real or complex can be informative (e.g. whether $A$ is symmetric - the eigenvalues of a symmetric matrix are always real).
- Their magnitudes relative to each other can be informative (e.g. the condition number of $A$, the compactness of $A$'s spectrum, etc.).

But their sheer number? Not much, really (see the sketch after this list).
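As a concrete sketch of the quantities named above (NumPy assumed; the data and thresholds here are made up purely for illustration), the rank, a positive semi-definiteness check, and the condition number can all be read straight off the eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)

X = rng.normal(size=(4, 5_000))              # four components, 5000 samples
A = np.cov(X)                                # 4 x 4 sample covariance matrix

eigvals = np.linalg.eigvalsh(A)              # real eigenvalues, sorted ascending

n_nonzero = int(np.sum(eigvals > 1e-10))     # numerical rank of A
is_psd    = bool(np.all(eigvals >= -1e-10))  # all eigenvalues non-negative?
cond      = eigvals[-1] / eigvals[0]         # condition number = largest / smallest

print(n_nonzero, is_psd, cond)
```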

The only thing I would draw extra attention to is that the number of eigenvalues is NOT the cardinality of the spectrum, which is the number of unique eigenvalues. In general, recall that the fundamental theorem of algebra guarantees that every polynomial of degree one or more has a (possibly complex) root, and we are just solving the characteristic polynomial of $A$, $p_A(\lambda)=|A- \lambda I|=0$, to get the eigenvalues. The fact that we get $n$ of them (counted with multiplicity) doesn't say much.

As whuber commented, please note that a single value might occur as a root of the characteristic equation $\nu$ times, in which case we say that value has (algebraic) multiplicity $\nu$. To that extent, when we deal with an $n \times n$ matrix we have an $n$-th degree polynomial, and as such the sum of the multiplicities of the distinct eigenvalues is $n$.
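To make the multiplicity point concrete (a toy example I added, not from the original answer): the covariance matrix $\sigma^2 I_3$ of three uncorrelated components with equal variance has characteristic polynomial $(\sigma^2-\lambda)^3$, so there are $n=3$ eigenvalues counted with multiplicity but only one distinct value in the spectrum:

```python
import numpy as np

sigma2 = 2.0
A = sigma2 * np.eye(3)          # covariance of 3 uncorrelated, equal-variance components

eigvals = np.linalg.eigvalsh(A)
print(eigvals)                  # [2. 2. 2.]: n = 3 eigenvalues counted with multiplicity
print(np.unique(eigvals).size)  # 1: the spectrum contains a single distinct value
```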

The Math.SE thread "Eigenvalues are unique?" is also informative on the matter.
