Repeated roots of $\det(A-xI)$

Tags: eigenvalues-eigenvectors, linear-algebra, matrices, matrix-rank

The solutions of the equation $\det(A-xI)=0$ are the eigenvalues of the matrix $A$. This confuses me because the equation can give repeated roots. Suppose there is a root $x=\lambda$ of multiplicity $n$, and the corresponding eigenvectors are $\{v_1,v_2,\dots,v_n\}$. Then any linear combination of these vectors is an eigenvector, because $A(v_1+v_2)=Av_1+Av_2=\lambda v_1+\lambda v_2=\lambda(v_1+v_2)$.

What I find puzzling is that those $n$ vectors don't have to be linearly independent. For example, the matrix $\begin{bmatrix}1&0\\-4&1\end{bmatrix}$ has a repeated eigenvalue $\lambda=1$ of multiplicity 2, but the corresponding vector space of eigenvectors is only one-dimensional, spanned by $(0,1)^T$. This leads me to the following questions.
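
As a quick numerical check of this example (a minimal NumPy sketch, not part of the original question):

```python
import numpy as np

# The matrix from the example above.
A = np.array([[1.0, 0.0],
              [-4.0, 1.0]])

# Both roots of the characteristic polynomial are 1,
# i.e. lambda = 1 has multiplicity 2.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # [1. 1.]

# But the eigenspace is only 1-dimensional:
# rank(A - 1*I) = 1, so dim ker(A - 1*I) = 2 - 1 = 1.
print(np.linalg.matrix_rank(A - np.eye(2)))  # 1
```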

If a non-singular matrix $A$ satisfies $\operatorname{rank}(A-\lambda I)=\operatorname{rank}(A)-n$, does the equation $\det(A-xI)=0$ always have a repeated root of multiplicity greater than or equal to $n$ at $x=\lambda$?

Under what conditions are those $\{v_1,v_2,\dots,v_n\}$ above linearly independent?
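
Regarding the first question, here is what the rank condition looks like numerically (a SymPy sketch; the $3\times 3$ matrix is my own illustrative choice, not from the original post):

```python
import sympy as sp

x = sp.symbols('x')

# Illustrative non-singular matrix: lambda = 2 has a 1-dimensional
# eigenspace but is a double root of the characteristic polynomial.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])
lam = 2

n = A.rank() - (A - lam * sp.eye(3)).rank()      # rank drop: 3 - 2 = 1
mult = sp.roots((A - x * sp.eye(3)).det())[lam]  # multiplicity of the root lambda
print(n, mult)                                   # 1 2  -> multiplicity >= n
```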

Best Answer

This is not a full answer. It's just a collection of thoughts related to your predicament. It is, however, a bit long for a comment.

What you have here is an example of an eigenvalue whose geometric multiplicity (the dimension of the corresponding eigenspace) is strictly smaller than its algebraic multiplicity (the multiplicity of the corresponding root of the characteristic polynomial).
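
For the $2\times 2$ matrix from the question, both multiplicities can be read off directly (a SymPy sketch; `eigenvects` returns each eigenvalue together with its algebraic multiplicity and a basis of its eigenspace):

```python
import sympy as sp

A = sp.Matrix([[1, 0],
               [-4, 1]])

# For each eigenvalue: (value, algebraic multiplicity, eigenspace basis).
# Here the basis has length 1 even though the algebraic multiplicity is 2.
for val, alg_mult, basis in A.eigenvects():
    print(val, alg_mult, len(basis))  # 1 2 1
```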

One concept invented to resolve this is that of generalized eigenvectors: vectors $v$ for which $(A-\lambda I)v$ isn't necessarily $0$, but $(A-\lambda I)^nv=0$ for some natural number $n$. You can always find enough generalized eigenvectors for a given eigenvalue to make up for the discrepancy you noted. In your case, any vector in the plane is a generalized eigenvector corresponding to the eigenvalue $1$, which matches the multiplicity of the root $\lambda=1$ of your characteristic polynomial.
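
This is easy to verify for the matrix in the question (a sketch; the vector $v$ is an arbitrary choice): $(A-I)^2$ is the zero matrix, so every vector in the plane satisfies $(A-I)^2v=0$.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [-4.0, 1.0]])
N = A - np.eye(2)

print(N @ N)              # [[0. 0.] [0. 0.]] -- (A - I)^2 = 0
v = np.array([3.0, 7.0])  # an arbitrary vector (illustrative choice)
print(N @ N @ v)          # [0. 0.] -- v is a generalized eigenvector
print(N @ v)              # [0. -12.] -- but not an ordinary eigenvector
```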

If every eigenvalue's eigenspace has the "correct" dimension, then the matrix, and the linear transformation it represents, is diagonalizable. If some eigenspace is "too small", then it is not. The resolution here, related to the generalized eigenvectors above, is the so-called Jordan normal form: when $A$ is written with respect to a suitable basis of generalized eigenvectors, the resulting matrix is almost diagonal. It has the eigenvalues on the diagonal and $1$'s in some places along the superdiagonal (and $0$'s everywhere else).
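
For the question's matrix, SymPy can compute this form directly (a sketch; `jordan_form` returns $P$ and $J$ with $A = PJP^{-1}$):

```python
import sympy as sp

A = sp.Matrix([[1, 0],
               [-4, 1]])
P, J = A.jordan_form()

print(J)  # Matrix([[1, 1], [0, 1]]): eigenvalue on the diagonal, a 1 above it
print(sp.simplify(P * J * P.inv() - A))  # zero matrix, confirming A = P*J*P**-1
```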
