[Math] Orthogonal eigenvectors in symmetrical matrices with repeated eigenvalues and diagonalization

linear algebra

Symmetric matrices have orthogonal eigenvectors. However, there is the special case in which eigenvalues are repeated; the extreme example is the identity matrix. Professor Strang mentions here that "if an eigenvalue is repeated, then there is a whole plane of eigenvectors, and in that plane we can choose perpendicular ones"… a "real substantial freedom."

And he goes on to note that symmetric matrices can be diagonalized as ${\bf A = Q\Lambda Q'}$. Although he does mention "… I also mean that there is a full set of them [eigenvectors]", it sounds (in the video) as though the ${\bf Q\Lambda Q'}$ decomposition is not necessarily jeopardized by the presence of repeated eigenvalues.

However, for general square matrices (not limiting ourselves to symmetric ones), repeated eigenvalues can render the matrix non-diagonalizable as ${\bf A = S\Lambda S^{-1}}$: the eigenspace of a repeated eigenvalue may be too small to supply a full basis of eigenvectors.
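A quick numerical sketch of that failure mode (using NumPy; the specific matrix is my own illustrative choice, not from the question): the classic $2\times 2$ Jordan block has the repeated eigenvalue $1$ but only a one-dimensional eigenspace, so no invertible $S$ of eigenvectors exists.

```python
import numpy as np

# A non-symmetric matrix with repeated eigenvalue 1 (a Jordan block).
# Its eigenspace is one-dimensional, so A = S @ L @ inv(S) is impossible.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(np.allclose(eigvals, [1.0, 1.0]))              # True: eigenvalue 1, twice
# The two computed eigenvectors are (numerically) parallel:
print(np.linalg.matrix_rank(eigvecs, tol=1e-8))      # rank 1, not 2
```

Since the eigenvector matrix is singular, it cannot serve as the $S$ in ${\bf A = S\Lambda S^{-1}}$.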

How does all this come together into a question? The matrix ${\bf A'A}$ is symmetric and positive semidefinite, and among its properties is being diagonalizable. Yet its eigenvalues do not have to be distinct (real and nonnegative, yes; but not necessarily of algebraic multiplicity $1$).
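To make that concrete (a NumPy sketch of my own, not part of the question): a simple ${\bf A}$ whose Gram matrix ${\bf A'A}$ has the repeated eigenvalue $1$, yet still diagonalizes with an orthonormal ${\bf Q}$.

```python
import numpy as np

# A'A for this A is diag(1, 1, 0): eigenvalue 1 is repeated.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
G = A.T @ A                      # symmetric positive semidefinite

eigvals, Q = np.linalg.eigh(G)   # eigh exploits symmetry, returns orthonormal Q
print(np.all(eigvals >= -1e-10))                     # True: eigenvalues >= 0
print(np.allclose(np.sort(eigvals), [0.0, 1.0, 1.0]))  # True: repeated eigenvalue
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, G))    # True: G = Q Lambda Q'
```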

So…

  1. Is it correct to say that symmetric matrices always have orthogonal eigenvectors (or that we can choose them to be orthogonal), guaranteeing the ${\bf Q\Lambda Q'}$ decomposition regardless of the possible presence of repeated eigenvalues?

  2. If (1) is not true, are we then stuck with a caveat to the assertion that ${\bf A'A}$ matrices are diagonalizable? Or can we say, as a blanket statement, that ${\bf A'A}$ is diagonalizable?

Best Answer

By the spectral theorem, given a symmetric matrix $A$, there exists an orthonormal basis of eigenvectors of $A$. So, yes, a symmetric matrix always has orthogonal eigenvectors.

For example, the identity matrix has $1$ as its only (repeated) eigenvalue, yet there still exists an orthonormal basis of eigenvectors of the identity matrix; indeed, any orthonormal basis will do.
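The same holds for less trivial repeated eigenvalues. Here is a NumPy sketch (my own example, not from the answer) with a symmetric matrix whose eigenvalue $1$ has multiplicity $2$; `eigh` still returns a fully orthonormal ${\bf Q}$ satisfying ${\bf A = Q\Lambda Q'}$.

```python
import numpy as np

# Symmetric matrix with eigenvalues 1 (multiplicity 2) and 4.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

eigvals, Q = np.linalg.eigh(A)   # orthonormal eigenvectors, even in the repeated eigenspace
print(np.allclose(eigvals, [1.0, 1.0, 4.0]))         # True: eigenvalue 1 is repeated
print(np.allclose(Q.T @ Q, np.eye(3)))               # True: Q is orthogonal
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, A))    # True: A = Q Lambda Q'
```

Within the plane of eigenvectors for the repeated eigenvalue, `eigh` has simply picked one perpendicular pair, exactly the "real substantial freedom" the question quotes.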

In summary, you're correct to say that you can always find a basis of orthogonal eigenvectors given a symmetric matrix, regardless of whether the eigenvectors correspond to repeated eigenvalues.