Linear Algebra – Eigenvalue of Multiplicity k of a Real Symmetric Matrix Has Exactly k Linearly Independent Eigenvectors

eigenvalues-eigenvectors, linear-algebra, symmetric-matrices

If $A$ is an $n \times n$ real symmetric matrix, then $A$ is diagonalisable. In other words, if $A$ is a symmetric $n \times n$ matrix, then there exists an orthogonal matrix $P$ such
that $P^{-1}AP = P^{T}AP = D$, a diagonal matrix. The eigenvalues of $A$ lie on the
main diagonal of $D$.
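As a quick numerical sanity check of this statement (the matrix below is an arbitrary example, assuming NumPy is available), `numpy.linalg.eigh` — NumPy's eigensolver for symmetric matrices — returns an orthogonal $P$ with $P^{T}AP$ diagonal:

```python
import numpy as np

# An arbitrary 3x3 real symmetric matrix (illustrative example)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh is the routine for symmetric/Hermitian matrices; it returns
# real eigenvalues (ascending) and orthonormal eigenvectors as columns.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

# P is orthogonal: P^T P = I, hence P^{-1} = P^T
assert np.allclose(P.T @ P, np.eye(3))
# P^T A P = D, the diagonal matrix of eigenvalues
assert np.allclose(P.T @ A @ P, D)
```

Note that this particular $A$ has eigenvalues $1, 3, 3$, so it also illustrates the repeated-eigenvalue case asked about below.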

The proof of this statement splits into two cases.

It is clear that all eigenvalues of a real symmetric matrix are real, and if they are all distinct, then the eigenvectors $x_i$, $i=1,2,\dots,n$, corresponding to $\lambda_i$, $i=1,2,\dots,n$, are orthogonal. We can obtain an orthonormal set of eigenvectors $u_i$, $i=1,2,\dots,n$, from these $x_i$. Then we can construct the matrix $P=[u_1 \; u_2 \; \dots \; u_n]$ such that $D=P^{-1}AP$.
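The distinct-eigenvalue case can be illustrated numerically (a minimal sketch, assuming NumPy; the matrix is an arbitrary example with three distinct eigenvalues $3$ and $3\pm\sqrt{3}$): the eigenvectors come out pairwise orthogonal, and normalising them yields an orthogonal $P$ diagonalising $A$.

```python
import numpy as np

# Arbitrary symmetric matrix with three distinct eigenvalues
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Generic eigensolver: orthogonality is not imposed, it emerges
# because A is symmetric with distinct eigenvalues.
eigvals, X = np.linalg.eig(A)

# Off-diagonal entries of X^T X vanish: the x_i are orthogonal
G = X.T @ X
assert np.allclose(G - np.diag(np.diag(G)), 0.0)

# Normalising each column gives the orthonormal u_i, hence an
# orthogonal P with P^T A P = D diagonal
P = X / np.linalg.norm(X, axis=0)
D = P.T @ A @ P
assert np.allclose(D - np.diag(np.diag(D)), 0.0)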

I have some trouble with the second case, that is, when not all eigenvalues are simple.

How can I prove that an eigenvalue of multiplicity $k$ of a real symmetric matrix $A$ has exactly $k$ linearly independent eigenvectors, i.e., that the dimension of the solution space of $(A-\lambda I)x=0$ is $k$?
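The claim can be checked numerically on a concrete instance (a hypothetical example, assuming NumPy): for a symmetric matrix with an eigenvalue of algebraic multiplicity $2$, the null space of $A-\lambda I$ indeed has dimension $2$.

```python
import numpy as np

# Symmetric matrix whose characteristic polynomial has the root 3
# with multiplicity 2 (eigenvalues: 1, 3, 3) -- arbitrary example
A = np.array([[3.0, 0.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
lam = 3.0

# Geometric multiplicity = dim null(A - lam*I) = n - rank(A - lam*I)
geo = 3 - np.linalg.matrix_rank(A - lam * np.eye(3))
assert geo == 2  # equals the algebraic multiplicity, as claimed
```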

Best Answer

I'd say you have three ways:

  1. Prove first that a symmetric matrix is diagonalizable. Then observe that your desired property holds for diagonal matrices, and that a similarity transform preserves both algebraic and geometric multiplicities (e.g., see Prop. 7.5 here). I think this is the most common route.

  2. Continuity argument: in the space of symmetric matrices, we can pass continuously from distinct eigenvalues to repeated eigenvalues; because the starting point had orthogonal eigenvectors, these cannot degenerate into linearly dependent eigenvectors. This argument is easy to visualize but hard to formalize.

  3. Prove it directly. Here's a sketch (essentially from here):

Let ${\bf A}$ be an $n \times n$ symmetric matrix, and let $\lambda_i$ be an eigenvalue with (algebraic) multiplicity $1<m \le n$. Then there exists some eigenvector ${\bf p}_{i1}$ with $|{\bf p}_{i1}|=1$.

Let ${\bf B}=({\bf p}_{i1} \, {\bf C})$ be an orthogonal matrix (orthonormal columns) with ${\bf p}_{i1}$ as its first column (it can be constructed by the Gram-Schmidt process). Then consider

$$ {\bf B}' {\bf A} {\bf B} = \begin{pmatrix} \lambda_i &0 \\ 0 & {\bf C}' {\bf A} {\bf C} \end{pmatrix}$$

By considering the characteristic polynomial, we see that (because the multiplicity of $\lambda_i$ is greater than $1$) $|{\bf C}' {\bf A} {\bf C} -\lambda_i I_{n-1}|=0$. Hence there exists some non-null ${\bf q}$ with $({\bf C}' {\bf A} {\bf C} -\lambda_i I_{n-1}){\bf q}=0$. Next, see that ${\bf p}_{i2}={\bf C}{\bf q}$ is an eigenvector of ${\bf A}$, and is orthogonal to ${\bf p}_{i1}$. We can repeat the procedure (if $m>2$), rebuilding ${\bf B}$ with this eigenvector as the second column, etc.
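The sketch above can be traced numerically (a minimal illustration, assuming NumPy; the matrix and the eigenvalue $\lambda_i = 3$ with $m=2$ are an arbitrary example). Gram-Schmidt is carried out via a QR factorisation, its standard numerical counterpart:

```python
import numpy as np

# Symmetric A with eigenvalue lam = 3 of algebraic multiplicity 2
A = np.array([[3.0, 0.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
lam = 3.0

# One unit eigenvector p1 for lam: (A - lam*I) p1 = 0
p1 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
assert np.allclose(A @ p1, lam * p1)

# Extend p1 to an orthogonal B = (p1 | C) via QR (Gram-Schmidt):
# the first column of Q spans the same line as p1.
M = np.column_stack([p1, np.eye(3)])
Q, _ = np.linalg.qr(M)          # Q is 3x3 with orthonormal columns
B = Q
C = B[:, 1:]

# B' A B has the claimed block structure: lam in the corner,
# zeros in the rest of the first row and column
T = B.T @ A @ B
assert np.isclose(T[0, 0], lam)
assert np.allclose(T[0, 1:], 0.0) and np.allclose(T[1:, 0], 0.0)

# C' A C still has lam as an eigenvalue; pick its eigenvector q
S = C.T @ A @ C
w, V = np.linalg.eigh(S)
q = V[:, np.argmin(np.abs(w - lam))]

# p2 = C q is an eigenvector of A orthogonal to p1
p2 = C @ q
assert np.allclose(A @ p2, lam * p2)
assert np.isclose(p1 @ p2, 0.0)
```

Each pass of the loop in the proof produces one more eigenvector orthogonal to all the previous ones, which is why the procedure yields $m$ linearly independent eigenvectors in total.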
