[Math] Does an n by n Hermitian matrix always have n independent eigenvectors

eigenvalues-eigenvectors, linear-algebra, matrices

I am learning from MIT OCW 18.06 Linear Algebra, and I have learnt:
an arbitrary $n\times n$ matrix $A$ with $n$ independent eigenvectors can be written as
$A=S\Lambda S^{-1}$; for Hermitian matrices, because the eigenvectors can be chosen orthonormal, this can be written further as $A=Q\Lambda Q^H$.

I wonder: does every $n\times n$ Hermitian matrix have $n$ independent eigenvectors, and why?
Thank you!
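As a quick numerical sanity check of the factorization (not part of the course; this assumes NumPy is available), one can verify that `numpy.linalg.eigh` returns orthonormal eigenvectors for a random Hermitian matrix, so that $A=Q\Lambda Q^H$ holds:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2          # a random Hermitian matrix

lam, Q = np.linalg.eigh(A)        # real eigenvalues, orthonormal eigenvectors
assert np.allclose(Q.conj().T @ Q, np.eye(n))         # Q is unitary
assert np.allclose(A, Q @ np.diag(lam) @ Q.conj().T)  # A = Q Λ Q^H
```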

P.S.
MIT 18.06 Linear Algebra, Lecture 25: Symmetric Matrices and Positive Definiteness.
You may wish to start from 4:20. From the course, I think the spectral theorem comes from diagonalizable matrices; it is just the special case in which the eigenvectors are orthonormal. The eigenvectors of Hermitian matrices can be chosen orthonormal, but is every Hermitian matrix diagonalizable? If it is, why?

Best Answer

This is a theorem with a name: it is called the Spectral Theorem for Hermitian (or self-adjoint) matrices. As pointed out by José Carlos Santos, it is a special case of the Spectral Theorem for normal matrices, which is just a little bit harder to prove.

Actually we can prove the spectral theorem for Hermitian matrices right here in a few lines.

We are going to have to think about linear operators rather than matrices. If $T$ is a linear operator on a finite dimensional complex inner product space $V$, its adjoint $T^*$ is another linear operator determined by $\langle T v, w\rangle = \langle v, T^* w \rangle$ for all $v, w \in V$. (Note this is a basis-free description.) $T$ is called Hermitian or self-adjoint if $T = T^*$.
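To make the basis-free definition concrete, here is a small numerical illustration (an assumption of this sketch: the inner product is $\langle x, y\rangle = x^H y$, which is what `np.vdot` computes, so the adjoint of $v \mapsto Bv$ is $w \mapsto B^H w$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# np.vdot conjugates its first argument, so np.vdot(x, y) = x^H y.
lhs = np.vdot(B @ v, w)            # <Tv, w>
rhs = np.vdot(v, B.conj().T @ w)   # <v, T* w> with T* given by B^H
assert np.isclose(lhs, rhs)
```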

Let $B$ be an $n$-by-$n$ complex matrix and $B^*$ the conjugate transpose matrix. Let $T_B$ and $T_{B^*}$ be the corresponding linear operators. Then $(T_B)^* = T_{B^*}$, so a matrix is Hermitian if and only if the corresponding linear operator is Hermitian.

Let $A$ be a Hermitian linear operator on a complex inner product space $V$ of dimension $n$. We need to consider $A$--invariant subspaces of $V$, that is, linear subspaces $W$ such that $A W \subseteq W$. We should think of such a subspace as on an equal footing with our original space $V$. In particular, any such subspace is itself an inner product space, $A_{|W} : W \to W$ is a linear operator on $W$, and $A_{|W}$ is also Hermitian. If $\dim W \ge 1$, $A_{|W}$ has at least one eigenvector $w \in W$ -- because any linear operator at all acting on a (non-zero) finite dimensional complex vector space has at least one eigenvector.

The basic phenomenon is this: Let $W$ be any invariant subspace for $A$. Then $W^\perp$ is also invariant under $A$. The reason is that if $w \in W$ and $x \in W^\perp$, then $$ \langle w, A x\rangle = \langle A^* w , x \rangle = \langle A w, x \rangle = 0, $$ because $Aw \in W$ and $x \in W^\perp$. Thus $A x \in W^\perp$.
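This orthogonality can be checked numerically as well (a small sketch where, for concreteness, $W$ is taken to be the span of a single eigenvector):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2                 # Hermitian

_, Q = np.linalg.eigh(A)
w = Q[:, 0]                              # W = span{w} is A-invariant
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x = x - np.vdot(w, x) * w                # project so that x lies in W^perp
assert np.isclose(np.vdot(w, x), 0)      # x is orthogonal to W
assert np.isclose(np.vdot(w, A @ x), 0)  # A x stays in W^perp
```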

Write $V = V_1$. Take one eigenvector $v_1$ for $A$ in $V_1$. Then $\mathbb C v_1$ is $A$--invariant. Hence $V_2 = (\mathbb C v_1)^\perp$ is also $A$--invariant. Now just apply the same argument to $V_2$: the restriction of $A$ to $V_2$ has an eigenvector $v_2$, and the perpendicular complement $V_3$ of $\mathbb C v_2$ in $V_2$ is $A$--invariant. Continuing in this way, one gets a sequence of mutually orthogonal eigenvectors and a decreasing sequence of invariant subspaces, $V = V_1 \supset V_2 \supset V_3 \supset \dots$, such that $V_k$ has dimension $n - k + 1$. The process stops only when we reach $V_n$, which has dimension 1.
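The inductive construction can be sketched directly in code. This is a NumPy sketch that mirrors the proof, not a practical way to diagonalize a matrix (in practice one calls `numpy.linalg.eigh`); the function name `orthonormal_eigenvectors` is chosen here for illustration:

```python
import numpy as np

def orthonormal_eigenvectors(A):
    """Mirror the proof: pick one eigenvector of the restriction of A to the
    current invariant subspace V_k, then pass to its orthogonal complement."""
    n = A.shape[0]
    basis = np.eye(n, dtype=complex)   # columns: orthonormal basis of V_k
    vecs = []
    while basis.shape[1] > 0:
        Ak = basis.conj().T @ A @ basis          # restriction of A to V_k
        _, U = np.linalg.eig(Ak)                 # any one eigenvector will do
        v = basis @ U[:, 0]                      # lift it back to V
        v /= np.linalg.norm(v)
        vecs.append(v)
        # orthonormal basis of the complement of C v inside V_k:
        # project v out of each basis vector, then re-orthonormalize via SVD
        P = basis - np.outer(v, v.conj() @ basis)
        W, _, _ = np.linalg.svd(P, full_matrices=False)
        basis = W[:, : basis.shape[1] - 1]
    return np.column_stack(vecs)

rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
Q = orthonormal_eigenvectors(A)
assert np.allclose(Q.conj().T @ Q, np.eye(n))         # mutually orthonormal
for j in range(n):
    v = Q[:, j]
    assert np.allclose(A @ v, np.vdot(v, A @ v) * v)  # each is an eigenvector
```

The loop produces exactly the chain $V_1 \supset V_2 \supset \dots \supset V_n$ from the proof: each pass peels off one eigenvector and shrinks the working subspace by one dimension.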
