Does every (complex) $n\times n$ matrix have $n$ linearly independent eigenvectors?

eigenvalues-eigenvectors, linear-algebra, matrices

Every $n\times n$ matrix has a characteristic polynomial of degree $n$. Over $\mathbb{C}$, this polynomial either has $n$ distinct roots or fewer than $n$ distinct roots (with some repeated).

In the former case, eigenvectors corresponding to distinct eigenvalues are linearly independent, so the $n$ distinct eigenvalues yield an eigenbasis of $n$ linearly independent eigenvectors.
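For example, the upper-triangular matrix $$\begin{bmatrix}1 & 1 \\ 0 & 2\end{bmatrix}$$ has two distinct eigenvalues, $1$ and $2$, with corresponding eigenvectors $\begin{bmatrix}1 \\ 0\end{bmatrix}$ and $\begin{bmatrix}1 \\ 1\end{bmatrix}$; these are linearly independent and form an eigenbasis of $\mathbb{C}^2$.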

In the latter case, I can see two possibilities:

  1. There are still $n$ linearly independent eigenvectors, but some share the same eigenvalue, so a $k$-fold root of the characteristic polynomial (an eigenvalue of algebraic multiplicity $k$) corresponds to $k$ linearly independent eigenvectors.

  2. There are fewer than $n$ linearly independent eigenvectors, so a $k$-fold root may correspond to fewer than $k$ linearly independent eigenvectors.

Is one of these two possibilities always true, for all complex $n\times n$ matrices, or could either possibility be true depending on the matrix in question?

Is there a proof of this, or perhaps a counterexample?

Best Answer

Both possibilities can occur, depending on the matrix. Case (1) is exactly the statement that the matrix is diagonalisable over $\mathbb{C}$, of which there are clearly examples. However, matrices like $$\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}$$ are not diagonalisable over $\mathbb{C}$; this one fits your case (2).
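A quick computation shows why this matrix fits case (2): its characteristic polynomial is $(\lambda-1)^2$, so $\lambda = 1$ is a double root, yet
$$\left(\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix} - I\right)\begin{bmatrix}v_1 \\ v_2\end{bmatrix} = \begin{bmatrix}0 & 1 \\ 0 & 0\end{bmatrix}\begin{bmatrix}v_1 \\ v_2\end{bmatrix} = \begin{bmatrix}v_2 \\ 0\end{bmatrix} = \begin{bmatrix}0 \\ 0\end{bmatrix} \;\Longrightarrow\; v_2 = 0,$$
so every eigenvector is a scalar multiple of $\begin{bmatrix}1 \\ 0\end{bmatrix}$: the eigenvalue of algebraic multiplicity $2$ contributes only one linearly independent eigenvector.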

The reason we consider eigenvalues over $\mathbb{C}$ rather than over $\mathbb{R}$ is precisely that over $\mathbb{C}$ the characteristic polynomial splits as a product of linear factors, which does not always happen over $\mathbb{R}$. Hence we always find $n$ eigenvalues (counted with multiplicity), but as the example above shows, this is not sufficient for diagonalisability.
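For a concrete contrast with the real case, consider the rotation matrix
$$R = \begin{bmatrix}0 & -1 \\ 1 & 0\end{bmatrix}, \qquad \det(R - \lambda I) = \lambda^2 + 1.$$
This polynomial has no real roots, so $R$ has no real eigenvalues at all; over $\mathbb{C}$ it factors as $(\lambda - i)(\lambda + i)$, giving two distinct eigenvalues $\pm i$ and hence, by the distinct-eigenvalue argument above, a basis of complex eigenvectors.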
