Eigenvector expansion with non-symmetric matrices

eigenvalues-eigenvectors, jordan-normal-form, matrices

I am struggling with the following question:

When is it true that I can use the (right, if need be) eigenvectors of a finite-dimensional matrix as a basis, and thus write down an eigenvector expansion of a given vector?

I know that part of the answer to my question is the spectral theorem: for symmetric matrices, I can. Moreover, in that case the eigenvectors are orthogonal.

My problem is rather with non-symmetric matrices. Let me explain:

  • First of all, a "logical" consideration: if the right eigenvectors of a non-symmetric matrix could always be used as a basis, I think it would be a well-known result (after all, the spectral theorem is well known).
  • Looking at books by notable authors, I get confused. Everybody says "if the matrix is symmetric, then the eigenvectors form a basis". However, I have never read a statement about non-symmetric matrices. Despite this, some authors simply use the right eigenvectors as a basis, while others explicitly state that they assume the right eigenvectors can be used as a basis.
  • Finally, the power method and the Jordan form: the power method is based on the fact that a generic vector can be written as a linear combination of the eigenvectors of the matrix (see the sketch below). It seems to me that the only condition required for this to be possible is that the Jordan form of the matrix exists. On the other hand, it seems to me that the Jordan form of a matrix always exists. Thus I would conclude that I can always use the right eigenvectors of a non-symmetric matrix as a basis. But it seems crazy to me that nobody ever says that.
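
To make the power-method point concrete, here is the kind of expansion it relies on (a minimal numpy sketch; the matrix and the starting vector are arbitrary illustrative choices):

```python
import numpy as np

# A non-symmetric matrix with eigenvalues 2 and 3 (3 is dominant).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
# A generic starting vector: it has a nonzero component along the
# dominant eigenvector (1, 1)^T, which is what the expansion requires.
x = np.array([1.0, 0.5])

# If x = c1*v1 + c2*v2 in an eigenvector basis, A^k x aligns with the
# eigenvector of the eigenvalue of largest modulus as k grows.
for _ in range(50):
    x = A @ x
    x /= np.linalg.norm(x)

print(x)          # ~ (0.707, 0.707), the dominant eigenvector direction
print(x @ A @ x)  # ~ 3.0, the dominant eigenvalue (Rayleigh quotient)
```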

Can somebody explain to me what the state of the art is regarding eigenvectors and when they form a basis?

Best Answer

Eigenvectors for a nonsymmetric matrix need not form a basis. For example, the only eigenvector (up to scaling) of the matrix $$ \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} $$ is $(1 \ \ 0)^T$. The issue is that zero is an eigenvalue of algebraic multiplicity $2$, but geometric multiplicity $1$.
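
One can verify this numerically (a minimal sketch, assuming numpy; note that `np.linalg.eig` still returns two eigenvector columns, but they are numerically parallel):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# eig returns one eigenvector column per eigenvalue, with repetition.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # [0. 0.]: zero with algebraic multiplicity 2
print(eigenvectors)   # both columns are numerically parallel to (1, 0)^T

# The eigenvector matrix has (numerical) rank 1, so its columns
# cannot form a basis of R^2.
print(np.linalg.matrix_rank(eigenvectors))
```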

There are many well-known statements equivalent to the existence of a basis of eigenvectors. A short list of equivalent conditions (a numerical check is sketched after the list) is

  1. The matrix is diagonalizable.

  2. The Jordan blocks of the matrix are all $1 \times 1$.

  3. The eigenvalues of the matrix have algebraic multiplicity equal to geometric multiplicity.

  4. The minimal polynomial of the matrix has the form $p(x) = \prod_{i=1}^m (x-\lambda_i)$ where the $\lambda_i$ are distinct.
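
Condition 3, for instance, can be tested exactly in a computer algebra system (a minimal sketch, assuming sympy; `multiplicities_match` is an illustrative helper, not a library function):

```python
from sympy import Matrix

def multiplicities_match(A: Matrix) -> bool:
    # A.eigenvects() yields triples (eigenvalue, algebraic multiplicity,
    # basis of the corresponding eigenspace); the matrix is
    # diagonalizable iff the multiplicities agree for every eigenvalue.
    return all(alg_mult == len(eigenspace_basis)
               for _, alg_mult, eigenspace_basis in A.eigenvects())

print(multiplicities_match(Matrix([[0, 1], [0, 0]])))  # False (example above)
print(multiplicities_match(Matrix([[2, 0], [0, 3]])))  # True
```

sympy also ships a built-in `Matrix.is_diagonalizable()` that performs essentially this check.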

An easy sufficient condition for the existence of a basis of eigenvectors is that the matrix has $n$ distinct eigenvalues, i.e., every eigenvalue has algebraic multiplicity one. In some applications authors simply assume diagonalizability, since a random matrix has distinct eigenvalues with probability $1$.
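
This probabilistic claim is easy to observe empirically (a sketch, not a proof; the seed and dimension are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # a generic non-symmetric matrix

eigenvalues, V = np.linalg.eig(A)

# Smallest gap between any pair of eigenvalues: positive for a
# generic draw, so the eigenvalues are distinct.
gaps = [abs(a - b) for i, a in enumerate(eigenvalues)
                   for b in eigenvalues[i + 1:]]
print(min(gaps))

# The eigenvector matrix is well-conditioned, hence invertible:
# its columns form a basis.
print(np.linalg.cond(V))
```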

One may also define generalized eigenvectors for a specified eigenvalue. Given a matrix $A$, say $v$ is a generalized eigenvector for the eigenvalue $\lambda$ if there is some positive integer $k$ such that $$ (A-\lambda I)^k v = 0. $$ The generalized eigenvectors of a matrix always do span the whole space: the columns of the matrix $P$ in the Jordan decomposition $A = PJP^{-1}$ form a basis of generalized eigenvectors.
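
A minimal sketch (assuming sympy) of this fact, using the running $2 \times 2$ example:

```python
from sympy import Matrix, eye

A = Matrix([[0, 1],
            [0, 0]])

# jordan_form returns (P, J) with A == P * J * P.inv().
P, J = A.jordan_form()
print(J)               # a single 2x2 Jordan block for the eigenvalue 0
print(P.det() != 0)    # True: the columns of P form a basis of R^2

# The second column satisfies the defining property with k = 2:
v = P[:, 1]
print((A - 0 * eye(2))**2 * v)   # the zero vector
```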