If A has distinct eigenvalues, and A and B commute, then A’s eigenvectors are B’s eigenvectors

linear-algebra, matrices

The usual proof I see of this statement is this:

Starting from $Ax=\lambda x$, we have

$ABx=BAx=B\lambda x=\lambda Bx$

Thus $x$ and $Bx$ are both eigenvectors of $A$ sharing the same eigenvalue $\lambda$ (unless $Bx=0$). If we assume for convenience that the eigenvalues of $A$ are distinct – so that the eigenspaces are one-dimensional – then $Bx$ must be a multiple of $x$. In other words, $x$ is an eigenvector of $B$ as well as of $A$.

But I don't follow this, because the eigenvalue for $Bx$ is still $\lambda$ above. If we have that $Bx = \frac{\lambda_2}{\lambda} x$, that just means that $ABx = \lambda_2x$, which isn't even the eigenvalue equation. So is the above statement even true?

Best Answer

The vectors $x$ and $Bx$ both belong to the eigenspace of $A$ relative to $\lambda$. Since this eigenspace is one-dimensional, $x$ forms a basis for it, so certainly $$ Bx=\mu x $$ for some scalar $\mu$ (with $\mu=0$ in the case $Bx=0$). But $Bx=\mu x$ is exactly the eigenvalue equation for $B$, so $x$ is an eigenvector of $B$ with eigenvalue $\mu$.

The statement doesn't say that $A$ and $B$ have the same eigenvalues. It says that the eigenvectors of $A$ are also eigenvectors for $B$.
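As a quick numerical sanity check (my own sketch, not part of the original argument), take an $A$ with distinct eigenvalues and let $B$ be a polynomial in $A$, which automatically commutes with $A$. The hypothetical script below uses NumPy to verify that each eigenvector of $A$ satisfies $Bx=\mu x$, with $\mu\neq\lambda$ in general:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # distinct eigenvalues 2 and 3
B = A @ A + np.eye(2)                 # a polynomial in A, hence AB = BA

assert np.allclose(A @ B, B @ A)      # confirm A and B commute

eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    Bx = B @ x
    mu = (Bx @ x) / (x @ x)           # coefficient in Bx = mu * x
    assert np.allclose(Bx, mu * x)    # x is an eigenvector of B as well
    print(f"lambda = {lam:.0f},  mu = {mu:.0f}")
```

With this choice $B=A^2+I$, each $\mu$ comes out as $\lambda^2+1$: the script prints $\mu=5$ for $\lambda=2$ and $\mu=10$ for $\lambda=3$, illustrating that the shared eigenvectors carry different eigenvalues.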

The eigenvalues of $A$ and $B$ can actually be different. Consider the trivial case where $B$ is the identity matrix and $$ A=\begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} $$ Then $AB=BA$ and (obviously) every eigenvector of $A$ is an eigenvector of $B$; the converse is not true in this case, because every nonzero vector in $\mathbb{R}^2$ is an eigenvector of the identity matrix, which is not the case for $A$.
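The same kind of check (again only an illustrative sketch) shows the converse failing numerically: the vector $(1,1)$ is an eigenvector of the identity $B$ but not of $A$.

```python
import numpy as np

A = np.diag([2.0, 3.0])               # eigenvalues 2 and 3
B = np.eye(2)                         # the identity matrix
assert np.allclose(A @ B, B @ A)      # they commute (trivially)

v = np.array([1.0, 1.0])
print(B @ v)   # [1. 1.] -> v is an eigenvector of B (eigenvalue 1)
print(A @ v)   # [2. 3.] -> not a scalar multiple of v, so not an eigenvector of A
```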