[Math] Proof that matrices $A$ and $B$ have the same set of $n$ linearly independent eigenvectors if and only if $AB=BA$.

eigenvalues-eigenvectors

Prove that

$A$ and $B$ have the same set of $n$ linearly independent eigenvectors
$\Leftrightarrow$ $AB=BA$.


I succeeded in proving the forward direction $(\Rightarrow)$ but failed to prove the reverse direction $(\Leftarrow)$.

Let $\Lambda^{(a)}$ and $\Lambda^{(b)}$ be the eigenvalue matrices of $A$ and $B$, respectively, i.e.,
$$\Lambda^{(a)} = \begin{bmatrix}\lambda_1 &0&\cdots &0 \\ 0 &\lambda_2 & \cdots &0\\ \vdots&\vdots &\ddots &\vdots\\0&0&\cdots&\lambda_n
\end{bmatrix} \quad \text{and} \quad \Lambda^{(b)} = \begin{bmatrix}\lambda'_1 &0&\cdots &0 \\ 0 &\lambda'_2 & \cdots &0\\ \vdots&\vdots &\ddots &\vdots\\0&0&\cdots&\lambda'_n
\end{bmatrix}.$$

Let $S$ be the eigenvector matrix of both $A$ and $B$, i.e.
$$S = \begin{bmatrix}x_1&x_2&\cdots&x_n
\end{bmatrix}$$
such that $AS=S\Lambda^{(a)}$ and $BS=S\Lambda^{(b)}$.

Then, using the fact that diagonal matrices such as $\Lambda^{(a)}$ and $\Lambda^{(b)}$ commute, i.e., $\Lambda^{(a)}\Lambda^{(b)}=\Lambda^{(b)}\Lambda^{(a)}$, I can prove that
\begin{align}AB&=(S\Lambda^{(a)}S^{-1})(S\Lambda^{(b)}S^{-1})\\&=S\Lambda^{(a)}\Lambda^{(b)}S^{-1}\\&=S\Lambda^{(b)}\Lambda^{(a)}S^{-1}\\&=(S\Lambda^{(b)}S^{-1})(S\Lambda^{(a)}S^{-1})\\&=BA.\end{align}
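As a quick numerical sanity check of this direction, here is a minimal NumPy sketch (not part of the original post; the size $n=4$ and the random seed are arbitrary choices): it builds $A$ and $B$ from a shared eigenvector matrix $S$ and two diagonal eigenvalue matrices, then confirms $AB=BA$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Shared eigenvector matrix (a random matrix is invertible with probability 1).
S = rng.standard_normal((n, n))
Lam_a = np.diag(rng.standard_normal(n))  # eigenvalue matrix of A
Lam_b = np.diag(rng.standard_normal(n))  # eigenvalue matrix of B

S_inv = np.linalg.inv(S)
A = S @ Lam_a @ S_inv
B = S @ Lam_b @ S_inv

# Diagonal matrices commute, so A and B should commute as well.
print(np.allclose(A @ B, B @ A))  # True
```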


Can someone prove the reverse direction?

Best Answer

If $AB=BA$ and $A$ is diagonalizable with $n$ distinct eigenvalues, then you can prove that every eigenvector of $A$ is also an eigenvector of $B$.

Let $x$ be an eigenvector of $A$ with eigenvalue $\lambda$. Then $A(Bx)=(AB)x=(BA)x=B(Ax)=B(\lambda x)=\lambda (Bx)$. This shows that $Bx$ also lies in the eigenspace of $A$ corresponding to $\lambda$. But this eigenspace has dimension $1$ (by the assumption of distinct eigenvalues), so $Bx$ must be a scalar multiple of $x$, proving that $x$ is an eigenvector of $B$. Applying this to each of the $n$ linearly independent eigenvectors of $A$ shows that $A$ and $B$ share the same eigenvector matrix $S$.
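For a numeric illustration of this argument (a sketch under the answer's assumptions; the eigenvalues $1,\dots,4$ and the choice of $B$ as a polynomial in $A$ are mine, used only to manufacture a commuting pair), one can check that each eigenvector of $A$ is indeed an eigenvector of $B$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# A with distinct eigenvalues 1, 2, 3, 4 and a random eigenbasis.
S = rng.standard_normal((n, n))
A = S @ np.diag([1.0, 2.0, 3.0, 4.0]) @ np.linalg.inv(S)

# Any polynomial in A commutes with A; B is not constructed from S directly.
B = A @ A + 2.0 * A + np.eye(n)
assert np.allclose(A @ B, B @ A)

_, eigvecs = np.linalg.eig(A)
for k in range(n):
    x = eigvecs[:, k]
    Bx = B @ x
    mu = np.vdot(x, Bx) / np.vdot(x, x)  # the scalar with Bx = mu * x
    print(np.allclose(Bx, mu * x))       # True for each eigenvector of A
```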