[Math] Do all diagonalisable matrices have orthogonal eigenvectors?

eigenvalues-eigenvectors, linear-algebra, matrices

So. Let us label right-eigenvectors of $M$ as $|\lambda\rangle$ and left-eigenvectors of $M$ as $\langle\lambda|$. That is,
\begin{gather}
M |\lambda\rangle = \lambda |\lambda\rangle \text{ and } \langle\lambda| M = \langle\lambda| \lambda .
\end{gather}
This makes sense (I hazard) because a left-eigenvector is adjoint to a right-eigenvector via a shared eigenvalue, where ‘adjoint’ need not mean ‘conjugate transpose’. Now, for two possibly distinct eigenvalues we may write
\begin{gather}
M|\lambda\rangle = \lambda |\lambda\rangle \text{ and } \langle\mu| M = \langle\mu| \mu ,
\end{gather}
which implies that both
\begin{gather}
\langle \mu | M | \lambda \rangle = \lambda \langle \mu | \lambda \rangle \text{ and } \langle \mu | M | \lambda \rangle = \mu \langle \mu | \lambda \rangle
\end{gather}
Subtracting these two expressions gives $(\lambda - \mu)\langle\mu|\lambda\rangle = 0$, which immediately implies, upon the assumption that
\begin{gather}
\lambda \neq \mu ,
\end{gather}
that
\begin{gather}
\langle\mu|\lambda\rangle = 0 .
\end{gather}
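For concreteness, here is the sort of example (my own toy illustration) that makes this conclusion seem suspicious:
\begin{gather}
M = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}, \qquad
M \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 1 \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad
M \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 1 \end{pmatrix},
\end{gather}
so $M$ is diagonalisable with distinct eigenvalues, yet its two right-eigenvectors are plainly not orthogonal under the usual inner product: $(1,0)\,(1,1)^T = 1 \neq 0$.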
What have I done wrong?

Best Answer

The answer is yes, but to a different question: Do all diagonalizable matrices have biorthogonal right and left eigenvectors?

To see this, let the matrix $A\in\mathbb{C}^{n\times n}$ be diagonalizable so that there exists a nonsingular matrix $X=[x_1,\ldots,x_n]$ and diagonal $\Lambda=\mathrm{diag}(\lambda_1,\ldots,\lambda_n)$ such that $$A=X\Lambda X^{-1}.$$

A nonzero vector $x$ is called a right eigenvector of $A$ (associated with an eigenvalue $\lambda$) if $Ax=\lambda x$ for some scalar $\lambda$. A nonzero vector $y$ is called a left eigenvector (associated with an eigenvalue $\lambda$) if $y^*A=\lambda y^*$ for some scalar $\lambda$ (or, equivalently, $A^*y=\bar{\lambda} y$).

The columns $x_1,\ldots,x_n$ of the matrix $X$ above are right eigenvectors of $A$. To see this, post-multiply $A=X\Lambda X^{-1}$ by $X$ to get $AX=X\Lambda$ and look at each column of this matrix equation, which states that $Ax_i=\lambda_ix_i$, $i=1,\ldots,n$.
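A small concrete instance (my own toy example, not taken from the question) may help fix ideas. Take
$$A=\begin{pmatrix}1&1\\0&2\end{pmatrix},\qquad X=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad \Lambda=\begin{pmatrix}1&0\\0&2\end{pmatrix},\qquad X^{-1}=\begin{pmatrix}1&-1\\0&1\end{pmatrix},$$
so that $A=X\Lambda X^{-1}$; one checks directly that $AX=X\Lambda$, i.e. $Ax_1=1\cdot x_1$ and $Ax_2=2\cdot x_2$.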

We also have $X^{-1}A=\Lambda X^{-1}$ (simply pre-multiply $A=X\Lambda X^{-1}$ by $X^{-1}$), that is, $Y^*A=\Lambda Y^*$ with $Y=[y_1,\ldots,y_n]=(X^{-1})^*$. The columns $y_1,\ldots,y_n$ of $Y$ are left eigenvectors of $A$. To see this, look at the rows of $Y^*A=\Lambda Y^*$: $y^*_iA=\lambda_i y_i^*$, $i=1,\ldots,n$.
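Continuing the toy example above, $Y=(X^{-1})^*=\begin{pmatrix}1&0\\-1&1\end{pmatrix}$ (since $A$ is real, the conjugation is harmless), and indeed $y_1^*A=(1,-1)A=(1,-1)=1\cdot y_1^*$ and $y_2^*A=(0,1)A=(0,2)=2\cdot y_2^*$.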

The right eigenvectors $\{x_i\}_{i=1}^n$ and left eigenvectors $\{y_i\}_{i=1}^n$ are biorthogonal, that is, $y_i^*x_j=\delta_{ij}$, where $\delta_{ij}$ is the Kronecker delta. This is easily verified from the relation between $X$ and $Y$: $y_i^*x_j=\delta_{ij}$ for all $i,j$ iff $Y^*X=I$ (where $I$ is the identity matrix), and $Y^*X=((X^{-1})^*)^*X=X^{-1}X=I$.
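In the toy example this is immediate:
$$Y^*X=X^{-1}X=\begin{pmatrix}1&-1\\0&1\end{pmatrix}\begin{pmatrix}1&1\\0&1\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix},$$
so $y_i^*x_j=\delta_{ij}$, even though the right eigenvectors themselves are not orthogonal ($x_1^*x_2=1\neq 0$). This is precisely what resolves the question: the bra $\langle\mu|$ in your derivation is a left eigenvector, not the conjugate transpose of a right eigenvector, so the derivation establishes biorthogonality rather than orthogonality of the right eigenvectors.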

P.S.: Sorry for not using your notation :-)