Eigenvalues $\lambda$ and $\lambda^{-1}$ of an orthogonal matrix

eigenvalues-eigenvectors, linear-algebra

I know that $A$ is an orthogonal matrix and $\lambda$ is an eigenvalue of $A$. I need to prove that $\lambda \neq 0$ and that $\lambda^{-1}$ is an eigenvalue of both $A$ and $A^\top$.

Now if $A$ is orthogonal, then $A^\top$ is its inverse, so $A$ is invertible and therefore cannot have $0$ as an eigenvalue. Since $\lambda$ is an eigenvalue of $A$, we can verify that:
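(One way to spell out that step, in case it needs justification: if $AX = 0$ for some eigenvector $X \neq 0$, then multiplying by $A^\top$ and using $A^\top A = I$ gives

$$X = A^\top A X = A^\top (AX) = A^\top 0 = 0,$$

contradicting $X \neq 0$. Hence $\lambda \neq 0$.)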

$$\begin{aligned}
AX = \lambda X
&\;\Leftrightarrow\; A^\top A X = \lambda\, A^\top X \\
&\;\Leftrightarrow\; X = \lambda\, A^\top X \\
&\;\Leftrightarrow\; \lambda^{-1} X = \lambda^{-1} \lambda\, A^\top X \\
&\;\Leftrightarrow\; A^\top X = \lambda^{-1} X.
\end{aligned}$$
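(As a quick sanity check, a standard example rather than part of the proof: for the rotation matrix

$$A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},$$

the eigenvalues are $e^{i\theta}$ and $e^{-i\theta}$, and each is the inverse of the other, consistent with the claim.)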

This shows that $\lambda^{-1}$ is an eigenvalue of $A^\top$, but I'm having a hard time proving that $\lambda^{-1}$ is also an eigenvalue of $A$ itself…

Best Answer

The eigenvalues of a matrix are the same as the eigenvalues of its transpose. (Look at the definition of the characteristic polynomial and note that determinants are invariant under transposes.)
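To make the hint explicit: since $\det(M^\top) = \det(M)$ for any square matrix $M$,

$$\det(A^\top - \lambda I) = \det\big((A - \lambda I)^\top\big) = \det(A - \lambda I),$$

so $A$ and $A^\top$ have the same characteristic polynomial and hence the same eigenvalues. Combined with your computation $A^\top X = \lambda^{-1} X$, which shows $\lambda^{-1}$ is an eigenvalue of $A^\top$, this gives that $\lambda^{-1}$ is an eigenvalue of $A$ as well.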