Show that eigenvalues are symmetric with respect to the origin

Tags: eigenvalues-eigenvectors, matrices, symmetric matrices

The matrices I am considering are of the form

$$
M =
\begin{bmatrix}
A & B \\
C & -A^\top
\end{bmatrix},
$$

with $A,B,C\in\mathbb{C}^{n\times n}$, $C = C^\top$ and $B = B^\top$. I noticed from numerical calculations that the eigenvalues are symmetric with respect to the origin, i.e., if $\lambda$ is an eigenvalue of $M$ then $-\lambda$ is as well.
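The observed symmetry is easy to reproduce numerically. The sketch below (my own construction; the random instance, seed, and tolerance are illustrative assumptions, not from the question) builds a random $M$ of this form with NumPy and checks that $-\lambda$ appears in the spectrum for every eigenvalue $\lambda$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Random complex test matrices: A is arbitrary, while B and C are made
# complex symmetric (B = B^T, C = C^T; symmetric, not Hermitian).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = B + B.T
C = C + C.T

# M = [[A, B], [C, -A^T]]; note .T is the plain transpose, not the conjugate.
M = np.block([[A, B], [C, -A.T]])
ev = np.linalg.eigvals(M)

# For every eigenvalue lam, -lam should lie (numerically) in the spectrum.
ok = all(np.abs(ev + lam).min() < 1e-8 for lam in ev)
print(ok)  # → True
```

This only demonstrates the symmetry for random instances, of course; the proof below shows it always holds.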

I have not been able to show that this has to be the case. Initially I tried to find a similarity transformation which would make $M$ block diagonal, with one block the negative of the other, but without success. Next I tried to manipulate the determinant directly; namely, I showed that

\begin{align}
\det(M - \lambda\,I) &= \det(B) \det\left((-A^\top - \lambda\,I)\, B^{-1} (A - \lambda\,I) - C\right) \\
&= \det(C) \det\left((A - \lambda\,I)\, C^{-1} (-A^\top - \lambda\,I) - B\right)
\end{align}

assuming that $\det(B)\neq0$ or $\det(C)\neq0$, respectively. However, this did not seem to bring me any closer to showing that the eigenvalues are mirrored through the origin.
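For what it's worth, both Schur-complement identities do hold (they follow from a block row/column swap plus the standard Schur determinant formula, with the two resulting factors of $(-1)^n$ cancelling), and they can be spot-checked numerically. The sketch below is my own; the random instance and the test point $\lambda$ are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Same structure as in the question: A arbitrary, B and C complex symmetric.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = B + B.T
C = C + C.T

M = np.block([[A, B], [C, -A.T]])
I = np.eye(n)
lam = 0.7 + 0.3j  # arbitrary test point

lhs = np.linalg.det(M - lam * np.eye(2 * n))
rhs_B = np.linalg.det(B) * np.linalg.det(
    (-A.T - lam * I) @ np.linalg.inv(B) @ (A - lam * I) - C)
rhs_C = np.linalg.det(C) * np.linalg.det(
    (A - lam * I) @ np.linalg.inv(C) @ (-A.T - lam * I) - B)

print(np.isclose(lhs, rhs_B), np.isclose(lhs, rhs_C))  # → True True
```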

Best Answer

The matrix $M$ has exactly the structure of a Hamiltonian matrix. To show that $-\lambda$ is an eigenvalue whenever $\lambda$ is, the following identity can be used:

$$ \det(M - \lambda\,I) = \det(M^\top - \lambda\,I), $$

which holds because every square matrix has the same determinant as its transpose.

Define the matrix

$$ J = \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix}, $$

and note that $\det(J) = 1$, $J^2 = -I$ and $J^{-1} = J^\top = -J$. Using this matrix it can be shown that $J\,M^\top J = M$ (the second step uses $B = B^\top$ and $C = C^\top$), namely

\begin{align}
J\,M^\top J
&= \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix}
   \begin{bmatrix} A & B \\ C & -A^\top \end{bmatrix}^\top
   \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix} \\
&= \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix}
   \begin{bmatrix} A^\top & C \\ B & -A \end{bmatrix}
   \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix} \\
&= \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix}
   \begin{bmatrix} C & -A^\top \\ -A & -B \end{bmatrix} \\
&= \begin{bmatrix} A & B \\ C & -A^\top \end{bmatrix} = M
\end{align}
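Since $J$ only permutes and negates block entries, this identity is exact entrywise, and it is easy to confirm numerically along with the stated properties of $J$. The check below is my own sketch with an assumed random instance:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# A arbitrary complex, B and C complex symmetric, as in the question.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = B + B.T
C = C + C.T

M = np.block([[A, B], [C, -A.T]])
Z, I = np.zeros((n, n)), np.eye(n)
J = np.block([[Z, -I], [I, Z]])

print(np.allclose(J @ M.T @ J, M))         # J M^T J = M  → True
print(np.allclose(J @ J, -np.eye(2 * n)))  # J^2 = -I     → True
print(np.isclose(np.linalg.det(J), 1.0))   # det(J) = 1   → True
```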

Incorporating $J$ into the determinant identity from earlier allows us to show

\begin{align}
\det\left(J (M^\top - \lambda\,I) J\right)
&= \det(J)^2 \det\left(M^\top - \lambda\,I\right)
 = \det\left(M^\top - \lambda\,I\right), \\
\det\left(J (M^\top - \lambda\,I) J\right)
&= \det\left(J\,M^\top J - \lambda\,J^2\right)
 = \det\left(M + \lambda\,I\right),
\end{align}

using $J\,M^\top J = M$ and $J^2 = -I$ in the last step.

Therefore $\det\left(M - \lambda\,I\right) = \det\left(M + \lambda\,I\right)$ for every $\lambda$. So if $\lambda$ is an eigenvalue, then $\det\left(M - \lambda\,I\right) = 0$, hence $\det\left(M + \lambda\,I\right) = 0$ as well, and thus $-\lambda$ is also an eigenvalue.
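The concluding determinant identity can itself be spot-checked at an arbitrary $\lambda$. Again a sketch with an assumed random instance, not part of the original answer:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = B + B.T  # complex symmetric
C = C + C.T

M = np.block([[A, B], [C, -A.T]])
I2 = np.eye(2 * n)
lam = 1.2 - 0.5j  # arbitrary test value

det_minus = np.linalg.det(M - lam * I2)
det_plus = np.linalg.det(M + lam * I2)
print(np.isclose(det_minus, det_plus))  # → True
```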
