Eigenvalues of the sum of symmetric and anti-symmetric matrices

I'd like to get a rough estimate of the eigenvalues of a structured matrix $A$ which is the sum of a symmetric and an anti-symmetric (or skew-symmetric) matrix, i.e., $A = M + N$ where $M = M^T$ and $N = -N^T$.

In particular, I'm interested in the case where $M$ is block-diagonal in the sense that $M = \begin{bmatrix} M_1 & \\ &M_2\end{bmatrix}$ and $N = \begin{bmatrix} & N_1 \\ -N_1& \end{bmatrix}$. Further, I assume $M_1, M_2$ are symmetric positive-definite with smallest eigenvalues $\mu_1, \mu_2$ respectively, and that $N_1$ has largest eigenvalue $L$ (one example is $A = \begin{bmatrix} 2 & 5 \\ -5 & 6\end{bmatrix}$, with $M_1 = 2$, $M_2 = 6$, $N_1 = 5$). In this case, it is known that all eigenvalues $\lambda_i$ can be written in the form $\lambda_i = a_i + b_i j$, with $j$ the imaginary unit, but what is the range of $a_i$ and $b_i$?
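For concreteness, here is a quick numerical look at that example (an illustrative sketch with numpy):

```python
import numpy as np

# The 2x2 example above: M = diag(2, 6) and N_1 = 5,
# so mu_1 = 2, mu_2 = 6, and L = 5.
A = np.array([[2.0, 5.0],
              [-5.0, 6.0]])
print(np.linalg.eigvals(A))  # ~ 4 +/- 4.583j, i.e. a = 4, b = +/-4.583
```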

My guess: $a_i \geq \min\{\mu_1, \mu_2\}$ and $-L \leq b_i \leq L$. If so, is the lower bound on $a_i$ sharp?
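One can stress-test this conjecture numerically before proving it. Below is a minimal sketch, under the assumptions that $N_1$ is symmetric and that $L$ is read as the largest eigenvalue of $N_1$ in magnitude:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
for _ in range(1000):
    X1 = rng.standard_normal((n, n))
    X2 = rng.standard_normal((n, n))
    Y = rng.standard_normal((n, n))
    M1 = X1 @ X1.T + 1e-3 * np.eye(n)   # random symmetric positive-definite
    M2 = X2 @ X2.T + 1e-3 * np.eye(n)
    N1 = (Y + Y.T) / 2                  # random symmetric off-diagonal block
    A = np.block([[M1, N1], [-N1, M2]])
    mu = min(np.linalg.eigvalsh(M1)[0], np.linalg.eigvalsh(M2)[0])
    L = np.abs(np.linalg.eigvalsh(N1)).max()
    lam = np.linalg.eigvals(A)
    assert lam.real.min() >= mu - 1e-8          # a_i >= min(mu_1, mu_2)
    assert np.abs(lam.imag).max() <= L + 1e-8   # -L <= b_i <= L
```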

Best Answer

As is shown here, if $B + B^*$ is positive semidefinite, then the eigenvalues of $B$ must have non-negative real part.
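As a sanity check of this fact, here is a numerical sketch: $B$ is built as the sum of a symmetric positive-semidefinite part and a skew-symmetric part, so $B + B^*$ is positive semidefinite by construction.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))
P = X @ X.T            # symmetric positive semidefinite
S = (Y - Y.T) / 2      # skew-symmetric, so B + B^T = 2P is PSD
B = P + S
print(np.linalg.eigvals(B).real.min())  # non-negative, up to roundoff
```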

With that established, let $\mu = \min\{\mu_1,\mu_2\}$ and set $B = A - \mu I$. Then $B + B^* = 2(M - \mu I)$ is positive semidefinite, so the eigenvalues of $B$ have non-negative real part. These eigenvalues are given by $$ \lambda_i(B) = (a_i - \mu) + jb_i, $$ which yields $a_i \geq \mu$ for all $i$.

Similarly, setting $B = \pm jA + L I$, we find that $$ B + B^* = \pm 2j\pmatrix{0&N_1\\-N_1 & 0} + 2 L I. $$ Verify that the eigenvalues of $$ j\pmatrix{0&N_1\\-N_1 & 0} $$ are given by $\pm \lambda_i(N_1)$, so that both choices of sign make $B + B^*$ positive semidefinite. Applying the same logic as before, the eigenvalues of $B$, namely $(L \mp b_i) \pm j a_i$, have non-negative real part, and we conclude that $-L \leq b_i \leq L$ for all $i$, as you conjectured.
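The "verify" step is also easy to check numerically (a sketch assuming $N_1$ symmetric, which makes $j\pmatrix{0&N_1\\-N_1&0}$ Hermitian):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
Y = rng.standard_normal((n, n))
N1 = (Y + Y.T) / 2                      # symmetric, so 1j*K is Hermitian
Z = np.zeros((n, n))
K = np.block([[Z, N1], [-N1, Z]])
ev_jK = np.sort(np.linalg.eigvalsh(1j * K))
ev_N1 = np.linalg.eigvalsh(N1)
# The spectrum of j*K is exactly {+lambda_i(N_1)} U {-lambda_i(N_1)}.
print(np.allclose(ev_jK, np.sort(np.concatenate([ev_N1, -ev_N1]))))  # True
```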

As for the sharpness of these bounds: it suffices to consider the case $M_1 = M_2 = N_1 = I$. Then $\mu = L = 1$ and $A = \pmatrix{I & I\\ -I & I}$ has eigenvalues $1 \pm j$, so the lower bound $a_i \geq \mu$ and both bounds $-L \leq b_i \leq L$ are attained.
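With $1 \times 1$ blocks, a one-line numerical confirmation:

```python
import numpy as np

# M_1 = M_2 = N_1 = 1, so mu = L = 1; eigenvalues 1 +/- 1j hit both bounds.
print(np.linalg.eigvals(np.array([[1.0, 1.0], [-1.0, 1.0]])))
```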