Eigenvalues of a matrix containing an unknown matrix

determinant, eigenvalues-eigenvectors, linear-algebra, matrices

How to find eigenvalues of a symmetric matrix
$$B = \begin{bmatrix}
2\mathrm{I_{m}} & A^\intercal \\\\ A & 0
\end{bmatrix}$$

without knowing anything about $A$ besides that $A \in \mathbb{R}^{n \times m}$ ?

Finding the determinant of
$$B - \lambda \mathrm{I} = \begin{bmatrix}
(2-\lambda) \mathrm{I_{m}} & A^\intercal \\\\
A & -\lambda \mathrm{I_n}
\end{bmatrix}$$

analytically seems infeasible, as it is an $( n+m ) \times ( n+m )$ matrix with possibly large $n+m$. Is this correct, or is there something special that would make it doable? I tried to proceed by treating the blocks as scalars, i.e. applying the Leibniz formula anyway, which gave $(\lambda^2 - 2\lambda)\mathrm{I_m}=A^{\intercal}A$, but I'm pretty sure this approach is wrong.

Can you think of any other way to obtain the eigenvalues of $B$, without finding the determinant ?

edit: $B$ may or may not be invertible (I guess it depends on $A$); both cases are relevant for me.

Best Answer

Suppose $\begin{bmatrix} v_1 \\ v_2 \end{bmatrix}$ is an eigenvector with eigenvalue $\lambda$. Then \begin{align} 2 v_1 + A^T v_2 &= \lambda v_1 \\ A v_1 &= \lambda v_2 \end{align} Multiplying the first equation by $\lambda$ and substituting $\lambda v_2 = A v_1$ gives $$ A^T A v_1 = \lambda ^2 v_1 - 2 \lambda v_1 .$$ If $v_1 \neq 0$, then $\lambda^2 - 2\lambda$ is an eigenvalue of $A^T A$; if $v_1 = 0$, the equations force $\lambda = 0$ with $v_2 \in \ker A^T$. So computing the eigenvalues of $B$ boils down to computing the eigenvalues of $A^T A$, and this is equivalent to finding the singular values of $A$. Thus there is no shortcut.
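As a quick numerical sanity check (a sketch with NumPy, not part of the argument above): solving $\lambda^2 - 2\lambda = \sigma^2$ for each singular value $\sigma$ of $A$ gives $\lambda = 1 \pm \sqrt{1 + \sigma^2}$, and when $A$ has full column rank the remaining $n - m$ eigenvalues are $0$, coming from $\ker A^T$. The sizes $n = 5$, $m = 3$ and the random $A$ below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
A = rng.standard_normal((n, m))  # generically full column rank

# Assemble B = [[2 I_m, A^T], [A, 0]]
B = np.block([[2 * np.eye(m), A.T],
              [A, np.zeros((n, n))]])

eigs = np.linalg.eigvalsh(B)                      # B is symmetric
sigma = np.linalg.svd(A, compute_uv=False)        # singular values of A

# Predicted eigenvalues: 1 ± sqrt(1 + sigma^2) for each singular value,
# plus n - m zeros from ker(A^T).
candidates = np.concatenate([1 + np.sqrt(1 + sigma**2),
                             1 - np.sqrt(1 + sigma**2),
                             np.zeros(n - m)])

assert np.allclose(np.sort(eigs), np.sort(candidates))
```

The check confirms that every eigenvalue of $B$ is determined by a singular value of $A$ (or by the kernel of $A^T$), matching the answer's conclusion.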
