[Math] a block matrix proof about characteristic polynomials

characteristic-functions, eigenvalues-eigenvectors, induction, linear-algebra, proof-writing

If

$$
A =
\begin{bmatrix}
B & 0 \\
C & D
\end{bmatrix} \in \mathsf{M}_n,
$$

where $B \in \mathsf{M}_k$ and $D \in \mathsf{M}_{n-k}$, prove that $p_A = p_B p_D$. **Hint**: proceed by induction on $n$ and expand the determinant across the first row.

I have no idea what to do. All I know is that $p_A(t) = \det(tI_n - A)$, $p_B(t) = \det(tI_k - B)$, and $p_D(t) = \det(tI_{n-k} - D)$.

I also feel like you can prove this without induction by saying that $\det(A) = BC$,

but I also feel like that is totally incorrect.

What should I do? How do I prove this?

(If you have a better title, feel free to change it.)

How would induction even play into this?

Best Answer

This is really not about characteristic polynomials at all, just a fundamental property of determinants (over any commutative ring $R$; here we take $R$ to be the ring of polynomials in $t$ over your field), namely $$ \det\pmatrix{B&0\\C&D}=\det(B)\det(D). $$ You can apply this immediately to the characteristic polynomial, since transforming $A$ into $tI_n-A$ amounts to transforming $B$ into $tI_k-B$ and $D$ into $tI_{n-k}-D$ (while $C$ becomes $-C$).
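As a sanity check (not a proof), the factorization can be verified symbolically on a small example. Here is a sketch using SymPy with an arbitrary choice of $k=2$, $n=4$; the particular matrices are my own, chosen only for illustration:

```python
import sympy as sp

t = sp.symbols('t')

# Hypothetical example with k = 2, n = 4; the entries are arbitrary.
B = sp.Matrix([[1, 2], [3, 4]])
D = sp.Matrix([[5, 6], [7, 8]])
C = sp.Matrix([[9, 0], [1, 2]])
A = sp.Matrix(sp.BlockMatrix([[B, sp.zeros(2, 2)], [C, D]]))

# Characteristic polynomials p_X(t) = det(t*I - X).
pA = (t * sp.eye(4) - A).det()
pB = (t * sp.eye(2) - B).det()
pD = (t * sp.eye(2) - D).det()

print(sp.expand(pA - pB * pD))  # prints 0, i.e. p_A = p_B * p_D
```

Of course this only confirms the identity for one instance; the general statement needs the determinant argument above.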

That property of determinants is the subject of this other question, and in my opinion the best proof goes directly from the (Leibniz formula) definition of determinants, as I detailed in my answer to that question. In particular, I would want to avoid using a property like $\det(XY)=\det(X)\det(Y)$, which, although of course true, is actually quite a bit harder to prove directly from the definition.
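To sketch that Leibniz-formula argument (my own summary of the idea): write

$$
\det A=\sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\prod_{i=1}^{n}a_{i,\sigma(i)}.
$$

Since $a_{i,j}=0$ whenever $i\le k<j$, a permutation $\sigma$ contributes a nonzero term only if $\sigma(i)\le k$ for all $i\le k$, i.e. $\sigma(\{1,\dots,k\})=\{1,\dots,k\}$. Such a $\sigma$ splits as a pair $(\tau,\rho)$ with $\tau\in S_k$ permuting the first $k$ indices and $\rho$ permuting the last $n-k$, and $\operatorname{sgn}(\sigma)=\operatorname{sgn}(\tau)\operatorname{sgn}(\rho)$. The sum then factors as $\det(B)\det(D)$.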