Linear Algebra – Determinant of 2×2 Block Matrix with Commuting Blocks

Tags: block-matrices, contest-math, determinant, linear-algebra, matrices

Let $A,B,C$ and $D$ be $n \times n$ matrices such that $AC = CA$. Prove that $$\det \begin{pmatrix} A & B\\ C & D\end{pmatrix} = \det(AD-CB)$$


The solution is to first assume that $A$ is invertible and then consider the product

$$\begin{pmatrix}
I & O\\
-CA^{-1} & I
\end{pmatrix}\begin{pmatrix}
A & B\\
C & D
\end{pmatrix}=\begin{pmatrix}
A & B\\
O & D-CA^{-1}B
\end{pmatrix}$$

From this it is not hard to prove that the claim is true when $A$ is invertible (a sketch of this step is given below). Finally, we use the fact that $GL_n$ forms a dense open subset of $M_n$ to get rid of the invertibility assumption. My question is: how does one come up with such a weird matrix as $\begin{pmatrix}
I & O\\
-CA^{-1} & I
\end{pmatrix}$? Thank you so much. Are there any other problems that use this technique of assuming invertibility? (One that I know of is the proof that $\det (I+AB) = \det(I+BA)$.) Thanks in advance.
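
For completeness, here is a sketch of the step alluded to above: the left factor has determinant $1$ and the right-hand side is block upper triangular, so taking determinants and using $AC = CA$ gives

$$\det \begin{pmatrix} A & B\\ C & D\end{pmatrix} = \det(A)\det\left(D-CA^{-1}B\right) = \det\left(AD-ACA^{-1}B\right) = \det\left(AD-CAA^{-1}B\right) = \det(AD-CB).$$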

Best Answer

As the others have pointed out, the block matrix originates from the elementary matrix in Gaussian elimination. The ultimate goal here is to transform the matrix in question into a block-triangular form. In general, though, such uses of block matrices can serve different goals. So, rather than explaining how to devise the block matrix in this specific case, I think it is more important to note that you can sometimes simplify your problem by using a transformation matrix in block form.
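
To see the origin concretely, compare with the scalar case $n=1$ (a toy computation): for $a\neq 0$, $$\pmatrix{1&0\\ -c/a&1}\pmatrix{a&b\\ c&d}=\pmatrix{a&b\\ 0&d-cb/a},$$ which is exactly the elementary row operation "subtract $c/a$ times the first row from the second" that Gaussian elimination uses to clear the entry below the pivot. The block matrix $\pmatrix{I&O\\ -CA^{-1}&I}$ is the same elementary matrix with the scalars replaced by $n\times n$ blocks and $A^{-1}$ playing the role of $1/a$.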

Your proof actually has three features that are quite common in various proofs:

  1. It tries to prove the assertion for invertible $A$ first.
  2. In the course of the proof, a block matrix is used to transform the problem into a simpler one.
  3. Then a continuity argument (invertible matrices are dense in $M_n(\mathbb{R})$ or $M_n(\mathbb{C})$) is used to extend the proof to general $A$.

I think it is worthwhile to put all three of these tricks in your toolbox:

  1. Continuity arguments with invertible matrices can be used in many cases. One typical scenario that immediately comes to mind is proving inequalities that involve positive semidefinite matrices. For a toy example, suppose we want to show that $\det(A+B)\ge\det(A)+\det(B)$, where $A$ and $B$ are positive semidefinite. If $A$ is invertible (hence positive definite), we can factor out $A^{1/2}$ on the left and right of both sides and get the equivalent but very easy inequality $\det(I+X)\ge1+\det(X)$, where $X=A^{-1/2}BA^{-1/2}$ is positive semidefinite (this step is sketched after the list).
  2. The use of block matrices to simplify or transform problems has many applications, such as the derivation of the Schur complement and the equality of the characteristic polynomials of $AB$ and $BA$ (the computation is sketched after the list). For more advanced uses from a higher perspective, see Branimir Ćaćić's comment on my answer to another question.
  3. Continuity arguments per se (which are not necessarily bound to invertible matrices) are very handy too. One example is the Cayley-Hamilton theorem. If $A$ is a complex square matrix that is diagonalisable as $A=SDS^{-1}$, where $D=\operatorname{diag}(\lambda_1,\ldots,\lambda_n)$, then its characteristic polynomial is equal to $p(x)=\prod_{k=1}^n(x-\lambda_k)$ and obviously, $p(A)=S\,p(D)\,S^{-1}=0$. Since diagonalisable matrices are dense in $M_n(\mathbb{C})$ and the coefficients of the characteristic polynomial of a matrix $A$ are continuous in the entries of $A$, the Cayley-Hamilton theorem holds for all complex matrices too.
  4. However, the major drawback of continuity arguments is that they do not work over all fields. In your case, consider the following proof by John Silvester (see this paper), which works over any field: you do not need to assume that $A$ is invertible, nor is a continuity argument necessary. First, note that $$ \pmatrix{I&0\\ -C&A+xI}\pmatrix{A+xI&B\\ C&D}=\pmatrix{A+xI&B\\ AC-CA&(A+xI)D-CB}. $$ By assumption, $AC-CA=0$, so the RHS is block-triangular and $$\det(A+xI)\,\det\pmatrix{A+xI&B\\ C&D}=\det(A+xI)\,\det\left((A+xI)D-CB\right).$$ As $\det(A+xI)$ is a nonzero polynomial in $x$, we can cancel it from both sides and obtain $\det\pmatrix{A+xI&B\\ C&D}=\det\left((A+xI)D-CB\right)$ as an identity of polynomials in $x$. Setting $x=0$, the assertion follows.
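
To spell out the toy example in item 1 (only a sketch): if $X$ is positive semidefinite with eigenvalues $\lambda_1,\ldots,\lambda_n\ge 0$, then $$\det(I+X)=\prod_{k=1}^n(1+\lambda_k)\ge 1+\prod_{k=1}^n\lambda_k=1+\det(X),$$ since expanding the product yields the terms $1$ and $\prod_k\lambda_k$ plus other nonnegative terms. Multiplying back by $\det(A)=\det(A^{1/2})^2$ gives $\det(A+B)\ge\det(A)+\det(B)$ for positive definite $A$, and replacing $A$ by $A+\varepsilon I$ and letting $\varepsilon\to 0^+$ handles the semidefinite case by continuity.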
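
For item 2, the block-matrix computation behind the $AB$/$BA$ statement can be sketched as follows: with $A$ of size $m\times n$ and $B$ of size $n\times m$, $$\pmatrix{I&-A\\ 0&I}\pmatrix{AB&0\\ B&0}\pmatrix{I&A\\ 0&I}=\pmatrix{0&0\\ B&BA}.$$ Since $\pmatrix{I&-A\\ 0&I}=\pmatrix{I&A\\ 0&I}^{-1}$, the two block-triangular matrices $\pmatrix{AB&0\\ B&0}$ and $\pmatrix{0&0\\ B&BA}$ are similar, and comparing their characteristic polynomials gives $x^n\det(xI_m-AB)=x^m\det(xI_n-BA)$. In particular, for square $A$ and $B$ the characteristic polynomials of $AB$ and $BA$ coincide.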
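
Finally, here is a quick symbolic sanity check of the identity itself, not a proof: a small SymPy sketch in which the choice $C = A^2 + 2A + I$ is just an arbitrary way of making $A$ and $C$ commute.

    # Sanity check (not a proof): det([[A, B], [C, D]]) == det(A*D - C*B)
    # whenever A*C == C*A.  The sizes and the choice C = A**2 + 2*A + I
    # are arbitrary assumptions made only to guarantee commutation.
    import sympy as sp

    n = 3
    A = sp.randMatrix(n, n, min=-3, max=3, seed=1)
    B = sp.randMatrix(n, n, min=-3, max=3, seed=2)
    D = sp.randMatrix(n, n, min=-3, max=3, seed=3)
    C = A**2 + 2*A + sp.eye(n)     # any polynomial in A commutes with A

    assert A*C == C*A              # the commutation hypothesis

    # Assemble the 2n x 2n block matrix [[A, B], [C, D]].
    M = A.row_join(B).col_join(C.row_join(D))
    print(M.det() == (A*D - C*B).det())   # expected output: True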