Linear Algebra – Determinant Identity: Detailed Explanation

Tags: determinant, linear algebra, matrices

Let $M$ be a (real) $n \times n$ matrix. For $1 \leq i, j \leq n$ we denote by $M_{ij}$ the $(n-1) \times (n-1)$ matrix that we get when the $i$th row and $j$th column of $M$ are removed. Now, consider fixed $i$ and $j$ with $i\neq j$. Let $N$ be the $(n-2) \times (n-2)$ matrix that we get when removing both the $i$th and $j$th row and the $i$th and $j$th column from $M$. Then the following identity holds:
$$
\det M \det N = \det M_{ii}\det M_{jj} - \det M_{ij} \det M_{ji}.
$$
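As a quick sanity check (not a proof), the identity can be verified numerically on a random matrix. The sketch below uses NumPy with 0-based indices; the helper `minor` and the chosen values of $n$, $i$, $j$ are my own illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, i, j = 5, 1, 3  # example size and distinct 0-based indices

M = rng.standard_normal((n, n))

def minor(A, rows, cols):
    """Return A with the given rows and columns deleted."""
    keep_r = [r for r in range(A.shape[0]) if r not in rows]
    keep_c = [c for c in range(A.shape[1]) if c not in cols]
    return A[np.ix_(keep_r, keep_c)]

# N removes rows i, j and columns i, j; M_ij removes row i and column j.
N = minor(M, {i, j}, {i, j})
lhs = np.linalg.det(M) * np.linalg.det(N)
rhs = (np.linalg.det(minor(M, {i}, {i})) * np.linalg.det(minor(M, {j}, {j}))
       - np.linalg.det(minor(M, {i}, {j})) * np.linalg.det(minor(M, {j}, {i})))

print(abs(lhs - rhs))  # agreement up to floating-point roundoff
```

Of course this only tests finitely many cases and says nothing about why the identity holds, which is the point of the question.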

We were able to prove this by comparing the terms that occur on both sides when each determinant is expanded as a sum of $n!$ terms (for instance, terms containing $a_{ij}a_{ji}$ appear exactly once on each side).

We are looking for a slick proof that does not involve writing out the determinant.

Any suggestions or approaches to this problem are appreciated!

Best Answer

Amazingly, this is exactly Lemma 3 in my paper "Almost all integer matrices have no integer eigenvalues" with Erick B. Wong (other than restricting to $i=1$, $j=2$, which generalizes trivially). There's a short proof there that seems to fit your needs. The bibliography also contains a reference to a book on curious determinant identities of this sort, which is where we got the proof in the first place.
