Linear Algebra – How to Show that det(AB) = det(A) det(B)

Tags: determinant, linear algebra, matrices

Given two square matrices $A$ and $B$, how do you show that $$\det(AB) = \det(A)\det(B)$$ where $\det(\cdot)$ is the determinant of the matrix?

Best Answer

Let's consider the function $B\mapsto \det(AB)$ as a function of the columns of $B=\left(v_1|\cdots |v_i| \cdots | v_n\right)$. It is straightforward to verify that this map is multilinear, in the sense that $$\det\left(A\left(v_1|\cdots |v_i+av_i'| \cdots | v_n\right)\right)=\det\left(A\left(v_1|\cdots |v_i| \cdots | v_n\right)\right)+a\det\left(A\left(v_1|\cdots |v_i'| \cdots | v_n\right)\right).$$ It is also alternating, in the sense that swapping two columns of $B$ multiplies the overall result by $-1$. Both properties follow directly from the corresponding properties of the function $A\mapsto \det(A)$, because multiplication by $A$ acts column by column, as spelled out below.
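To make that last remark explicit, the key identity is $$A\left(v_1|\cdots |v_i| \cdots | v_n\right)=\left(Av_1|\cdots |Av_i| \cdots | Av_n\right),$$ so that, for instance, $$A\left(v_1|\cdots |v_i+av_i'| \cdots | v_n\right)=\left(Av_1|\cdots |Av_i+aAv_i'| \cdots | Av_n\right),$$ and multilinearity of $B\mapsto\det(AB)$ in the $i$-th column is just multilinearity of $\det$ in the column $Av_i$.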

The determinant is completely characterized by these two properties together with the normalization $\det(I)=1$: any function satisfying the two properties must be a scalar multiple of the determinant. If you have not seen this fact, you should try to prove it. I don't know of a reference online, but it is contained in Bretscher's linear algebra book; a proof is also included below for completeness.

In any case, because of this fact, we must have $\det(AB)=c\det(B)$ for some constant $c$ depending only on $A$, and setting $B=I$ shows that $c=\det(A)$.
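This completes the main argument. As a quick numerical sanity check (not a proof; just a minimal sketch using numpy with random matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the check is reproducible
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# det(AB) and det(A) * det(B) should agree up to floating-point error.
lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)
assert np.isclose(lhs, rhs), (lhs, rhs)
print(lhs, rhs)
```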


For completeness, here is a proof of the necessary lemma that any multilinear, alternating function is a multiple of the determinant.

We will let $f:(\mathbb F^n)^n\to \mathbb F$ be a multilinear, alternating function, where, to allow this proof to work in characteristic 2, we say a multilinear function is alternating if it is zero whenever two of its inputs are equal (outside of characteristic 2, this is equivalent to picking up a sign when you swap two inputs). Let $e_1, \ldots, e_n$ be the standard basis vectors. Then $f(e_{i_1},e_{i_2}, \ldots, e_{i_n})=0$ if any index occurs twice, and otherwise, if $\sigma\in S_n$ is a permutation, then $f(e_{\sigma(1)}, e_{\sigma(2)},\ldots, e_{\sigma(n)})=(-1)^\sigma f(e_1,\ldots,e_n)$, where $(-1)^\sigma$ denotes the sign of the permutation $\sigma$.
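To see the direction of that equivalence we actually use (zero on equal inputs implies the sign change under a swap), expand a suitable zero value by multilinearity: $$0=f(\ldots,u+v,\ldots,u+v,\ldots)=f(\ldots,u,\ldots,u,\ldots)+f(\ldots,u,\ldots,v,\ldots)+f(\ldots,v,\ldots,u,\ldots)+f(\ldots,v,\ldots,v,\ldots).$$ The first and last terms vanish, leaving $f(\ldots,u,\ldots,v,\ldots)=-f(\ldots,v,\ldots,u,\ldots)$. Writing $\sigma$ as a product of transpositions and applying this repeatedly gives the claimed formula $f(e_{\sigma(1)},\ldots,e_{\sigma(n)})=(-1)^\sigma f(e_1,\ldots,e_n)$.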

Using multilinearity, one can expand out evaluating $f$ on a collection of vectors written in terms of the basis:

$$f\left(\sum_{j_1=1}^n a_{1j_1}e_{j_1}, \sum_{j_2=1}^n a_{2j_2}e_{j_2},\ldots, \sum_{j_n=1}^n a_{nj_n}e_{j_n}\right) = \sum_{j_1=1}^n\sum_{j_2=1}^n\cdots \sum_{j_n=1}^n \left(\prod_{k=1}^n a_{kj_k}\right)f(e_{j_1},e_{j_2},\ldots, e_{j_n}).$$

All the terms with $j_{\ell}=j_{\ell'}$ for some $\ell\neq \ell'$ vanish, because the $f$ factor is zero, and the remaining terms can be written in terms of permutations: if the $j_\ell$ are pairwise distinct, then there is a unique permutation $\sigma\in S_n$ with $j_k=\sigma(k)$ for every $k$. This yields:

$$\begin{align}\sum_{j_1=1}^n\sum_{j_2=1}^n\cdots \sum_{j_n=1}^n \left(\prod_{k=1}^n a_{kj_k}\right)f(e_{j_1},e_{j_2},\ldots, e_{j_n}) &= \sum_{\sigma\in S_n} \left(\prod_{k=1}^n a_{k\sigma(k)}\right)f(e_{\sigma(1)},e_{\sigma(2)},\ldots, e_{\sigma(n)}) \\ &= \sum_{\sigma\in S_n} (-1)^{\sigma}\left(\prod_{k=1}^n a_{k\sigma(k)}\right)f(e_{1},e_{2},\ldots, e_{n}) \\ &= f(e_{1},e_{2},\ldots, e_{n}) \sum_{\sigma\in S_n} (-1)^{\sigma}\left(\prod_{k=1}^n a_{k\sigma(k)}\right). \end{align} $$
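As a worked instance of the whole computation, take $n=2$ and $v_k=\sum_j a_{kj}e_j$: $$f\left(a_{11}e_1+a_{12}e_2,\; a_{21}e_1+a_{22}e_2\right)=a_{11}a_{22}\,f(e_1,e_2)+a_{12}a_{21}\,f(e_2,e_1)=\left(a_{11}a_{22}-a_{12}a_{21}\right)f(e_1,e_2),$$ where the terms involving $f(e_1,e_1)$ and $f(e_2,e_2)$ have vanished; the coefficient is the familiar $2\times 2$ determinant.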

In the last line, the remaining sum is exactly the Leibniz formula for the determinant of the matrix $(a_{ij})$, although one does not need to recognize this fact: we have shown that $f$ is completely determined by the single value $f(e_1,\ldots, e_n)$, and we may simply define $\det$ to be such a function normalized by $\det(e_1,\ldots, e_n)=1$.
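To see the Leibniz formula in action, here is a direct (and deliberately naive, $O(n!\cdot n^2)$ as written) implementation checked against numpy's determinant; the helper name `leibniz_det` is my own invention for this sketch:

```python
import itertools
import math
import numpy as np

def leibniz_det(a):
    """Determinant via the Leibniz formula: the sum over all permutations
    sigma of sign(sigma) * prod_k a[k, sigma(k)]."""
    n = a.shape[0]
    total = 0.0
    for perm in itertools.permutations(range(n)):
        # The sign of perm is (-1)^(number of inversions).
        inversions = sum(
            1 for i in range(n) for j in range(i + 1, n) if perm[i] > perm[j]
        )
        sign = -1.0 if inversions % 2 else 1.0
        total += sign * math.prod(a[k, perm[k]] for k in range(n))
    return total

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
assert np.isclose(leibniz_det(M), np.linalg.det(M))
```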
