[Math] Proof that $\det(AB) = \det(A) \det(B)$

determinant, matrices, proof-verification

I have been working on this proof and, after some very helpful comments from people on here, I think I may understand it. I was hoping someone could look this over. I'll try to explain each step in as much depth as I can. I apologize for rookie mistakes: I'm still quite new to this subject.

$\textbf{Theorem.}$ For $n \times n$ matrices $A$ and $B$, $\det(AB) = \det(A) \cdot \det(B)$.

$\textbf{Proof.}$ Let $C = AB$. An arbitrary column $\vec{c}_i$ of $C$ takes the form
\begin{align*}
\vec{c}_i & = A \vec{b}_i & & \text{Definition, $C = AB$, and matrix multiplication} \\
& = A \sum_{k=1}^n b_{k,i} \vec{e}_k & & \text{Write $\vec{b}_i$ as a linear combination of the standard basis vectors, with coefficients $b_{k,i}$} \\
& = \sum_{k=1}^n b_{k,i} A (\vec{e}_k) & & \text{Linearity of $A$} \\
& = \sum_{k=1}^n b_{k,i} \vec{a}_k & & \text{$A$ applied to the $k$th standard basis vector gives $\vec{a}_k$, the $k$th column of $A$}
\end{align*}
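To convince myself of this step, here is a quick $2 \times 2$ check with matrices I picked arbitrarily (not part of the proof itself):
$$
A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \quad
B = \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix}, \quad
\vec{c}_1 = A \vec{b}_1 = \begin{pmatrix} 19 \\ 43 \end{pmatrix}
= 5 \begin{pmatrix} 1 \\ 3 \end{pmatrix} + 7 \begin{pmatrix} 2 \\ 4 \end{pmatrix}
= b_{1,1} \vec{a}_1 + b_{2,1} \vec{a}_2 .
$$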
Thus each of the $n$ columns $\vec{c}_i$ of $C$ has this form. In other words, we have
$$
\det(C) = \det (\vec{c}_1, \vec{c}_2, \vec{c}_3, \ldots, \vec{c}_n)
$$
or, with somewhat messier notation,
\begin{align*}
\det(C) = \det \Bigg(\sum_{n_1=1}^n b_{n_1, 1} \vec{a}_{n_1} , \sum_{n_2=1}^n b_{n_2, 2} \vec{a}_{n_2}, \sum_{n_3=1}^n b_{n_3, 3} \vec{a}_{n_3}, \ldots, \sum\limits_{n_n=1}^n b_{n_n, n} \vec{a}_{n_n}\Bigg)
\end{align*}
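For concreteness, when $n = 2$ this expression is just
$$
\det(C) = \det\big( b_{1,1} \vec{a}_1 + b_{2,1} \vec{a}_2 ,\; b_{1,2} \vec{a}_1 + b_{2,2} \vec{a}_2 \big).
$$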
Next, we can draw on the multilinearity property of the determinant, which tells us that the determinant is linear in each column of its argument (with the other columns held fixed). This means that if one column, say an arbitrary column $\vec{a}_i$ of a matrix $A$, is a linear combination of other vectors, say $\vec{a}_i = \alpha \vec{b} + \beta \vec{c}$, we can write
\begin{align*}
\det(A) = \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{a}_i, \ldots \vec{a}_n) & = \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \alpha \vec{b} + \beta \vec{c}, \ldots \vec{a}_n) \\
& = \alpha \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{b}, \ldots \vec{a}_n) + \beta \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{c}, \ldots \vec{a}_n)
\end{align*}
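As a quick numerical sanity check of this property (with numbers I chose just for illustration), take $\alpha = 2$, $\beta = 5$, $\vec{b} = \vec{e}_1$, $\vec{c} = \vec{e}_2$, and second column $(3, 4)^T$:
$$
\det \begin{pmatrix} 2 & 3 \\ 5 & 4 \end{pmatrix}
= 2 \det \begin{pmatrix} 1 & 3 \\ 0 & 4 \end{pmatrix}
+ 5 \det \begin{pmatrix} 0 & 3 \\ 1 & 4 \end{pmatrix}
= 2(4) + 5(-3) = -7,
$$
which matches computing the left-hand side directly: $2 \cdot 4 - 3 \cdot 5 = -7$.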
This then allows us to pull out our coefficients, of the form $b_{n_k, k}$, summed over $n_1, n_2, n_3, \ldots, n_n$. This gives us, after factoring out $\det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{a}_n) = \det(A)$:
\begin{align*}
\det(C) = \sum\limits_{n_1, n_2, n_3, n_4, \ldots, n_n} b_{n_1, 1} b_{n_2, 2} \cdots b_{n_n, n} \cdot \det(\vec{a}_1, \vec{a}_2, \vec{a}_3, \ldots, \vec{a}_n) & = \sum\limits_{n_1, n_2, n_3, n_4, \ldots, n_n} b_{n_1, 1} b_{n_2, 2} \cdots b_{n_n, n} \cdot \det(A)
\end{align*}
Let's now denote $\sum\limits_{n_1, n_2, n_3, n_4, \ldots, n_n} b_{n_1, 1} b_{n_2, 2} \cdots b_{n_n, n}$, which does not depend on $A$, by $f(B)$, and thus we have
$$
\det(C) = \det(AB) = f(B) \cdot \det(A)
$$
Since $f(B)$ does not depend on $A$, we are free to choose $A$ however we like. Take $A = I$, in which case $\det(A) = \det(I) = 1$, and also $\det(C) = \det(AB) = \det(IB) = \det(B)$. We thus get
$$
\det(B) = f(B) \cdot 1 = f(B)
$$
Thus $\det(B) = f(B)$, and so $\det(C) = f(B) \cdot \det(A) \implies \det(C) = \det(B) = \det(A)$, and thus $\det(AB) = \det(A) \cdot \det(B)$, as desired.

How does this proof look? My apologies for the rather messy notation.

Best Answer

First: There is a typo at the end: $\det(C) = \det(B) = \det(A)$ is incorrect; presumably you meant $\det(C) = \det(B) \cdot \det(A)$.

Second: Perhaps the indices $k_1, \dots, k_n$ would be better than $n_1, \dots, n_n$.

Third: The statement $f(B) = \det(B)$ cannot be correct. For instance, consider the matrix $\begin{pmatrix} 1 & 1 \\ 1 & 1\end{pmatrix}$, whose determinant is $0$, but $f(B) > 0$. There are signs missing in the expression for $f(B)$.
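Writing your $f$ out for this $B$:
$$
f(B) = \sum_{n_1, n_2} b_{n_1, 1} b_{n_2, 2} = 1 \cdot 1 + 1 \cdot 1 + 1 \cdot 1 + 1 \cdot 1 = 4,
$$
while $\det(B) = 1 \cdot 1 - 1 \cdot 1 = 0$.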

Fourth: The step where this goes wrong is when you pull out the $b$'s: you ended up with $\det(\vec a_1, \dots, \vec a_n)$, but you should have gotten $\det(\vec a_{n_1}, \dots, \vec a_{n_n})$. This is why there are signs missing in your function $f(B)$.
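To see where the signs come from, here is the $2 \times 2$ case written out correctly. Terms with $n_1 = n_2$ have a repeated column and vanish, and swapping two columns flips the sign, so
$$
\det(C) = \sum_{n_1, n_2} b_{n_1, 1} b_{n_2, 2} \det(\vec{a}_{n_1}, \vec{a}_{n_2})
= b_{1,1} b_{2,2} \det(\vec{a}_1, \vec{a}_2) + b_{2,1} b_{1,2} \det(\vec{a}_2, \vec{a}_1)
= (b_{1,1} b_{2,2} - b_{2,1} b_{1,2}) \det(A) = \det(B) \det(A).
$$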
