Matrix multiplication and determinant

Tags: determinant, linear-algebra, matrices

Let $A$ and $B$ be two matrices. The rows of the matrix $AB$ can be seen as linear combinations of the rows of $B$. Thus, if the determinant of a matrix is invariant under row operations, shouldn't $\det(AB) = \det(B)$?

I know it's $\det(AB) = \det(A) \det(B)$. Where is the fallacy in my argument? Thanks in advance.

Best Answer

$\det(B)$ is a measurement of the "size" of (the rows of) $B$ in a certain sense. You are correct that the rows of $AB$ are linear combinations of the rows of $B$, but linear combinations do not preserve size. For a slightly different example, consider the vectors $\{(1, 0), (0,1)\}$. They each have length $1$, but we certainly don't expect all linear combinations of them to have length $1$! For another example, $||\frac{1}{x^2+1}||_2=\sqrt{\pi/2}$ and $||\sin(x)/x||_2=\sqrt{\pi}$, but I can make $||a\frac{1}{x^2+1} + b\frac{\sin(x)}{x}||_2$ as large as I like by picking appropriate $a$ and $b$.
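To make this concrete, here is a small $2 \times 2$ example (the matrices are chosen purely for illustration; they are not from the question). Let $A$ scale the first row by $2$:

$$
A = \begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix}, \qquad
B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad
AB = \begin{pmatrix} 2 & 4 \\ 3 & 4 \end{pmatrix}.
$$

The rows of $AB$ are indeed linear combinations of the rows of $B$ (the first row is doubled, the second is unchanged), yet $\det(AB) = -4 = \det(A)\det(B)$ while $\det(B) = -2$. Scaling a row is a row operation, but not one that leaves the determinant alone.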

That said, there are special types of linear transformations that do preserve size. In the case of vectors in $\mathbb{R}^k$, these are the orthogonal transformations (rotations and reflections). In the case of the determinant, they are precisely multiplications by matrices of determinant $1$.
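For instance (again with matrices chosen only for illustration), multiplying by the shear $A' = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$, which has determinant $1$, amounts to adding the second row to the first, and that is exactly the kind of row operation that leaves the determinant unchanged. With the same $B$ as above:

$$
A'B = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 4 & 6 \\ 3 & 4 \end{pmatrix}, \qquad
\det(A'B) = 16 - 18 = -2 = \det(B).
$$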