[Math] How do row operations affect $\det(U)$

Tags: determinant, linear-algebra, matrices

'We can do row operations without changing $\det(A)$' – A quote from Introduction to Linear Algebra by G. Strang

But let's say I have an arbitrary upper triangular matrix $U$

$$U = \begin{bmatrix}
a & a & a \\
0 & b & b \\
0 & 0 & c \\
\end{bmatrix}$$

And I perform the following row operations on $U$ to bring it to $U'$

$\frac{1}{a}R_1 \rightarrow R_1$

$\frac{1}{b}R_2 \rightarrow R_2$

$\frac{1}{c}R_3 \rightarrow R_3$

Then $U'$ is:

$$U' = \begin{bmatrix}
1 & 1 & 1 \\
0 & 1 & 1 \\
0 & 0 & 1 \\
\end{bmatrix}$$

But now $\det(U) = abc$ and $\det(U') = 1$, thus $$\det(U) \neq \det(U')$$
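A quick numeric sketch of this with NumPy, using arbitrary values $a = 2$, $b = 3$, $c = 4$ (my choice, not from the question):

```python
import numpy as np

a, b, c = 2.0, 3.0, 4.0

# Upper triangular U with the pattern from the question
U = np.array([[a, a, a],
              [0, b, b],
              [0, 0, c]])

# Scale each row by the reciprocal of its pivot: R1/a, R2/b, R3/c
U_prime = np.array([1/a, 1/b, 1/c]).reshape(3, 1) * U

print(np.linalg.det(U))        # ≈ abc = 24
print(np.linalg.det(U_prime))  # ≈ 1
```

The two determinants indeed differ by the factor $abc$.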

All I've done is perform row operations on $U$ to bring it to $U'$, yet by performing those row operations, the determinants are no longer equal. How is that possible?


So how is this seeming contradiction resolved? I'm assuming I must have a misconception about either row operations or determinants.

Furthermore, on a deeper level: what geometric interpretation does scaling the rows, as I've done in bringing $U$ to $U'$, have for the determinant? Since the determinants of $U$ and $U'$ are obviously no longer equal, what is this scaling doing to the determinant geometrically?

Best Answer

The determinant is an alternating multilinear function on the rows (or the columns, as you wish) of the matrix. What interests us here is the multilinearity.

So here, $\det(U) = \det(R_1, R_2, R_3)$. The matrix $U'$ obtained as you propose has determinant $\det(\frac{R_1}a, \frac{R_2}b, \frac{R_3}c)$, i.e., by multilinearity, $$\begin{align*}\det(U') &= \frac 1a \det(R_1, \frac{R_2}b, \frac{R_3}c)\\ &= \frac 1{ab} \det(R_1, R_2, \frac{R_3}c)\\ &= \frac 1{abc} \det(R_1, R_2, R_3)\\ &= \frac 1{abc} \det(U)\end{align*}$$
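This factor-by-factor pull-out can be checked numerically. A minimal sketch with NumPy, using random rows and arbitrary scalars of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
R1, R2, R3 = rng.standard_normal((3, 3))  # three random row vectors

a, b, c = 2.0, 3.0, 5.0
M = np.array([R1, R2, R3])
M_scaled = np.array([R1/a, R2/b, R3/c])

# Multilinearity: scaling the rows by 1/a, 1/b, 1/c divides det by abc
print(np.linalg.det(M_scaled))
print(np.linalg.det(M) / (a * b * c))  # same value
```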

As @quid mentions in his comment, the row operations Strang is referring to add a multiple of one row to another row. That particular operation does not affect the determinant, precisely because of the determinant's alternating and multilinear behaviour:

$\det(R_1, R_2 + a R_1, R_3) = \det(R_1, R_2, R_3) + a\det(R_1, R_1, R_3)$ by linearity, and since the determinant is alternating, $\det(R_1, R_1, R_3) = 0$, so $\det(R_1, R_2 + a R_1, R_3) = \det(R_1, R_2, R_3)$.
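The same identity can be verified numerically. A small sketch (random matrix and multiplier chosen by me for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))

M2 = M.copy()
M2[1] += 7.0 * M2[0]  # R2 -> R2 + 7*R1: add a multiple of one row to another

# This row operation leaves the determinant unchanged
print(np.linalg.det(M))
print(np.linalg.det(M2))  # same value
```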


Edit: Geometrically speaking, the determinant is (up to sign) the hypervolume of the $n$-dimensional parallelepiped generated by the rows of your matrix, taken as vectors. So, if your matrix is not invertible, i.e. its rows are linearly dependent, then the rows span a flat parallelepiped of volume $0$, and so on.
For instance, for a $2\times2$ matrix, the determinant is the (signed) area of the parallelogram generated by your two rows. If these rows are linearly dependent, you can see that the parallelogram is flat, so it has area $0$.

Dividing a row by $a$ divides one of the side lengths of your parallelepiped by $a$, hence divides its volume by $a$.
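A two-dimensional numeric sketch of this geometric picture (the vectors are my own choice):

```python
import numpy as np

# Parallelogram spanned by rows (3, 0) and (1, 2): base 3, height 2, area 6
M = np.array([[3.0, 0.0],
              [1.0, 2.0]])
print(np.linalg.det(M))  # ≈ 6

# Dividing the first row by 3 shrinks that side, so the area drops to 2
M[0] /= 3.0
print(np.linalg.det(M))  # ≈ 2
```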