Intuitively, why is multiplying the elements of one row of a matrix by the corresponding cofactors of another row equivalent to a determinant with two identical rows?

linear algebra

This property was listed in my mathematics textbook, with a very unintuitive and hard-to-understand proof:

If elements of a row (or column) are multiplied with cofactors of any other row (or column), then their sum is zero.

So far I have tried writing down matrices and testing this property out, and I have read and researched various answers about it, yet I am still not able to understand it intuitively.

Several answers around the Internet state that multiplying a row by the corresponding cofactors of ANOTHER row is identical to calculating the determinant of a matrix containing two identical rows. This is precisely the part I do not understand.

Why, intuitively, is it equivalent to the determinant of a matrix containing two identical rows? I can verify that the property holds if I test it on any matrix on paper, and that it is indeed equivalent to calculating the determinant of a matrix with two identical rows, but I cannot understand intuitively WHY it is so.

Is there a way to understand intuitively why this is the case?

Best Answer

Let's consider the matrix:

$$A = \begin{bmatrix}a & b & c\\d & e & f\\g & h & i\end{bmatrix}$$

The cofactors along the first row are:

$$C_{1,1} = \begin{vmatrix}e & f\\h & i\end{vmatrix}$$ $$C_{1,2} = -\begin{vmatrix}d & f\\g & i\end{vmatrix}$$ $$C_{1,3} = \begin{vmatrix}d & e\\g & h\end{vmatrix}$$

And we have that:

$$\det(A) = aC_{1,1} + bC_{1,2} + cC_{1,3}$$

But now consider the expression:

$$dC_{1,1} + eC_{1,2} + fC_{1,3}$$

That would be the cofactor expansion along the first row of the matrix: $$B = \begin{bmatrix}d & e & f\\d & e & f\\g & h & i\end{bmatrix}$$

That is because $B$ has the same cofactors along the first row as $A$.

But since $B$ has two identical rows, we know that its determinant is zero, so:

$$\det(B) = dC_{1,1} + eC_{1,2} + fC_{1,3} = 0$$
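
As a concrete check, take a numerical example of my own (it is not from the original question):

$$A = \begin{bmatrix}1 & 2 & 3\\4 & 5 & 6\\7 & 8 & 10\end{bmatrix}, \qquad C_{1,1} = \begin{vmatrix}5 & 6\\8 & 10\end{vmatrix} = 2, \quad C_{1,2} = -\begin{vmatrix}4 & 6\\7 & 10\end{vmatrix} = 2, \quad C_{1,3} = \begin{vmatrix}4 & 5\\7 & 8\end{vmatrix} = -3$$

Expanding along the first row gives $\det(A) = 1 \cdot 2 + 2 \cdot 2 + 3 \cdot (-3) = -3$, while pairing the second row with those same cofactors gives $4 \cdot 2 + 5 \cdot 2 + 6 \cdot (-3) = 8 + 10 - 18 = 0$, exactly as the argument predicts.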

I think the key idea is that, since the cofactors computed along a row (or column) do not use the values of that row (or column), replacing that row (or column) does not change the cofactors computed along it.

That is why the linear combination $$\sum_{j=1}^{n} a_{k,j} C_{i,j}$$ is the same as the determinant of the matrix obtained from $A$ by replacing row $i$ with row $k$.
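
In other words, combining this with the ordinary cofactor expansion, both cases can be written in one line (a compact restatement of the above; the Kronecker delta $\delta_{ik}$ notation is my addition, not the textbook's):

$$\sum_{j=1}^{n} a_{k,j} C_{i,j} = \delta_{ik} \det(A) = \begin{cases}\det(A) & \text{if } k = i\\0 & \text{if } k \neq i\end{cases}$$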

EDIT: adding additional details.

The question was about why it is true that:

"If elements of a row (or column) are multiplied with cofactors of any other row (or column), then their sum is zero."

To show why that is true, I first chose a row for the cofactors (in the example above, row 1), then a different row to multiply them by (in the example above, row 2).

Then I showed that the resulting sum of products is the same as the determinant of a different matrix (matrix $B$) in which row 1 is replaced by row 2.

So $B$ was not some random matrix. It was a matrix determined by our choices of

  1. the row (or column) along which to compute the cofactors, which I'll call $i$, and
  2. the other row (or column) whose entries we want to multiply those cofactors by, which I'll call $k$

Once we have chosen those two rows (or columns), we can see that multiplying the entries of row (or column) $k$ by the corresponding cofactors of row (or column) $i$, and summing, is the same as computing the cofactor expansion of a different matrix, in which row (or column) $i$ is replaced by row (or column) $k$, so that the new matrix has two copies of that row (or column): one in position $i$ and another in position $k$.

Hence, that matrix has determinant zero, and so the sum is zero.
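
If you would like to convince yourself numerically, here is a minimal sketch (my own illustration with numpy; `cofactor` is a helper defined just for this example) that checks both cases of the identity on a random matrix:

```python
import numpy as np

def cofactor(A, i, j):
    # (i, j) cofactor: signed determinant of the minor obtained by
    # deleting row i and column j.
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)
n = A.shape[0]

for i in range(n):          # row whose cofactors we compute
    for k in range(n):      # row whose entries we pair with them
        s = sum(A[k, j] * cofactor(A, i, j) for j in range(n))
        # The sum should be det(A) when k == i, and 0 when k != i.
        expected = np.linalg.det(A) if k == i else 0.0
        assert np.isclose(s, expected)

print("verified: matching rows give det(A), other rows give 0")
```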