[Math] Proof that linearly dependent rows/columns of $A \implies \det(A) = 0$

Tags: determinant, linear algebra, matrices, proof-explanation

I have seen, in a few textbooks, proofs that $\det(A) = 0$ if the rows/columns of $A$ are linearly dependent, derived from the properties of the determinant, but I have yet to see a proof of $\det(A) = 0$ directly from the definition of the determinant: $$\det(A) = \sum \text{sgn}(P)\,a_{1\alpha}a_{2\beta}\cdots a_{n\omega}$$

where $(\alpha, \beta, \ldots, \omega)$ ranges over the permutations of the column indices of $A$. And since all the properties of $\det$ follow from its definition, I find proving that $\det(A) = 0$ if the rows/columns of $A$ are linearly dependent from the properties of the determinant somewhat unsatisfactory.

Can someone provide a proof, from the Leibniz expansion definition of the determinant given above, of the result that $\det(A) = 0$ if the rows/columns of $A$ are linearly dependent?

Best Answer

Thanks to the comments below, this hint has been expanded to a complete solution.

Step 1.

Proposition. Suppose $A$ is a square matrix and $B$ is the matrix defined by interchanging two columns of $A$. Then $$\det A=-\det B.$$

Proof. Consider the definition. Each term of $\det B$ involves the same product of entries as a term of $\det A$, but the two corresponding permutations differ by the transposition of the two swapped columns, so their signs are opposite. Hence every term of $\det B$ is $(-1)$ times the corresponding term of $\det A$, and the whole of $\det B$ differs from $\det A$ by a factor of $-1$.
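For arbitrary $n$, the same argument can be spelled out precisely (a short sketch; $\tau$ is just my name for the transposition that exchanges the two swapped column indices, and $b_{rc}$ denotes the entries of $B$, so that $b_{rc}=a_{r\tau(c)}$):

$$\det B=\sum_{\sigma}(\text{sgn}\,\sigma)\,b_{1\sigma(1)}\cdots b_{n\sigma(n)}=\sum_{\sigma}(\text{sgn}\,\sigma)\,a_{1(\tau\sigma)(1)}\cdots a_{n(\tau\sigma)(n)}=\sum_{\pi}\big(\text{sgn}(\tau\pi)\big)\,a_{1\pi(1)}\cdots a_{n\pi(n)}=-\det A,$$

since the substitution $\pi=\tau\sigma$ runs over all permutations exactly once as $\sigma$ does, and $\text{sgn}(\tau\pi)=-\text{sgn}\,\pi$ because $\tau$ is a transposition.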

Step 2.

Proposition.

  1. If $A$ is a square matrix that has two equal columns, then $\det A=0$.
  2. If $A'$ is obtained from $A$ by multiplying one column by $\lambda$, then $\det A'=\lambda\det A.$

Proof.

  1. Interchanging the two equal columns of $A$ gives the original matrix $A$. Then from step 1 we have $$\det A=-\det A,$$ which implies that $\det A=0.$
  2. This follows directly from the definition; the one-line computation is spelled out below.
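To spell out the second part (a one-line computation, writing $A'$ for the matrix obtained from $A$ by multiplying one column by $\lambda$): each product in the Leibniz sum contains exactly one entry from every column, in particular exactly one entry from the scaled column, so each term acquires exactly one factor of $\lambda$, and

$$\det A'=\sum_{\sigma}(\text{sgn}\,\sigma)\,a'_{1\sigma(1)}\cdots a'_{n\sigma(n)}=\sum_{\sigma}(\text{sgn}\,\sigma)\,\lambda\,a_{1\sigma(1)}\cdots a_{n\sigma(n)}=\lambda\det A.$$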

Step 3.

Proposition. Let $A$ be a square matrix. Let $A'$ be obtained from $A$ by adding a multiple of one column to a different column. Then $\det A'=\det A$.

Proof. To convince oneself that the above is right, it helps to first try some explicit calculations. Let's do that for $n=2$ and $n=3$.

For $n=2$, let $A=\begin{pmatrix}x_1&y_1\\x_2&y_2\end{pmatrix}$, and let $$A'=\begin{pmatrix}x_1+\lambda y_1&y_1\\x_2+\lambda y_2&y_2\end{pmatrix}.$$

Now $$\begin{split}\det A'&=(x_1+\lambda y_1)y_2-y_1(x_2+\lambda y_2)\\ &=x_1y_2+\lambda y_1y_2-x_2y_1-\lambda y_1y_2\\ &=x_1y_2-x_2y_1\\ &=\det A.\end{split}$$

For $n=3$, we let $A=\begin{pmatrix}x_1&y_1&z_1\\x_2&y_2&z_2\\x_3&y_3&z_3\end{pmatrix}$. We verify the claim for $$A'=\begin{pmatrix}x_1+\lambda z_1&y_1&z_1\\x_2+\lambda z_2&y_2&z_2\\x_3+\lambda z_3&y_3&z_3\end{pmatrix}.$$

Now $$\begin{split}\det A'= &(x_1+\lambda z_1)y_2z_3+(x_3+\lambda z_3)y_1z_2+(x_2+\lambda z_2)y_3z_1\\ &-(x_3+\lambda z_3)y_2z_1-(x_1+\lambda z_1)y_3z_2-(x_2+\lambda z_2)y_1z_3.\end{split}$$

Collecting the terms that contain $+\lambda$ and those that contain $-\lambda$ (reordering the factors within each product):

$\begin{cases}\lambda z_1y_2z_3+\lambda z_3y_1z_2+\lambda z_2y_3z_1\\ -\lambda z_1y_2z_3-\lambda z_3y_1z_2-\lambda z_2y_3z_1\end{cases}$

so these terms cancel out, and what is left is exactly $\det A$.

For a general proof, let's look again at the $n=3$ case. Once one understands this case, the other $n$'s are just a matter of notation.

Let $$A=\begin{pmatrix}x_1&y_1&z_1\\x_2&y_2&z_2\\x_3&y_3&z_3\end{pmatrix}.$$ To keep the messy notation in the definition of the determinant as simple as possible, we have chosen different letters $x,y,z$ to name the columns, so that we don't have to use numbers to name them. Furthermore, let's use $\sigma$ to denote the permutation, so that the definition of the determinant of $A$ becomes: $$\det A=\sum_{\sigma}(\text{sgn}\sigma)x_{\sigma(1)}y_{\sigma(2)}z_{\sigma(3)},$$ where the sum is over all permutations of $\{1,2,3\}$ (think of the $1,2,3$ in $\sigma(1),\sigma(2),\sigma(3)$ as the domain of the function $\sigma$). Using this notation, it is pretty clear what the effect of adding a multiple of one column to another is:

Let $$A'=\begin{pmatrix}x_1+\lambda z_1&y_1&z_1\\x_2+\lambda z_2&y_2&z_2\\x_3+\lambda z_3&y_3&z_3\end{pmatrix}.$$ Then $$\begin{split}\det A' &=\sum_{\sigma}(\text{sgn}\sigma)\left(x_{\sigma(1)}+\lambda z_{\sigma(1)}\right)y_{\sigma(2)}z_{\sigma(3)}\\ &=\sum_{\sigma}(\text{sgn}\sigma)x_{\sigma(1)}y_{\sigma(2)}z_{\sigma(3)}+\sum_{\sigma}(\text{sgn}\sigma)\lambda z_{\sigma(1)}y_{\sigma(2)}z_{\sigma(3)}\\ &=\sum_{\sigma}(\text{sgn}\sigma)x_{\sigma(1)}y_{\sigma(2)}z_{\sigma(3)}+\lambda\sum_{\sigma}(\text{sgn}\sigma)z_{\sigma(1)}y_{\sigma(2)}z_{\sigma(3)}.\end{split}$$

The second summand is, by definition, the determinant of the following matrix

$$\begin{pmatrix}z_1&y_1&z_1\\z_2&y_2&z_2\\z_3&y_3&z_3\end{pmatrix}$$

which is $0$ by step 2. What is left is, again by definition, the determinant of $A$. Thus $\det A'=\det A$, and the proof is complete.

There is really no need for me to write out the general proof for arbitrary $n$, which would add nothing to the above proof other than heavier notation. You can write it out yourself if you wish.
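If you would like to check step 3 mechanically, here is a minimal Python sketch (the names `sign` and `leibniz_det` are just my own labels, not from any library) that evaluates the determinant term by term from the Leibniz expansion and confirms, on a random integer matrix, that adding a multiple of one column to another leaves it unchanged:

```python
# Minimal check of step 3 via the Leibniz expansion (exact arithmetic).
from itertools import permutations
from fractions import Fraction
import random

def sign(perm):
    """Sign of a permutation, via the parity of its number of inversions."""
    inversions = sum(1 for i in range(len(perm))
                       for j in range(i + 1, len(perm))
                       if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    """det(A) = sum over sigma of sgn(sigma) * A[1,sigma(1)] * ... * A[n,sigma(n)]."""
    n = len(A)
    total = Fraction(0)
    for sigma in permutations(range(n)):
        term = Fraction(sign(sigma))
        for r in range(n):
            term *= A[r][sigma[r]]
        total += term
    return total

n = 4
A = [[Fraction(random.randint(-5, 5)) for _ in range(n)] for _ in range(n)]

# A2 = A with lam * (column 2) added to column 0 -- a step-3 column operation.
lam = Fraction(3)
A2 = [row[:] for row in A]
for r in range(n):
    A2[r][0] += lam * A[r][2]

print(leibniz_det(A) == leibniz_det(A2))  # expected: True
```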

Step 4.

Suppose the columns (or rows, but it's more natural for me to think about columns) of a square matrix $A$, denoted $(C_1,C_2,\ldots,C_n)$, are linearly dependent. This means that one of the $C_i$ can be expressed as a linear combination of the other columns. For concreteness, let's say $C_1=\lambda_2C_2+\cdots+\lambda_nC_n$. Since by step 3 adding a multiple of one column to another does not change the determinant of $A$, we can add $-\lambda_nC_n,\ldots,-\lambda_3C_3$ to $C_1$ without changing the determinant. This leaves a matrix whose first column is $\lambda_2C_2$; by step 2 its determinant equals $\lambda_2$ times the determinant of a matrix with two equal columns, which is $0$. Hence $\det A=0$.
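As a concrete illustration of this last step (a small example of my own, not needed for the argument): take $$A=\begin{pmatrix}2&1&0\\1&0&1\\3&1&1\end{pmatrix},$$ whose first column is $C_1=2C_2+C_3$. Adding $-1\cdot C_3$ to $C_1$ does not change the determinant (step 3) and yields $$\begin{pmatrix}2&1&0\\0&0&1\\2&1&1\end{pmatrix},$$ whose first column is $2C_2$. By step 2 its determinant is $2\det\begin{pmatrix}1&1&0\\0&0&1\\1&1&1\end{pmatrix}=2\cdot 0=0$, since that last matrix has two equal columns.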