Linear Algebra – Proving the Alternating Property of the Determinant Function

Tags: determinant, induction, linear algebra, matrices

Suppose $A$ is a matrix. The $(i, j)$-entry of $A$ is denoted as $[A]_{i,j}$.

Suppose that $a_1$, $a_2$, $\dots$, $a_n$ are $m \times 1$ matrices. Then $[a_1, a_2, \dots, a_n]$ is the $m \times n$ matrix whose $(i, j)$-entry is equal to $[a_j]_{i,1}$. This piece of notation allows one to display a matrix in terms of its columns.
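
For instance, with $m = 2$ and $n = 2$, taking
$$
a_1 = \begin{bmatrix} 1 \\ 3 \end{bmatrix},
\qquad
a_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}
$$
gives
$$
[a_1, a_2] = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix},
$$
and indeed $[[a_1, a_2]]_{1,2} = [a_2]_{1,1} = 2$.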

Determinants are defined here.

How does one prove the alternating property?

Suppose that $q$ is a positive integer less than or equal to $n$. Suppose that $p$ is a positive integer less than $q$. Suppose that $a_1$, $a_2$, $\dots$, $a_n$ are $n \times 1$ matrices. Suppose that $a_p = a_q$. Then
$$ \det {[a_1, a_2, \dots, a_n]} = 0. $$
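
For example, with $n = 3$, $p = 1$, and $q = 3$ (so that $a_1 = a_3$), expanding along the first row gives
$$
\det {\begin{bmatrix}
1 & 4 & 1 \\
2 & 5 & 2 \\
3 & 6 & 3 \\
\end{bmatrix}}
= 1\,(5 \cdot 3 - 2 \cdot 6) - 4\,(2 \cdot 3 - 2 \cdot 3) + 1\,(2 \cdot 6 - 5 \cdot 3)
= 3 - 0 - 3
= 0.
$$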

Let the proposition $P(n)$ be as follows:

For any $n \times 1$ matrices $a_1$, $a_2$, $\dots$, $a_n$, any positive integer $q$ less than or equal to $n$, and any positive integer $p$ less than $q$, the equality $a_p = a_q$ implies that
$$ \det {[a_1, a_2, \dots, a_n]} = 0. $$

It is easy to check that $P(2)$ is true:
$$
\det {\begin{bmatrix}
a & a \\
b & b \\
\end{bmatrix}}
= ab - ba
= 0.
$$

Suppose that $P(n-1)$ is true (where $n \geq 3$). I need to show that $P(n)$ is true.

I know that it suffices to prove that the determinant of a square matrix with two adjacent equal columns is zero. Suppose that this special case has been proved. Using the multilinear property, one can then prove that swapping two adjacent columns changes the sign of the determinant:
$$
\begin{align*}
0
= {} &
\det {[\dots, \overset{\text{column}\,j}{b_j + b_{j+1}}, b_j + b_{j+1}, \dots]}
\\
= {} &
\det {[\dots, b_j, b_j + b_{j+1}, \dots]}
+ \det {[\dots, b_{j+1}, b_j + b_{j+1}, \dots]}
\\
= {} &
\hphantom{{} + {}}
(
\det {[\dots, b_j, b_j, \dots]}
+
\det {[\dots, b_j, b_{j+1}, \dots]}
)
\\
&
+
(
\det {[\dots, b_{j+1}, b_j, \dots]}
+
\det {[\dots, b_{j+1}, b_{j+1}, \dots]}
)
\\
= {} &
\det {[\dots, b_j, b_{j+1}, \dots]}
+ \det {[\dots, b_{j+1}, b_j, \dots]},
\end{align*}
$$

in which $b_1$, $b_2$, $\dots$, $b_n$ are $n \times 1$ matrices. (The first equality and the final cancellation both use the special case: each determinant with two equal adjacent columns is zero.)
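
As a quick numerical sanity check of this sign-swap consequence (not part of the proof), here is a minimal sketch assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))  # columns play the role of b_1, ..., b_4

# Swap the adjacent columns at (0-indexed) positions 1 and 2.
B_swapped = B.copy()
B_swapped[:, [1, 2]] = B_swapped[:, [2, 1]]

# Swapping two adjacent columns should flip the sign of the determinant.
print(np.isclose(np.linalg.det(B_swapped), -np.linalg.det(B)))  # expected: True
```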

Suppose that $p < q$. Suppose that column $p$ of the $n \times n$ matrix $A = [a_1, a_2, \dots, a_n]$ is equal to column $q$ of $A$, that is, $a_p = a_q$. Let
$$
B = [\dots,
\overset{\text{column}\,p-1}{a_{p-1}},
\overset{\text{column}\,p}{a_{p+1}},
\dots,
a_{q-1},
\overset{\text{column}\,q-1}{a_p},
\overset{\text{column}\,q}{a_q},
a_{q+1},
\dots],
$$

which can be obtained from $A$ by swapping two adjacent columns $q-p-1$ times. Since columns $q-1$ and $q$ of $B$ are equal and adjacent, the special case gives $\det {(B)} = 0$. Hence
$$
0 = \det {(B)} = (-1)^{q-p-1} \det {(A)},
$$

which means that $\det {(A)} = 0$.
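
For instance, with $n = 4$, $p = 1$, and $q = 4$ (so $a_1 = a_4$), moving $a_1$ rightwards to position $q - 1 = 3$ takes $q - p - 1 = 2$ adjacent swaps:
$$
[a_1, a_2, a_3, a_4]
\;\to\;
[a_2, a_1, a_3, a_4]
\;\to\;
[a_2, a_3, a_1, a_4],
$$
and the last matrix has equal adjacent columns $3$ and $4$, so its determinant is zero while equalling $(-1)^{2} \det {[a_1, a_2, a_3, a_4]}$.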

However, I do not know how to prove the special case.

Best Answer

This proof is direct; it is based on the fact that the determinant of a square matrix can be computed by expansion about any column.

I will prove that $P(n)$ (in the description of the question) is true for $n = 2$, $3$, $\dots$ by mathematical induction.

It is easy to check that $P(2)$ is true.

Suppose that $P(n-1)$ is true (where $n \geq 3$). I will prove that $P(n)$ is true under this hypothesis.

Choose a positive integer $q$ less than or equal to $n$, and a positive integer $p$ less than $q$. Suppose that column $p$ of the $n \times n$ matrix $A$ is equal to column $q$ of $A$.

Choose a positive integer $u$ less than or equal to $n$ such that $u \neq p$ and $u \neq q$ (such a $u$ exists because $n \geq 3$). Here $A(i|u)$ denotes the $(n-1) \times (n-1)$ matrix obtained from $A$ by deleting row $i$ and column $u$. For each $i = 1$, $2$, $\dots$, $n$, the matrix $A(i|u)$ still has two equal columns, so by the induction hypothesis $P(n-1)$, $\det {(A(i|u))} = 0$. Hence
$$
\begin{aligned}
\det {(A)}
= {} & \sum_{i = 1}^{n} {(-1)^{i+u} [A]_{i,u} \det {(A(i|u))}} \\
= {} & \sum_{i = 1}^{n} {(-1)^{i+u} [A]_{i,u} \cdot 0} \\
= {} & 0.
\end{aligned}
$$

Hence by mathematical induction, $P(n)$ is true for $n = 2$, $3$, $\dots$.
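
To illustrate the inductive step numerically (a sanity check only, not part of the proof), here is a minimal sketch assuming NumPy; `minor` and `det_by_column` are helper names introduced here for illustration, with `minor(A, i, u)` playing the role of $A(i|u)$ and the expansion carried out about a column $u$ different from both equal columns:

```python
import numpy as np

def minor(A, i, u):
    """Return A(i|u): A with row i and column u deleted (0-indexed)."""
    return np.delete(np.delete(A, i, axis=0), u, axis=1)

def det_by_column(A, u):
    """Determinant of A via cofactor expansion about column u (0-indexed)."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    return sum((-1) ** (i + u) * A[i, u] * det_by_column(minor(A, i, u), 0)
               for i in range(n))

# A 4x4 matrix whose 2nd and 4th columns are equal (p = 2, q = 4 in the proof's notation).
A = np.array([[1., 2., 3., 2.],
              [4., 5., 6., 5.],
              [7., 8., 9., 8.],
              [1., 0., 2., 0.]])

# Expanding about column u = 0 (which is neither of the equal columns), every
# minor A(i|u) again has two equal columns, so every term in the sum vanishes.
print(det_by_column(A, 0))   # expected: 0.0
print(np.linalg.det(A))      # cross-check: also (numerically) 0
```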