Prove that an alternating multilinear function is a constant times $\det$ without the Leibniz formula and without the sign of permutations.

Tags: determinant, linear algebra, matrices

Let $\mathbb{A}$ be one of
$\mathbb{Z}$ (the set of all integers),
$\mathbb{Q}$ (the set of all rational numbers),
$\mathbb{R}$ (the set of all real numbers),
or $\mathbb{C}$ (the set of all complex numbers).
(One can of course regard $\mathbb{A}$ as an arbitrary commutative ring with unity.)

Let $\mathbb{A}^{m \times n}$ be the set of all $m \times n$ matrices whose entries are in $\mathbb{A}$.

The $(i, j)$-entry of a matrix $A$ is denoted by $[A]_{i,j}$.

Suppose that $a_1$, $a_2$, $\dots$, $a_n$ are $m \times 1$ matrices. Then $[a_1, a_2, \dots, a_n]$ is the $m \times n$ matrix whose $(i, j)$-entry is equal to $[a_j]_{i,1}$. This notation allows one to display a matrix in terms of its columns.

Determinants are defined here.


I wish to prove the following theorem:

Theorem D. Suppose that $f \colon \mathbb{A}^{n \times n} \to \mathbb{A}$ has the following two properties:

  • (alternating property) Suppose that $a_1$, $a_2$, $\dots$, $a_n$ are in $\mathbb{A}^{n \times 1}$. If there exist two distinct integers $i$, $j$ with $1 \leq i, j \leq n$ such that $a_i = a_j$, then $f ({[a_1, a_2, \dots, a_n]}) = 0$.
  • (multilinear property) Suppose that $j$ is a positive integer less than or equal to $n$. Suppose that the $n-1$ matrices $a_1$, $\dots$, $a_{j-1}$, $a_{j+1}$, $\dots$, $a_n$ are in $\mathbb{A}^{n \times 1}$. Suppose that $x$, $y$ are in $\mathbb{A}^{n \times 1}$. Suppose that $s$, $t$ are in $\mathbb{A}$. Then
    $$ \begin{aligned}
    & f
    {([a_1, \dots, a_{j-1}, sx + ty,
    a_{j+1}, \dots, a_n])}
    \\
    = {} &
    s
    f {([a_1, \dots, a_{j-1}, x, a_{j+1}, \dots, a_n])}
    +
    t
    f {([a_1, \dots, a_{j-1}, y, a_{j+1}, \dots, a_n])}.
    \end{aligned} $$

Then $f(A) = f(I) \det {(A)}$ for any $A \in \mathbb{A}^{n \times n}$, in which $I$ is the $n \times n$ identity matrix.
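
For instance, when $n = 2$, the two properties alone already pin down $f$: expanding both columns of $A = [a_1, a_2]$ in terms of $e_1$, $e_2$ (the columns of $I$) and dropping the terms with a repeated column gives
$$
\begin{aligned}
f(A)
= {} &
[A]_{1,1} [A]_{2,2}\, f([e_1, e_2])
+ [A]_{2,1} [A]_{1,2}\, f([e_2, e_1])
\\
= {} &
\bigl( [A]_{1,1} [A]_{2,2} - [A]_{2,1} [A]_{1,2} \bigr) f(I),
\end{aligned}
$$
where $f([e_2, e_1]) = -f([e_1, e_2])$ follows from expanding $0 = f([e_1 + e_2, e_1 + e_2])$.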

It is well known that if $f$ has the alternating property, the multilinear property, and the property that $f(I) = 1$, then $f$ is exactly the determinant function; here is a proof based on the Leibniz formula
$$
\det {(A)} =
\sum_{\sigma \in S_n} \operatorname{sgn} {(\sigma)} \prod_{j = 1}^n [A]_{\sigma(j),j},
$$

in which $\operatorname{sgn}$ is the sign function on the permutation group $S_n$. One can read off from that proof that if $f$ has the alternating property and the multilinear property, then $f$ must be $f(I)$ times the determinant function, which proves Theorem D.
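
For reference, here is a minimal brute-force rendering of the Leibniz formula (the very formula this question seeks to avoid); the helper names are mine, and the permutation sign is computed by counting inversions:

```python
from itertools import permutations

def sgn(perm):
    # Sign via inversion count: the parity of the number of inversions
    # equals the parity of the permutation.
    inversions = sum(1 for i in range(len(perm))
                     for j in range(i + 1, len(perm))
                     if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def det_leibniz(A):
    # Sum over sigma in S_n of sgn(sigma) * prod_j A[sigma(j)][j];
    # O(n! * n), for illustration only.
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        term = sgn(sigma)
        for j in range(n):
            term *= A[sigma[j]][j]
        total += term
    return total

assert det_leibniz([[1, 2], [3, 4]]) == -2
```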

One can also find a proof of Theorem D here.

One can also find a proof in some linear algebra textbooks.

However, all the proofs that I have seen make use of the Leibniz formula. I wonder whether it is possible to prove the result without the Leibniz formula and without the sign of permutations, so that Theorem D can be introduced in elementary linear algebra textbooks or courses.


Theorem D is powerful: it yields an easy proof that $\det {(AB)} = \det {(A)} \det {(B)}$ for any $A$, $B \in \mathbb{A}^{n \times n}$, and an easy proof that the determinant of the block matrix
$$
\begin{bmatrix}
A & C \\
0 & B \\
\end{bmatrix},
$$

in which $A \in \mathbb{A}^{m \times m}$, $B \in \mathbb{A}^{n \times n}$ and $C \in \mathbb{A}^{m \times n}$, equals $\det {(A)} \det {(B)}$.
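
For the first claim (the product rule), one standard route is to fix $A$ and define $g(X) = \det {(AX)}$ for $X \in \mathbb{A}^{n \times n}$; column $j$ of $AX$ is $A x_j$, where $x_j$ is column $j$ of $X$, so $g$ inherits the multilinear and alternating properties from $\det$, and Theorem D gives
$$ \det {(AX)} = g(X) = g(I) \det {(X)} = \det {(A)} \det {(X)}. $$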

Here is a short proof of the block formula. Define
$$ f(A) =
\det {\begin{bmatrix}
A & C \\
0 & B \\
\end{bmatrix}}$$

for any $A \in \mathbb{A}^{m \times m}$. One can verify that $f$ has the alternating property and the multilinear property (inherited from the same properties of the determinant function), so
$$ f(A) =
\det {\begin{bmatrix}
I & C \\
0 & B \\
\end{bmatrix}} \det {(A)}.$$

Using the alternating property and the multilinear property, one can show that if $Q$ is the square matrix obtained from the square matrix $P$ by adding $k$ times one column to another column, then the determinant of $Q$ is equal to that of $P$: by multilinearity, the extra term is a determinant with two equal columns, which vanishes. Hence
$$ f(A) =
\det {\begin{bmatrix}
I & 0 \\
0 & B \\
\end{bmatrix}} \det {(A)}.$$

Define
$$ g(B) =
\det {\begin{bmatrix}
I & 0 \\
0 & B \\
\end{bmatrix}}$$

for any $B \in \mathbb{A}^{n \times n}$. One can verify that $g$ has the alternating property and the multilinear property, so
$$ g(B) =
\det {\begin{bmatrix}
I & 0 \\
0 & I \\
\end{bmatrix}} \det {(B)} = \det {(B)}.$$

Hence
$$ \det {\begin{bmatrix}
A & C \\
0 & B \\
\end{bmatrix}} = \det {(A)} \det {(B)}.$$
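
Not a proof, of course, but here is a quick numerical spot-check of the block identity (using NumPy; the shapes are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
C = rng.standard_normal((m, n))

# Assemble [[A, C], [0, B]] and compare the two sides of the identity.
M = np.block([[A, C], [np.zeros((n, m)), B]])
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(B))
```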


My attempt is just the same as the proofs that I have seen:

Suppose that column $j$ of the $n \times n$ identity matrix $I$ is $e_j$.
Choose any $A \in \mathbb{A}^{n \times n}$.
Suppose that column $j$ of $A$ is $a_j$.
One may write
$$
a_{k}
= [A]_{1,k} e_{1} + [A]_{2,k} e_{2}
+ \dots + [A]_{n,k} e_{n}
= \sum_{i_k = 1}^{n} {[A]_{i_k,k} e_{i_k}}.
$$

Hence
$$
\begin{aligned}
f(A)
= {} &
f([a_1, a_2, \dots, a_n])
\\
= {} &
f\left(\left[ \sum_{i_1 = 1}^{n} {[A]_{i_1,1} e_{i_1}},
a_2, \dots, a_n \right]\right)
\\
= {} &
\sum_{i_1 = 1}^{n} {[A]_{i_1,1}\,
f([ e_{i_1}, a_2, \dots, a_n ])}
\\
= {} &
\sum_{i_1 = 1}^{n} {[A]_{i_1,1}\,
f\left(\left[ e_{i_1},
\sum_{i_2 = 1}^{n} [A]_{i_2,2} e_{i_2},
\dots, a_n
\right]\right)}
\\
= {} &
\sum_{i_1 = 1}^{n} {
\sum_{i_2 = 1}^{n} {[A]_{i_1,1} [A]_{i_2,2}\,
f([ e_{i_1}, e_{i_2}, \dots, a_n ])}}
\\
& \;\;\vdots
\\
= {} &
\sum_{i_1 = 1}^{n} {
\sum_{i_2 = 1}^{n} {
\dots
\sum_{i_n = 1}^{n} {
[A]_{i_1,1} [A]_{i_2,2} \dots [A]_{i_n,n} \,
f([e_{i_1}, e_{i_2}, \dots, e_{i_n}])}}}.
\end{aligned}
$$

Because of the alternating property, one may write
$$
\begin{aligned}
f(A)
= \sum_{\substack{
1 \leq i_1, i_2, \dots, i_n \leq n \\
i_1, i_2, \dots, i_n\,\text{are distinct}
}}
{[A]_{i_1,1} [A]_{i_2,2} \dots [A]_{i_n,n}\,
f([e_{i_1}, e_{i_2}, \dots, e_{i_n}])}.
\end{aligned}
$$
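
As a numerical sanity check of this expansion (taking $f = \det$ and small $n$; the function names are mine), one can sum over all index tuples and observe that only the injective ones contribute:

```python
import numpy as np
from itertools import product

def expansion(A, f):
    # Sum f over all index tuples (i_1, ..., i_n), weighted by
    # [A]_{i_1,1} * ... * [A]_{i_n,n}; the terms with a repeated index
    # vanish whenever f is alternating.
    n = A.shape[0]
    I = np.eye(n)
    total = 0.0
    for idx in product(range(n), repeat=n):
        coeff = 1.0
        for k in range(n):
            coeff *= A[idx[k], k]
        total += coeff * f(I[:, list(idx)])  # columns e_{i_1}, ..., e_{i_n}
    return total

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
assert np.isclose(expansion(A, np.linalg.det), np.linalg.det(A))
```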

Now I am stuck. The proofs that I have seen use
$$
f([e_{i_1}, e_{i_2}, \dots, e_{i_n}])
= \operatorname{sgn} {
\begin{pmatrix}
1 & 2 & \dots & n \\
i_1 & i_2 & \dots & i_n \\
\end{pmatrix}
} f(I),
$$

an identity that follows from the antisymmetric property (itself a consequence of the alternating and multilinear properties),
to conclude that
$$
\begin{aligned}
f(A)
= f(I) \sum_{\sigma \in S_n} \operatorname{sgn} {(\sigma)} \prod_{j = 1}^n [A]_{\sigma(j),j}.
\end{aligned}
$$

Finally, one uses the Leibniz formula to show that $f(A)$ is $f(I)$ times $\det {(A)}$.

I appreciate any hints.


A postscript at 05:10 on 2023-06-17:

The question can be solved if one allows the use of column echelon form; here is an answer (note that a matrix with entries in $\mathbb{Z}$ is automatically a matrix with entries in $\mathbb{Q}$). I will accept it if there is no better solution. It is a pity that one cannot "accept" more than one answer.

Best Answer

If you are willing to reduce $A$ to column echelon form, and accept that $\det$ is multilinear and alternating, then there is a proof that is straightforward and avoids the Leibniz formula.

The idea is simple: just reduce $A$ to triangular form.

Suppose the columns of $A$ are $a_k$. We want to compute $f(a_1,...,a_n)$. Note that we have $f(a_1,...,a_k+ \lambda a_p,...,a_n) = f(a_1,...,a_n)$ for any $p \neq k$ and any $\lambda$. Since $\det$ is also multilinear and alternating, we have $\det(a_1,...,a_k+ \lambda a_p,...,a_n) = \det(a_1,...,a_n)$. Similarly, if any two columns are switched, both $f$ and $\det$ change sign.

Now switch columns and add multiples of another column to a column so that the resulting $a_1',...,a_n'$ are in column echelon form. Then we have $f(a_1',...,a_n') = (-1)^k f(a_1,...,a_n)$ and $\det(a_1',...,a_n') = (-1)^k \det(a_1,...,a_n)$, where $k$ is the number of times a pair of columns was switched.

Since $a_1',...,a_n'$ are in column echelon form (hence triangular), clearing the remaining off-diagonal entries with further column operations gives $f(a_1',...,a_n') = (a_1')_1 ... (a_n')_n \, f(e_1,...,e_n)$ and similarly $\det(a_1',...,a_n') = (a_1')_1 ... (a_n')_n$, where $(a_j')_j$ denotes the $j$-th entry of $a_j'$. Combining these with the sign relations above, we get $f(a_1,...,a_n) = \det(a_1,...,a_n) \, f(e_1,...,e_n)$.
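
Here is a minimal sketch of this reduction in code (the function name is mine; entries are promoted to $\mathbb{Q}$ via `Fraction`, echoing the postscript's remark about $\mathbb{Z}$ sitting inside $\mathbb{Q}$), computing the determinant with column swaps and column additions only, and no permutation signs beyond the swap count:

```python
from fractions import Fraction

def det_by_column_reduction(A):
    # Column swaps flip the sign; adding a multiple of one column to
    # another changes nothing.  Reduce to (lower) triangular form and
    # multiply the diagonal -- no Leibniz formula needed.
    M = [[Fraction(x) for x in row] for row in A]  # work in Q
    n = len(M)
    sign = 1
    for i in range(n):
        # Find a column with a nonzero entry in row i (the pivot).
        p = next((j for j in range(i, n) if M[i][j] != 0), None)
        if p is None:
            return Fraction(0)  # remaining columns are dependent
        if p != i:
            for r in range(n):  # swap columns i and p
                M[r][i], M[r][p] = M[r][p], M[r][i]
            sign = -sign
        for j in range(i + 1, n):  # clear row i to the right of the pivot
            t = M[i][j] / M[i][i]
            for r in range(n):
                M[r][j] -= t * M[r][i]
    result = Fraction(sign)
    for i in range(n):
        result *= M[i][i]
    return result

assert det_by_column_reduction([[1, 2], [3, 4]]) == -2
```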
