The definition of the determinant in the spirit of algebra and geometry

determinant, linear-algebra, matrices, soft-question

The concept of the determinant is a rather unmotivated topic to introduce. Textbooks use "strung out" introductions such as an axiomatic definition, the Laplace expansion, Leibniz's permutation formula, or the notion of signed volume.

Question: is the following a possible way to introduce the determinant?


The determinant is all about determining whether a given set of vectors is linearly independent, and a direct way to check this is to add scalar multiples of columns to other columns until the matrix is in diagonal form:

$$\begin{pmatrix}
a_{11} & a_{12} & a_{13} & a_{14} \\
a_{21} & a_{22} & a_{23} & a_{24} \\
a_{31} & a_{32} & a_{33} & a_{34} \\
a_{41} & a_{42} & a_{43} & a_{44} \\
\end{pmatrix} \thicksim \begin{pmatrix}
d_1 & 0 & 0 & 0 \\
0 & d_2 & 0 & 0 \\
0 & 0 & d_3 & 0 \\
0 & 0 & 0 & d_4 \\
\end{pmatrix}.$$

During the diagonalization process we demand that the information, i.e. the determinant, remains unchanged. Now it is clear that the vectors are linearly independent exactly when every $d_i$ is nonzero, i.e. $\prod_{i=1}^n d_i\neq0$. It may also happen that two columns are equal, in which case no such diagonal form is reached, so we must add a condition that annihilates the determinant (consistent with $\prod_{i=1}^n d_i=0$), since the column vectors cannot then be linearly independent.
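For concreteness, here is a sketch of this reduction in the $2\times 2$ case, assuming the pivots are nonzero (the remaining cases need separate treatment): first add $-\frac{a_{12}}{a_{11}}$ times the first column to the second, then add a suitable multiple of the new second column to the first:

$$\begin{pmatrix}
a_{11} & a_{12} \\
a_{21} & a_{22} \\
\end{pmatrix} \thicksim \begin{pmatrix}
a_{11} & 0 \\
a_{21} & a_{22}-\frac{a_{12}a_{21}}{a_{11}} \\
\end{pmatrix} \thicksim \begin{pmatrix}
a_{11} & 0 \\
0 & a_{22}-\frac{a_{12}a_{21}}{a_{11}} \\
\end{pmatrix},$$

and the product of the diagonal entries is $d_1 d_2 = a_{11}a_{22}-a_{12}a_{21}$, the familiar $2\times 2$ determinant.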

If we want a real-valued function that provides this information, then we simply introduce an ad hoc function $\det:\mathbb{R}^{n \times n} \rightarrow \mathbb{R}$ with the following properties (a small worked example follows the list):

  1. $$\det (a_1,\ldots,a_i,\ldots,a_j,\ldots,a_n)=\det (a_1,\ldots,a_i,\ldots,k\cdot a_i+a_j,\ldots,a_n).$$

  2. $$\det(d_1\cdot e_1,\ldots,d_n\cdot e_n)=\prod_{i=1}^n d_i.$$

  3. $$\det (a_1,\ldots,a_i,\ldots,a_j,\ldots,a_n)=0, \space \space \text{if} \space \space a_i=a_j.$$
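As a quick numerical illustration (my own example) of how these properties alone produce a value, using only operations of type $(1)$ and then property $(2)$:

$$\det\begin{pmatrix}
2 & 1 \\
4 & 3 \\
\end{pmatrix}\overset{(1)}{=}\det\begin{pmatrix}
2 & 0 \\
4 & 1 \\
\end{pmatrix}\overset{(1)}{=}\det\begin{pmatrix}
2 & 0 \\
0 & 1 \\
\end{pmatrix}\overset{(2)}{=}2,$$

where the first step adds $-\frac{1}{2}$ times the first column to the second, and the second step adds $-4$ times the new second column to the first.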


From the previous definition of the determinant we can infer the multilinearity property:

$$[a_1,\ldots,c_1 \cdot u+c_2 \cdot v,\ldots,a_n]\thicksim \operatorname{diag}[d_1,\ldots,c_1 \cdot d'_i+c_2 \cdot d''_i ,\ldots,d_n],$$ so $$\det[a_1,\ldots,c_1 \cdot u+c_2 \cdot v,\ldots,a_n]=\Big(\prod_{j=1,\ j\neq i}^n d_j\Big)(c_1 \cdot d'_i+c_2 \cdot d''_i)$$ $$=c_1\det(\operatorname{diag}[d_1,\ldots, d'_i,\ldots,d_n])+c_2\det(\operatorname{diag}[d_1,\ldots, d''_i,\ldots,d_n])$$ $$=c_1\det[a_1,\ldots,u,\ldots,a_n]+c_2\det[a_1,\ldots, v,\ldots,a_n].$$
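A small numerical sanity check of this linearity in, say, the first column (an example of my own, with $c_1=c_2=1$, $u=(1,3)^T$, $v=(2,1)^T$ and second column $e_2$):

$$\det\begin{pmatrix}
1+2 & 0 \\
3+1 & 1 \\
\end{pmatrix}=3, \qquad \det\begin{pmatrix}
1 & 0 \\
3 & 1 \\
\end{pmatrix}+\det\begin{pmatrix}
2 & 0 \\
1 & 1 \\
\end{pmatrix}=1+2=3.$$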

Note that the multilinearity above, together with property $(1)$ and the normalization $\det(e_1,\ldots,e_n)=1$ (property $(2)$ with every $d_i=1$), recovers property $(2)$ in full, so we know from the literature that such a determinant function $\det:\mathbb{R}^{n \times n} \rightarrow \mathbb{R}$ actually exists and is unique.


The determinant also carries geometric information about the configuration of the vectors. With the Gram–Schmidt process, which only adds scalar multiples of earlier vectors to later ones (operations of type $(1)$), we can form an orthogonal set of vectors from the set $(a_1,\ldots, a_n)$, and by multilinearity and property $(2)$ the absolute value of the determinant is the volume of the parallelepiped spanned by the vectors.

Definition.
The volume of the parallelepiped formed by the vectors $(a_1,\ldots, a_n)$ is $\operatorname{Vol}(a_1,\ldots, a_n)=\operatorname{Vol}(a_1,\ldots, a_{n-1})\cdot |a_{n}^{\bot}|=|a_{1}^{\bot}|\cdots |a_{n}^{\bot}|$, where $a_{i}^{\bot}$ is the component of $a_i$ orthogonal to $\operatorname{span}(a_1,\ldots, a_{i-1}).$
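For instance, in $\mathbb{R}^2$ with $a_1=(2,0)$ and $a_2=(1,3)$ (a small example of my own): $a_1^{\bot}=a_1=(2,0)$ and $a_2^{\bot}=a_2-\operatorname{proj}_{a_1}a_2=(1,3)-(1,0)=(0,3)$, so

$$\operatorname{Vol}(a_1,a_2)=|a_1^{\bot}|\cdot|a_2^{\bot}|=2\cdot 3=6=\left|\det\begin{pmatrix}
2 & 1 \\
0 & 3 \\
\end{pmatrix}\right|.$$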


This approach to the determinant works equally well whether we begin with the volume of a parallelepiped (geometric approach) or with the question of invertibility (algebraic approach). I was motivated by Chapter 5 of the book Linear Algebra and Its Applications by Lax:

Rather than start with a formula for the determinant, we shall deduce it from the properties forced on it by the geometric properties of signed volume. This approach to determinants is due to E. Artin.

  1. $\det (a_1,\ldots,a_n)=0$, if $a_i=a_j$, $i\neq j.$
  2. $\det (a_1,\ldots,a_n)$ is a multilinear function of its arguments, in the sense that if all $a_i, i \neq j$ are fixed, $\det$ is a linear function of the remaining argument $a_j.$
  3. $\det(e_1,\ldots,e_n)=1.$
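A standard first consequence of these axioms, worth spelling out, is antisymmetry. By axiom $1$ the left-hand side below is zero, and expanding with axiom $2$ and dropping the two terms with a repeated argument (axiom $1$ again) gives

$$0=\det (a_1,\ldots,a_i+a_j,\ldots,a_i+a_j,\ldots,a_n)=\det (a_1,\ldots,a_i,\ldots,a_j,\ldots,a_n)+\det (a_1,\ldots,a_j,\ldots,a_i,\ldots,a_n),$$

so interchanging two arguments changes the sign of $\det$.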

Best Answer

That seems quite opaque: It's a way of computing a quantity rather than telling what exactly it is or even motivating it. It also leaves completely open the question of why such a function exists and is well-defined. The properties you give are sufficient if you're trying to put a matrix in upper-triangular form, but what about other computations? It also gives no justification for one of the most important properties of the determinant, that $\det(ab) = \det a \det b$.

I think the best way to define the determinant is to introduce the wedge product $\Lambda^* V$ of a finite-dimensional space $V$. Given that, any linear map $f:V \to V$ induces a map $\bar{f}:\Lambda^n V \to \Lambda^n V$, where $n = \dim V$. But $\Lambda^n V$ is a $1$-dimensional space, so $\bar{f}$ is just multiplication by a scalar (independent of a choice of basis); that scalar is by definition exactly $\det f$. Then, for example, we get the condition that $\det f\not = 0$ iff $f$ is an isomorphism for free: For a basis $v_1, \dots, v_n$ of $V$, we have $\det f\not = 0$ iff $\bar{f}(v_1\wedge \cdots \wedge v_n) = f(v_1) \wedge \cdots \wedge f(v_n) \not = 0$; that is, iff the $f(v_i)$ are linearly independent. Furthermore, since $h = fg$ has $\bar{h} = \bar{f}\bar{g}$, we have $\det(fg) = \det f \det g$. The other properties follow similarly. It requires a bit more sophistication than is usually assumed in a linear algebra class, but it's the first construction of $\det$ I've seen that's motivated and transparently explains what's otherwise a list of arbitrary properties.
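To make the scalar concrete in the smallest nontrivial case (a short computation, not part of the answer above): for $n=2$, if $f$ has matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ in a basis $v_1, v_2$, then using $v_i \wedge v_i = 0$ and $v_2 \wedge v_1 = -v_1 \wedge v_2$,

$$\bar{f}(v_1\wedge v_2)=f(v_1)\wedge f(v_2)=(a v_1+c v_2)\wedge(b v_1+d v_2)=(ad-bc)\,v_1\wedge v_2,$$

so $\det f = ad-bc$, as expected.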
