[Math] Why were matrix determinants once such a big deal

Tags: big-list, determinants, ho.history-overview, linear-algebra, na.numerical-analysis

I have been told that the study of matrix determinants once comprised the bulk of linear algebra. Today, few textbooks spend more than a few pages defining the determinant and using it to compute a matrix inverse. I am curious about why determinants were such a big deal. So here's my question:

a) What are examples of cool tricks you can use matrix determinants for? (Cramer's rule comes to mind, but I can't come up with much more.) What kind of amazing properties do matrix determinants have that made them such popular objects of study?

b) Why did the use of matrix determinants fall out of favor? Some historical background would be very welcome.

Update: From the responses below, it seems appropriate to turn this question into a community wiki. I think it would be useful to extend the original series of questions with:

c) What significance do matrix determinants have for other branches of mathematics? (For example, the geometric significance of the determinant as the signed volume of a parallelepiped.) What developments in mathematics have been inspired by/aided by the theory of matrix determinants?

d) For computational and theoretical applications in which matrix determinants are no longer widely used today, what has supplanted them?

Best Answer

I don't think that determinants are an old-fashioned topic, but the attitude towards them has changed over the decades. Fifty years ago, one insisted on their practical calculation, by hand of course. That way of teaching linear algebra has essentially disappeared. But the theoretical importance of determinants is still very high, and they are useful in almost every branch of mathematics, and even in other sciences. Let me give a few instances where determinants are unavoidable.

  1. Change of variables in an integral. Isn't the Jacobian of a transformation a determinant? (A worked instance follows this list.)
  2. The Wronskian of solutions of a linear ODE is a determinant. It plays a central role in spectral theory (Hill's equation with periodic coefficients), and therefore in stability analysis of travelling waves in PDEs.
  3. A well-known proof of the simplicity of Perron's eigenvalue of an irreducible non-negative matrix is a very nice use of the multilinearity of the determinant.
  4. The $n$th root of the determinant is a concave function on the $n\times n$ Hermitian positive definite matrices. This underlies many developments in modern analysis, via the Brunn-Minkowski inequality.
  5. In combinatorics, determinants and Pfaffians occur in formulas counting configurations of paths between sets of points in a network. D. Knuth advocates that there are no determinants, only Pfaffians.
  6. Of course, the eigenvalues of a matrix are the roots of a determinant, the characteristic polynomial. In control theory, the Routh-Hurwitz algorithm, which checks whether a system is stable or not, is based on the calculation of determinants. (See the numerical sketch after this list.)
  7. As mentioned by J.M., Slater determinants are used in quantum chemistry.
  8. Frobenius theory provides an algorithm for classifying matrices $M\in M_n(k)$ up to conjugation. It consists of calculating all the minors of the matrix $XI_n-M\in M_n(k[X])$ (these are determinants, aren't they?), then the g.c.d. of the minors of size $j$, for each $j=1,\ldots,n$. This is the theory of similarity invariants, which are polynomials $p_1,\ldots,p_n$ with $p_j\mid p_{j+1}$ and $p_1\cdots p_n=P_M$, the characteristic polynomial. If one goes further by decomposing the $p_j$'s (but this is beyond any algorithm), one obtains the theory of elementary divisors.
  9. If $L$ is an algebraic extension of a field $K$, the norm of $a\in L$ is nothing but the determinant of the $K$-linear map $x\mapsto ax$; it is an element of $K$. (A small worked case follows the list.)
  10. Kronecker's principle characterizes the power series that are rational functions, in terms of determinants of Hankel matrices. This has several important applications. One is Dwork's proof of Weil's conjecture that zeta functions of algebraic varieties over finite fields are rational. Another is Salem's theorem: if $\theta>1$ and $\lambda>0$ are real numbers such that the distances from $\lambda\theta^n$ to ${\mathbb N}$ are square summable, then $\theta$ is an algebraic number of class $S$.
  11. Above all, invertible matrices are characterized by their determinant: it is an invertible scalar. This is true whenever the scalars belong to a commutative ring with unit. Besides, the determinant is the unique morphism ${\bf GL}_n(A)\rightarrow A^*$; it therefore plays the same role in the linear group as the signature plays in the symmetric group $\frak S_n$.
  12. Powers of the determinant of $2\times2$ matrices appear in the definition of automorphic forms over the Poincaré half-plane.
  13. See also the answers to JBL's question, "Wonderful applications of the Vandermonde determinant".
  14. In algebraic geometry, most projective curves can be seen as the zero set of a determinantal equation $\det(xA+yB+zC)=0$. The theory was developed by Helton & Vinnikov. For instance, a hyperbolic polynomial in three variables can be written as $\det(xI_n+yH+zK)$ with $H,K$ Hermitian matrices; this was conjectured by P. Lax.
  15. The discriminant of a quadratic form is the determinant of its matrix, say in a given basis. There are two important situations. A) If the scalars form a field $k$, the discriminant is really a scalar modulo the squares of $k^\times$. It is an ingredient in the classification of quadratic forms up to isomorphism. B) Gauss defined a composition rule for two binary forms (say $ax^2+bxy+cy^2$) with integer coefficients when they have the same discriminant. The classes of equivalent forms of a given discriminant form an abelian group. In 2014, a Fields Medal was awarded to Manjul Bhargava for major advances in this area.
  16. In a real vector space, the orientation of a basis is the sign of its determinant (computed with respect to a reference basis).
  17. One of the most important PDEs, the Monge-Ampère equation, reads $\det D^2u=f$. It is central in optimal transport theory.
  18. Recently, I proved the following amazing result. Let $T:{\mathbb R}^d\rightarrow{\bf Sym}_d^+$ be periodic with respect to some lattice. Assume that $T$ is row-wise divergence-free, that is, $\sum_j\partial_jt_{ij}=0$ for every $i=1,\ldots,d$. Then $$\langle(\det T)^{\frac1{d-1}}\rangle\le\left(\det\langle T\rangle\right)^{\frac1{d-1}},$$ where $\langle\cdot\rangle$ denotes the average of a periodic function. With the exponent $\frac1d$ instead, this would be a consequence of Jensen's inequality and point 4 above. Equality occurs iff $T$ is the cofactor matrix of the Hessian of some convex function.
  19. The Gauss curvature of a hypersurface is the Jacobian determinant of the Gauss map (the map sending a point $x$ to the unit normal to the hypersurface at $x$).
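
As a minimal worked instance of item 1: for the polar change of variables $x=r\cos\theta$, $y=r\sin\theta$, the Jacobian determinant is $$\det\frac{\partial(x,y)}{\partial(r,\theta)}=\det\begin{pmatrix}\cos\theta & -r\sin\theta\\ \sin\theta & r\cos\theta\end{pmatrix}=r\cos^2\theta+r\sin^2\theta=r,$$ precisely the familiar factor in $dx\,dy=r\,dr\,d\theta$.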
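
And a small worked case of item 9: take $K=\mathbb{Q}$ and $L=\mathbb{Q}(i)$ with basis $\{1,i\}$. Multiplication by $a+bi$ sends $1\mapsto a+bi$ and $i\mapsto -b+ai$, so its matrix is $\begin{pmatrix}a & -b\\ b & a\end{pmatrix}$, whose determinant $a^2+b^2$ is the usual norm $N_{L/K}(a+bi)$.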
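
Finally, the numerical sketch promised in item 6. This is only an illustration, with a test matrix of my own choosing (a random $4\times4$); it checks that each eigenvalue of $M$ is a root of $\det(xI-M)$, up to round-off:

```python
import numpy as np

# Arbitrary 4x4 test matrix; any square matrix would do.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
I = np.eye(4)

# Each eigenvalue lam of M should make det(lam*I - M) vanish,
# up to floating-point round-off.
for lam in np.linalg.eigvals(M):
    print(abs(np.linalg.det(lam * I - M)))  # each printed value is tiny
```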

Of course, this list is not exhaustive (otherwise, it would have to be infinite). I do teach matrix theory at the graduate level, and I spend a while on determinants, even though I rarely compute an exact value.

Edit. The following letter by D. Perrin to J.-L. Dorier (1997) supports the importance of determinants in algebra and in the teaching of algebra.