Linear Algebra – Does This Formula Have a Rigorous Meaning, or is it Merely Formal?

Tags: linear-algebra, rings-and-algebras

I hope this problem is not considered too "elementary" for MO. It concerns a formula that I have always found fascinating. For, at first glance, it appears completely "obvious", while on closer examination it does not even seem well-defined. The formula is the one that I was given as the definition of the cross-product in $\mathbb R^3 $ when I was first introduced to that concept:

$$
B \times C :=
\begin{vmatrix}
\mathbf i & \mathbf j & \mathbf k \\
B_1 & B_2 & B_3 \\
C_1 & C_2 & C_3
\end{vmatrix}
$$
On the one hand, if one expands this by minors of the first row, the result is clearly correct; to this day this is the only way I can recall the formula for the components of the cross-product when I need it. But, on the other hand, the determinant of an $n \times n$ matrix whose entries are a mixture of scalars and vectors is undefined. Just think what happens if you interchange one element of the first row with the element just below it. In fact, as usually understood, for the determinant of a matrix to be well-defined, its entries should all belong to a commutative ring. But then again (on the third hand 🙂), if we take the dot product of both sides of the formula with a third vector $A$, we seem to get:

$$
A \cdot (B \times C) =
A \cdot
\begin{vmatrix}
\mathbf i & \mathbf j & \mathbf k \\
B_1 & B_2 & B_3 \\
C_1 & C_2 & C_3
\end{vmatrix}
=
\begin{vmatrix}
A_1 & A_2 & A_3 \\
B_1 & B_2 & B_3 \\
C_1 & C_2 & C_3
\end{vmatrix}
$$
and of course the left- and right-hand sides are well-known formulas for the (signed) volume of the parallelepiped spanned by the three vectors $A$, $B$, $C$. Moreover, the validity of the latter formula for all choices of $A$ indicates that the original formula is "correct".
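As a quick numerical sanity check (my own addition, not part of the original question), one can verify in Python that the scalar triple product $A \cdot (B \times C)$ agrees with the $3 \times 3$ determinant for randomly chosen vectors:

```python
import numpy as np

# Check A . (B x C) == det([A; B; C]) for random vectors in R^3.
rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3))

triple = np.dot(A, np.cross(B, C))          # scalar triple product
det = np.linalg.det(np.array([A, B, C]))    # determinant with A, B, C as rows

assert np.isclose(triple, det)
```

This only confirms the identity numerically, of course; the question is about what the symbolic determinant with vector entries means.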

So, my question is this: Is there a rigorous way of defining the original determinant so that all of the above becomes meaningful and correct?

Best Answer

But there is a commutative ring available, along the lines of what Mariano says. If $k$ is a field and $V$ is a vector space, then $k \oplus V$ is a commutative ring under the rule that a scalar times a scalar, a scalar times a vector, and a vector times a scalar are all what you think they are. The only missing part is a vector times a vector, and you can just set that to zero. The dot product is then a special bilinear form on this algebra. In this formalism, I think that everything you wrote makes sense.
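As an illustrative sketch (the class name and layout here are my own, not from the answer), one can implement this ring $k \oplus V$ for $k = \mathbb{R}$, $V = \mathbb{R}^3$ and check that cofactor-expanding the $3 \times 3$ determinant over it recovers the cross product:

```python
import numpy as np

class RV:
    """Element (s, v) of the commutative ring R ⊕ R^3, with vector·vector := 0."""
    def __init__(self, s=0.0, v=None):
        self.s = float(s)
        self.v = np.zeros(3) if v is None else np.asarray(v, dtype=float)

    def __add__(self, o):
        return RV(self.s + o.s, self.v + o.v)

    def __sub__(self, o):
        return RV(self.s - o.s, self.v - o.v)

    def __mul__(self, o):
        # (s1 + v1)(s2 + v2) = s1*s2 + s1*v2 + s2*v1 + v1*v2, with v1*v2 = 0
        return RV(self.s * o.s, self.s * o.v + o.s * self.v)

def det3(m):
    """Cofactor expansion of a 3x3 matrix of RV elements along the first row."""
    def det2(a, b, c, d):
        return a * d - b * c
    return (m[0][0] * det2(m[1][1], m[1][2], m[2][1], m[2][2])
          - m[0][1] * det2(m[1][0], m[1][2], m[2][0], m[2][2])
          + m[0][2] * det2(m[1][0], m[1][1], m[2][0], m[2][1]))

# First row holds the basis vectors i, j, k; the other rows are scalars.
i, j, k = (RV(v=e) for e in np.eye(3))
B, C = [2.0, 0.0, 1.0], [1.0, 3.0, 0.0]
M = [[i, j, k],
     [RV(B[0]), RV(B[1]), RV(B[2])],
     [RV(C[0]), RV(C[1]), RV(C[2])]]

result = det3(M)
# result.v equals np.cross(B, C); result.s is 0
```

Because no term of the expansion ever multiplies two first-row entries together, the "vector times vector equals zero" rule is never even invoked here; the truncated product suffices.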


Theo says in a comment that, "even better", one should work over $\Lambda^*(V)$, the exterior algebra over $V$. The motivation is that this algebra is supercommutative. I considered mentioning this solution, and I suppose that I really should have, because it arises in important formulas. For example, the Gauss formula for the linking number between two knots $K_1, K_2 \subseteq \mathbb{R}^3$ is: $$\mathrm{lk}(K_1,K_2) = \int_{K_1 \times K_2} \frac{\det \begin{bmatrix} \vec{x} - \vec{y} \\ d\vec{x} \\ d\vec{y} \end{bmatrix}}{4\pi |\vec{x} - \vec{y}|^3}$$ $$= \int_{K_1 \times K_2} \frac{\det \begin{bmatrix} x_1 - y_1 & x_2 - y_2 & x_3 - y_3 \\ dx_1 & dx_2 & dx_3 \\ dy_1 & dy_2 & dy_3 \end{bmatrix}}{4\pi |\vec{x} - \vec{y}|^3}.$$ The right way to write and interpret this formula is indeed as a determinant in the exterior algebra of differential forms. For one thing, it makes it easy to generalize Gauss' formula to higher dimensions.

However, supercommutative is not the same as commutative, and this type of determinant has fewer, and different, properties than a determinant over a commutative ring. Such a determinant has a broken symmetry: you get a different answer if you order the factors in each term by rows than if you order them by columns. (I am using row ordering.) Indeed, the row-ordered determinant can be non-zero even when it has repeated rows. To give two examples: the determinant in the generalized Gauss formula has repeated rows, and the standard volume form on $\mathbb{R}^n$ is $$\omega = \frac{\det ( d\vec{x}, d\vec{x}, \ldots, d\vec{x} )}{n!}.$$
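To make the row-ordered convention concrete, here is a minimal, self-contained Grassmann-algebra computation (my own sketch; forms are represented as dictionaries mapping sorted index tuples to coefficients). It shows that in $\mathbb{R}^2$ the row-ordered determinant with a repeated row is non-zero: $\det(d\vec{x}, d\vec{x}) = 2\, dx_1 \wedge dx_2$, so dividing by $2!$ yields the standard area form.

```python
from itertools import permutations

def wedge(a, b):
    """Wedge product; a form is a dict {sorted_index_tuple: coefficient}."""
    out = {}
    for ia, ca in a.items():
        for ib, cb in b.items():
            if set(ia) & set(ib):
                continue  # a repeated 1-form wedges to zero
            sign, idx = 1, list(ia + ib)
            # bubble-sort the indices, tracking the sign of the permutation
            for i in range(len(idx)):
                for j in range(len(idx) - 1 - i):
                    if idx[j] > idx[j + 1]:
                        idx[j], idx[j + 1] = idx[j + 1], idx[j]
                        sign = -sign
            key = tuple(idx)
            out[key] = out.get(key, 0) + sign * ca * cb
    return {k: v for k, v in out.items() if v}

def row_det(rows):
    """Row-ordered determinant: sum over permutations sigma of
    sign(sigma) * rows[0][sigma(0)] ^ rows[1][sigma(1)] ^ ..."""
    n = len(rows)
    total = {}
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        term = {(): sign}
        for i in range(n):
            term = wedge(term, rows[i][perm[i]])
        for k, v in term.items():
            total[k] = total.get(k, 0) + v
    return {k: v for k, v in total.items() if v}

# dx1 and dx2 in R^2; the row (dx1, dx2) plays the role of the vector dx
dx1, dx2 = {(1,): 1}, {(2,): 1}
dx_row = [dx1, dx2]

# Row-ordered det with a repeated row: det(dx, dx) = 2 dx1 ^ dx2, not zero
vol2 = row_det([dx_row, dx_row])
# vol2 == {(1, 2): 2}, so det(dx, dx)/2! is the area form dx1 ^ dx2
```

The two terms $dx_1 \wedge dx_2$ and $-\,dx_2 \wedge dx_1$ reinforce rather than cancel, which is exactly the broken symmetry described above.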

Happily, for Dick's question, you can truncate the exterior algebra at degree 1, which is exactly what I did above. This truncation is both supercommutative and commutative.
