Why is the determinant not defined separately when working with both vectors and scalars?

matrices, vector analysis, vectors

Consider the determinant $$
\begin{vmatrix}
i & j & k \\
1 & 0 & 2 \\
0 & 2 & 5 \\
\end{vmatrix}
$$
that arises when we calculate the cross product $(i+0j+2k)\times(0i+2j+5k)$. But this matrix contains both scalars and vectors, so it is not an ordinary matrix in $M_{n,n}(\mathbb F)$ where $\mathbb F$ is a field: the second and third rows consist of real numbers, but the first row consists of vectors. Should we then define the meaning of this determinant separately? Because if we calculate it, we have

$$
i\begin{vmatrix}
0 & 2 \\
2 & 5 \\
\end{vmatrix}
- j\begin{vmatrix}
1 & 2 \\
0 & 5 \\
\end{vmatrix}
+ k\begin{vmatrix}
1 & 0 \\
0 & 2 \\
\end{vmatrix}.
$$
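Carrying this through, the $2\times 2$ minors are ordinary real determinants, so the expression evaluates to $$-4i - 5j + 2k,$$ which is indeed the cross product $(1,0,2)\times(0,2,5)$.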
Here each multiplication stands for scalar multiplication of a vector, not multiplication between scalars. So if we do not define the meaning of the determinant for such a matrix, which is not homogeneous in its entries (i.e. its elements do not all come from a single field), then how does it make sense? How should we define the determinant in this case?

Best Answer

Generally, we don't define determinants in this way. This method for computing the cross product is something known as a mnemonic: a pattern used for remembering something complex. It's not a coincidence that it works, of course, but at the same time, it's not a real determinant.

If you want to define a more general non-homogeneous determinant, then you have to look at the reasons why it would be well-defined. Between scalars in $\Bbb{R}$ and vectors in $\Bbb{R}^3$, we have a multiplication operation $$a \cdot (x_1, x_2, x_3) = (ax_1, ax_2, ax_3).$$ Scalar multiplication is inherently asymmetric, but we can symmetrise it by defining $v \cdot a = a \cdot v$ for all $v \in \Bbb{R}^3$ and $a \in \Bbb{R}$, which gives a kind of commutativity.

This scalar multiplication interacts associatively with ordinary multiplication on $\Bbb{R}$; specifically, \begin{align*} b \cdot (a \cdot v) &= (ab) \cdot v \\ b \cdot (v \cdot a) &= (b \cdot v) \cdot a \\ v \cdot (ba) &= (v \cdot b) \cdot a . \end{align*} We also have the distributivity laws \begin{align*} a \cdot(u + v) &= (a \cdot u) + (a \cdot v) \\ (a + b) \cdot v &= (a \cdot v) + (b \cdot v). \end{align*} All in all, this builds up a partial ring structure on $\Bbb{R}^3 \cup \Bbb{R}$, where not every pair of elements can be added or multiplied, but associativity, commutativity, and distributivity all hold wherever the operations are defined. We also have partial additive identities ($0 \in \Bbb{R}$ and $(0, 0, 0) \in \Bbb{R}^3$ each act as identities for their part of the set $\Bbb{R} \cup \Bbb{R}^3$) and the associated additive inverses.

It's not a ring, because we don't get closure of addition and multiplication. We cannot multiply two vectors, and we cannot add a scalar to a vector. But, provided we avoid performing these operations, the structure is very much ring-like.
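To make this concrete, here is a minimal sketch in plain Python of this partial structure; the helper names `mul`, `add`, and `det3` are mine, purely for illustration. The undefined operations raise errors, and the cofactor expansion goes through whenever they are never hit:

```python
# A minimal sketch of the partial ring structure on R ∪ R^3:
# scalars are numbers, vectors are 3-tuples.

def mul(x, y):
    """Multiply within the partial structure: scalar*scalar -> scalar,
    scalar*vector (in either order) -> vector; vector*vector is undefined."""
    x_vec, y_vec = isinstance(x, tuple), isinstance(y, tuple)
    if x_vec and y_vec:
        raise TypeError("vector * vector is not defined")
    if x_vec:                    # symmetrise: v * a := a * v
        x, y = y, x
    if isinstance(y, tuple):     # scalar * vector, componentwise
        return tuple(x * yi for yi in y)
    return x * y                 # ordinary product of reals

def add(x, y):
    """Add within the partial structure; scalar + vector is undefined."""
    if isinstance(x, tuple) != isinstance(y, tuple):
        raise TypeError("scalar + vector is not defined")
    if isinstance(x, tuple):
        return tuple(xi + yi for xi, yi in zip(x, y))
    return x + y

def det3(m):
    """Cofactor expansion of a 3x3 matrix along its first row,
    using only the partial operations above."""
    (a, b, c), (d, e, f), (g, h, i) = m
    def det2(p, q, r, s):        # | p q ; r s | = p*s - q*r
        return add(mul(p, s), mul(-1, mul(q, r)))
    return add(add(mul(a, det2(e, f, h, i)),
                   mul(-1, mul(b, det2(d, f, g, i)))),
               mul(c, det2(d, e, g, h)))

# The mnemonic determinant from the question, with i, j, k as basis vectors:
i, j, k = (1, 0, 0), (0, 1, 0), (0, 0, 1)
print(det3([(i, j, k), (1, 0, 2), (0, 2, 5)]))
# -> (-4, -5, 2), i.e. the cross product (1,0,2) x (0,2,5)
```

Running `det3` on either of the ill-formed examples below raises a `TypeError` at exactly the step where a forbidden product or sum appears.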

Thus, we can define the determinant of a matrix with entries from $\Bbb{R}^3 \cup \Bbb{R}$, so long as we ensure that no vector is ever multiplied by another vector or added to a scalar. The given determinant is evidently fine, but these two determinants are not:

$$\begin{vmatrix} i & j & 1 \\ 1 & 2 & 1 \\ -1 & 0 & 3 \end{vmatrix}, \quad \begin{vmatrix} i & 2 & 0 \\ 3 & j & -1 \\ 0 & 2 & k \end{vmatrix}.$$
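To see concretely where these break, expand each along its first row. For the first, $$i\begin{vmatrix} 2 & 1 \\ 0 & 3 \end{vmatrix} - j\begin{vmatrix} 1 & 1 \\ -1 & 3 \end{vmatrix} + 1 \cdot \begin{vmatrix} 1 & 2 \\ -1 & 0 \end{vmatrix} = 6i - 4j + 2,$$ which asks us to add the scalar $2$ to a vector. For the second, the very first cofactor is $$\begin{vmatrix} j & -1 \\ 2 & k \end{vmatrix} = j \cdot k + 2,$$ which requires both the undefined product $j \cdot k$ and the undefined sum of a vector and a scalar.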
