Higher dimensional cross product equivalent

cross product · determinant · linear algebra · orthogonality

I'm working on a computer vision script in high dimensions that relies heavily on the 3D cross product, but as far as I know, a cross product is only formally defined in 3D and 7D. Experimentally, however, the specific property I need, the ability to take $n-1$ vectors in $n$ dimensions and produce a vector orthogonal to all of them, seems to be preserved. I'm unsure whether the issues with the cross product in higher dimensions are a matter of preserving its other properties (anticommutativity, distributivity over vector addition, etc.), or whether a problem eventually arises with orthogonality itself.

The method I've used so far is as follows:
Form a matrix from the $n-1$ given vectors, labeled $v_1,\dots,v_{n-1}$, insert a row of unknowns $u$ as the first row, and solve for $u$ (I know my notation is a little loose here):
$$\begin{bmatrix} u_1 & \dots & u_n\\ v_{1,1} & \dots & v_{1,n}\\ \vdots & & \vdots\\ v_{n-1,1} & \dots & v_{n-1,n}\end{bmatrix}$$

To solve for $u_i$, take $(-1)^{i+1} A_i$ and form a vector of these, where $A_i$ is the minor of the matrix with respect to row 1, column $i$ (the determinant of the submatrix obtained by deleting row 1 and column $i$). This should give a vector $u$ orthogonal to $v_1,\dots,v_{n-1}$, and it appears to do so experimentally in 4D, but I wouldn't even know where to begin with a general proof, assuming this is even true in all cases.
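For reference, a minimal NumPy sketch of this minor-expansion construction (the helper name `cross_nd` is mine, not a standard API):

```python
import numpy as np

def cross_nd(vectors):
    """Vector orthogonal to n-1 vectors in R^n, via the minor expansion above.

    vectors: array-like of shape (n-1, n), rows v_1, ..., v_{n-1}.
    Returns u with u_i = (-1)**(i+1) * A_i (1-based i), where A_i is the
    minor from deleting row 1 (the u row) and column i.
    """
    V = np.asarray(vectors, dtype=float)
    n = V.shape[1]
    if V.shape != (n - 1, n):
        raise ValueError("need exactly n-1 vectors of length n")
    u = np.empty(n)
    for i in range(n):                    # 0-based i here, so the sign is (-1)**i
        minor = np.delete(V, i, axis=1)   # delete column i (row 1 is already absent)
        u[i] = (-1) ** i * np.linalg.det(minor)
    return u

# 4D sanity check: u should be orthogonal to all three inputs.
vs = np.random.default_rng(0).standard_normal((3, 4))
print(vs @ cross_nd(vs))                  # ~ [0, 0, 0] up to rounding
```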

Writing out the dot product, I would need something like
$$\sum_{i=1}^n (-1)^{i+1} A_i \, v_{j,i} = 0 \quad \text{for all } v_j.$$
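(Written this way, the left-hand side is precisely the first-row Laplace expansion of the matrix above with $u$ replaced by $v_j$, so the identity reduces to a determinant with a repeated row:

$$\sum_{i=1}^n (-1)^{i+1} A_i \, v_{j,i} = \det\begin{bmatrix} v_{j,1} & \dots & v_{j,n}\\ v_{1,1} & \dots & v_{1,n}\\ \vdots & & \vdots\\ v_{n-1,1} & \dots & v_{n-1,n}\end{bmatrix} = 0,$$

since the row $v_j$ appears twice.)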

Does anyone know off the top of their head whether this has already been proven or disproven somewhere, or whether there is a more feasible way to get an $n$-dimensional vector orthogonal to $n-1$ others?

Best Answer

The cross product $a \times b$ of two vectors $a, b \in \mathbb{R}^3$ is designed to satisfy

$$\langle a\times b, x\rangle = \det(a,b,x),\qquad\forall x\in\mathbb{R}^3,$$

where $\langle \cdot, \cdot \rangle$ is the standard inner product. Of course, this relation can be used to prove all the relevant properties of $a \times b$. Likewise, the $n$-dimensional cross product can be defined as a function $\operatorname{Cross}(\cdots)$ of $n-1$ vectors in $\mathbb{R}^{n}$, given by

$$\langle \operatorname{Cross}(a_1,\cdots,a_{n-1}), x \rangle = \det(a_1,\cdots,a_{n-1},x),\qquad\forall x\in\mathbb{R}^n.$$
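Taking $x = e_i$ (the $i$-th standard basis vector) in this relation reads off the $i$-th component directly, so the whole vector costs $n$ determinants. A minimal NumPy sketch of that computation (the name `cross_via_det` is mine):

```python
import numpy as np

def cross_via_det(vectors):
    """Cross(a_1, ..., a_{n-1}) via <Cross(...), e_i> = det(a_1, ..., a_{n-1}, e_i)."""
    A = np.asarray(vectors, dtype=float)   # rows a_1, ..., a_{n-1}; shape (n-1, n)
    n = A.shape[1]
    # det is invariant under transposition, so stacking the vectors as rows
    # gives the same value as the column-wise determinant in the definition.
    return np.array([np.linalg.det(np.vstack([A, np.eye(n)[i]]))
                     for i in range(n)])

# n = 3 recovers the familiar cross product:
a, b = np.array([1., 0., 0.]), np.array([0., 1., 0.])
print(cross_via_det([a, b]))   # [0. 0. 1.]
print(np.cross(a, b))          # [0 0 1]
```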

This is almost exactly what you have constructed, except possibly for the sign: your formula places $u$ (playing the role of $x$) in the first row rather than the last, which changes the determinant by a factor of $(-1)^{n-1}$. Indeed, the following observations can be extracted from the properties of the determinant.

  • $a_1, \cdots, a_{n-1}$ are linearly independent if and only if $\operatorname{Cross}(a_1, \cdots, a_{n-1}) \neq 0$.

  • If $x \in \operatorname{span}(a_1, \cdots, a_{n-1})$, then $\langle \operatorname{Cross}(a_1, \cdots, a_{n-1}), x \rangle = 0$. In particular, $a_1, \cdots, a_{n-1}$ are orthogonal to $\operatorname{Cross}(a_1, \cdots, a_{n-1})$.

  • $\operatorname{Cross}(\cdots)$ is multi-linear, meaning that it is linear in each argument.
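All three are easy to check numerically; here is a quick 4D sanity test, reusing the `cross_via_det` sketch from above (repeated so the snippet runs on its own):

```python
import numpy as np

def cross_via_det(vectors):
    # as above: component i is det(a_1, ..., a_{n-1}, e_i)
    A = np.asarray(vectors, dtype=float)
    n = A.shape[1]
    return np.array([np.linalg.det(np.vstack([A, np.eye(n)[i]]))
                     for i in range(n)])

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 4))        # three random vectors in R^4

# Orthogonality: the result is perpendicular to every input.
print(a @ cross_via_det(a))            # ~ [0, 0, 0]

# Linear dependence: dependent inputs give the zero vector.
dep = np.vstack([a[:2], a[0] + a[1]])  # third row = first + second
print(cross_via_det(dep))              # ~ [0, 0, 0, 0]

# Multilinearity, checked in the first argument.
b = rng.standard_normal(4)
lhs = cross_via_det(np.vstack([2 * a[0] + b, a[1:]]))
rhs = 2 * cross_via_det(a) + cross_via_det(np.vstack([b, a[1:]]))
print(np.allclose(lhs, rhs))           # True
```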