Abstract Algebra – Prove the Cross Product Doesn’t Satisfy Generalized Associativity

abstract-algebra, associativity, combinatorics, cross-product

It's well known that the cross product in $\mathbb{R}^3$ doesn't obey the associative law of

$$ A \times (B \times C) = (A \times B) \times C $$

We can define a "Generalized Associative Law" as an equation between two different parenthesizations of the same $N$ symbols in the same order, each side being a binary expression tree built from non-redundant parentheses. An example of such a generalized associativity in 4 symbols could be:

$$ A \times (B \times (C \times D)) = (A \times B) \times (C \times D) $$

After performing a computerized search I was surprised to find that there is NO generalized associative law for the cross product, at least up to 15 symbols. For the interested reader, the number of such parenthesized expressions on $N$ symbols grows according to the Catalan numbers.
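
For concreteness, here is a minimal sketch of what such a search might look like (illustrative only, not necessarily the code actually used; the helper names `parenthesizations`, `evaluate`, and `search` are made up for this sketch). It enumerates every parenthesization of $n$ symbols as a binary tree, evaluates each on random vectors, and reports any pair of trees that agree on all samples:

```python
import itertools
import numpy as np

def parenthesizations(lo, hi):
    """Yield every binary tree (parenthesization) over the symbols lo, ..., hi-1."""
    if hi - lo == 1:
        yield lo  # a leaf is just the symbol's index
    else:
        for mid in range(lo + 1, hi):
            for left in parenthesizations(lo, mid):
                for right in parenthesizations(mid, hi):
                    yield (left, right)

def evaluate(tree, vectors):
    """Evaluate a parenthesization as iterated cross products of the given vectors."""
    if isinstance(tree, int):
        return vectors[tree]
    left, right = tree
    return np.cross(evaluate(left, vectors), evaluate(right, vectors))

def search(n, trials=20, seed=0):
    """Report any pair of distinct parenthesizations of n symbols that agree on
    every random sample -- a candidate generalized associative law."""
    rng = np.random.default_rng(seed)
    trees = list(parenthesizations(0, n))            # there are Catalan(n-1) of these
    samples = [rng.standard_normal((n, 3)) for _ in range(trials)]
    for p, q in itertools.combinations(trees, 2):
        if all(np.allclose(evaluate(p, vs), evaluate(q, vs)) for vs in samples):
            print(f"n = {n}: candidate law {p} == {q}")
            return
    print(f"n = {n}: no generalized associative law found")

for n in range(3, 7):
    search(n)
```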

While I can try optimizing my code and increasing its performance to check more symbols, I naturally have to ask:

If the cross product doesn't have ANY generalized associative law, how would we even go about proving it?

Best Answer

Let $p(v_1, \dots, v_n)$ and $q(v_1, \dots, v_n)$ be two parenthesizations of the cross product of $n$ vectors. First note that if $p$ and $q$ have any "bottom-level" cross products $v_i \times v_{i+1}$ in common (e.g. your examples have $C \times D$ in common) then, since the cross product is surjective, the identity $p = q$ holds iff it holds with that bottom-level cross product replaced by a single vector (e.g. in your example we can replace $C \times D$ with just $C$). So we can assume WLOG that $p$ and $q$ have no bottom-level cross products in common.
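
The surjectivity being used here can be made concrete: for any $w \neq 0$ we may pick a unit vector $u$ orthogonal to $w$ and set $v = w \times u$, so that $u \times v = u \times (w \times u) = w(u \cdot u) - u(u \cdot w) = w$ (and $0 = u \times u$). A small numerical illustration of this (the helper `cross_preimage` is hypothetical, written just for this sketch):

```python
import numpy as np

def cross_preimage(w):
    """Return (u, v) with u x v = w, witnessing that the cross product is surjective."""
    w = np.asarray(w, dtype=float)
    if np.allclose(w, 0):
        u = np.array([1.0, 0.0, 0.0])
        return u, u                        # 0 = u x u
    a = np.array([1.0, 0.0, 0.0])          # any vector not parallel to w
    if np.allclose(np.cross(w, a), 0):
        a = np.array([0.0, 1.0, 0.0])
    u = np.cross(w, a)
    u /= np.linalg.norm(u)                 # unit vector orthogonal to w
    v = np.cross(w, u)                     # then u x (w x u) = w(u.u) - u(u.w) = w
    return u, v

w = np.array([2.0, -1.0, 3.0])
u, v = cross_preimage(w)
print(np.allclose(np.cross(u, v), w))      # True
```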

Now let $v_i \times v_{i+1}$ be a bottom-level cross product in $p$, which by our WLOG assumption is not a bottom-level cross product in $q$. This means that if we set $v_i = v_{i+1}$ then $p = 0$, since $v_i \times v_i = 0$ and any further cross product with $0$ is $0$. Now it suffices to show that we can choose the rest of the vectors $v_j$ such that $q \neq 0$.
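
The smallest case $n = 3$ already shows this pattern: with $p = v_1 \times (v_2 \times v_3)$ and $q = (v_1 \times v_2) \times v_3$, the bottom-level product $v_2 \times v_3$ of $p$ is not a bottom-level product of $q$, so setting $v_2 = v_3$ kills $p$ while a suitable choice of $v_1$ keeps $q$ nonzero. A quick check (illustrative sketch only):

```python
import numpy as np

e1, e2, e3 = np.eye(3)

# p = v1 x (v2 x v3) has v2 x v3 as a bottom-level product; set v2 = v3 to kill it
v1, v2, v3 = e1, e2, e2
p = np.cross(v1, np.cross(v2, v3))   # v1 x (v2 x v3) = v1 x 0 = 0
q = np.cross(np.cross(v1, v2), v3)   # (v1 x v2) x v3 = e3 x e2 = -e1

print(p)   # the zero vector
print(q)   # -e1, nonzero, so the two parenthesizations are not identically equal
```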

Here's a sketch of how to do this (this should really include a tree diagram, my apologies). By the surjectivity of the cross product, we can WLOG replace any bottom-level cross product $v_j \times v_{j+1}$ in $q$ which does not involve either $v_i$ or $v_{i+1}$ with a single vector. After we perform this replacement as many times as possible, $q$ must only have two bottom-level cross products, namely $v_{i-1} \times v_i$ and $v_{i+1} \times v_{i+2}$. If we denote the standard basis of $\mathbb{R}^3$ by $e_1, e_2, e_3$ then we can set $v_i = v_{i+1} = e_1, v_{i-1} = -e_2, v_{i+2} = -e_3$ so that

$$v_{i-1} \times v_i = e_3, v_{i+1} \times v_{i+2} = e_2.$$

Now the structure of $q$ is that it takes the cross product of each of these with some other vectors, namely $v_{i-2}$ etc. on the left and $v_{i+3}$ etc. on the right, before eventually taking the cross product of the two resulting vectors, and then taking some more cross products with other vectors on the left and right. This last set of cross products is irrelevant; it suffices to show that the cross product of the subexpression involving $v_i$ and the subexpression involving $v_{i+1}$ can be made nonzero.

By induction we can always choose the $v_j, j < i-1$ from the set $B = \{ \pm e_1, \pm e_2, \pm e_3 \}$ such that the subexpression involving $v_i$ lies in $B$, by repeatedly choosing each new vector so that the current cross product is nonzero; e.g. with the above choices giving $v_{i-1} \times v_i = e_3$ we could choose $v_{i-2} = e_2$ so that $v_{i-2} \times (v_{i-1} \times v_i) = e_1$. Moreover, at any step we always have two choices up to sign. The same is true of the $v_j, j > i+1$ and of the subexpression involving $v_{i+1}$. Hence for both subexpressions, at the point where we make the final choice we have two choices up to sign on each side, so we can always guarantee that the two subexpressions are different up to sign, hence orthogonal, so that their cross product is a nonzero element of $B$.
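
As a sanity check of this construction (a purely illustrative computation, with the answer's explicit choices plugged in for one inductive step):

```python
import numpy as np

e1, e2, e3 = np.eye(3)

# the answer's choices: v_i = v_{i+1} = e1, v_{i-1} = -e2, v_{i+2} = -e3
left  = np.cross(-e2, e1)        # v_{i-1} x v_i     = e3
right = np.cross(e1, -e3)        # v_{i+1} x v_{i+2} = e2
print(left, right)               # e3 and e2

# one inductive step on the left: choose v_{i-2} = e2, keeping the product in B
left = np.cross(e2, left)        # e2 x e3 = e1
print(left)                      # e1

# the two subexpressions are distinct up to sign, hence orthogonal,
# so their cross product is a nonzero element of B
print(np.cross(left, right))     # e1 x e2 = e3, nonzero
```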
