Outer Product Between Two Vectors of Arbitrary Dimensions

geometric-algebras, outer-product

I am currently reading the first chapter of Geometric Algebra for Physicists, and while I am quite familiar with abstract definitions of inner products and have quite a bit of abstract linear algebra under my belt, this is my first time coming across the notion of a generalized outer product. The computations in 2 and 3 dimensions are easy enough to understand; however, in $n$ dimensions I am somewhat confused about how to proceed. The book defines the outer product as:

$$(a\wedge b)_{ij}=a_{[i}b_{j]}$$

where $[\,]$ denotes antisymmetrisation; however, I am a little confused as to what this means. To me this looks like the entries of a matrix, where the diagonal entries are $0$ since $e_i\wedge e_i=0$ and the off-diagonal entries are $\pm a_ib_j$. Is this the correct way to think about the outer product? If not, where am I going wrong? Are bivectors just skew-symmetric matrices?

Best Answer

Although the outer product thus defined does look like a matrix (because of the notation), it is the expansion of the bivector $a ∧ b$ over the basis $\lbrace e_i ∧ e_j\rbrace \text{ for } 1 \le i \lt j \le n$, which has $\frac{n(n-1)}{2}$ components (while a matrix would have $n^2$).
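A small numeric sketch (my own illustration, not from the book) of that counting: the antisymmetrised array $M_{ij} = a_i b_j - a_j b_i$ is a skew-symmetric $n \times n$ matrix, but only its strict upper triangle is independent, and those $\frac{n(n-1)}{2}$ entries are exactly the bivector components.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([5.0, 6.0, 7.0, 8.0])

# M[i, j] = a_i b_j - a_j b_i : skew-symmetric, so M = -M.T
M = np.outer(a, b) - np.outer(b, a)

n = len(a)
upper = M[np.triu_indices(n, k=1)]  # strict upper triangle: i < j

print(M.shape)      # (4, 4): n^2 entries as a matrix
print(len(upper))   # 6 = n(n-1)/2 independent bivector components
```

The diagonal of $M$ is zero and the lower triangle just repeats the upper triangle with a sign flip, which is why the bivector carries no information beyond the $i < j$ entries.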

Perhaps part of the confusion stems from the use of the Einstein notation, which, although very useful, tends to obscure certain aspects. If we write the components of the bivector with the more standard Σ notation, we have

\begin{equation} a ∧ b = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i e_i ∧ b_j e_j = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i b_j \, (e_i ∧ e_j) \end{equation}

in which the $(e_i ∧ e_i)$ terms vanish, and since the terms $(e_j ∧ e_i)$ are the opposite of $(e_i ∧ e_j)$, we can combine them.
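That pairing-up of $(e_i ∧ e_j)$ and $(e_j ∧ e_i)$ terms can be sketched in plain Python (a minimal illustration of my own, using 0-based indices):

```python
from itertools import combinations

def wedge(a, b):
    """Components of a ∧ b over the basis {e_i ∧ e_j : i < j}.

    For each pair (i, j) with i < j, the (i, j) and (j, i) terms of the
    double sum combine into the single coefficient a_i b_j - a_j b_i.
    """
    return {(i, j): a[i] * b[j] - a[j] * b[i]
            for i, j in combinations(range(len(a)), 2)}

a = [1, 2, 3, 4]
b = [5, 6, 7, 8]
B = wedge(a, b)
print(len(B))       # 6 = n(n-1)/2 components in dimension n = 4
print(B[(0, 1)])    # a_1 b_2 - a_2 b_1 = 1*6 - 2*5 = -4
print(wedge(a, a))  # every component vanishes, since a ∧ a = 0
```

Note that the result is stored as a dictionary keyed by pairs $i < j$, not as an $n \times n$ matrix: only the combined coefficients survive.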

More concretely, the terms with $\{i, j\} = \{1, 2\}$ give for instance:

\begin{equation} a_1 b_2 (e_1 ∧ e_2) + a_2 b_1 (e_2 ∧ e_1) = (a_1 b_2 - a_2 b_1) (e_1 ∧ e_2) \end{equation}

so that $(a ∧ b)_{12} = a_1 b_2 - a_2 b_1$,

in which, I hope, the antisymmetrisation (denoted in the text as $a_{[i} b_{j]}$, which expands to $a_i b_j - a_j b_i$) appears more clearly.

By the way, this is true in any dimension, the only difference between dimensions 3 and $n$ being the number of components.
