Linear Algebra – Antisymmetric vs. Alternating k-Linear Forms and Wedge Product

determinant, exterior-algebra, linear-algebra, multilinear-algebra

In my linear algebra course we derived determinants via alternating $k$-linear forms $f: V^k \to K$. These alternating forms constitute a vector space $\Lambda^k(V^*)$. During this derivation we defined the wedge product
$$a \wedge b := \frac{(k + l)!}{k!\,l!}\text{Alt}(a \otimes b)$$
for $a \in \Lambda^k(V^*)$ and $b\in \Lambda^l(V^*)$. Here $\text{Alt}(f)$ is an operator that produces an alternating multilinear form from a (not necessarily alternating) multilinear form $f$.
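For concreteness, here is a minimal Python sketch of these definitions in the case $k = l = 1$ (the function names and the coordinate 1-forms are my own choices, not from the course notes):

```python
# Minimal sketch for k = l = 1: Alt averages over argument permutations
# with signs, and the wedge of two 1-forms carries the factor
# (1+1)!/(1! 1!) = 2.
def Alt2(f):
    """Alternate a bilinear form: Alt(f)(x, y) = (f(x,y) - f(y,x)) / 2."""
    return lambda x, y: (f(x, y) - f(y, x)) / 2

def wedge_1forms(a, b):
    """(a ∧ b)(x, y) = 2 * Alt(a ⊗ b)(x, y) = a(x)b(y) - a(y)b(x)."""
    tensor = lambda x, y: a(x) * b(y)
    return lambda x, y: 2 * Alt2(tensor)(x, y)

# Coordinate 1-forms on R^2, given as projections:
dx = lambda v: v[0]
dy = lambda v: v[1]
w = wedge_1forms(dx, dy)
print(w((1, 0), (0, 1)))   # (dx ∧ dy)(e1, e2) = 1.0
print(w((1, 2), (1, 2)))   # alternating: 0.0 on repeated arguments
```

Note that for two 1-forms the result is exactly the $2\times 2$ determinant formula, which is how the general construction recovers determinants.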

  1. My notes say that the wedge product is not necessarily alternating, and moreover that $(a \wedge b) = (-1)^{kl}(b \wedge a)$.
    Why is that correct? A scalar multiple of $\text{Alt}(\cdot)$ should be alternating by definition!

  2. In class we did not talk about exterior algebras at all; we only used the notions above to derive determinants. The Wikipedia article mentions anti-symmetric maps (here). What exactly are those? I only know about anti-symmetric matrices. What is the difference between anti-symmetric and alternating maps?

  3. Is it true that an alternating bilinear form $b(x,y) = -b(y,x)$ can always be represented by an anti-symmetric matrix? I proved the converse (an anti-symmetric matrix always induces an alternating bilinear form) but cannot find an angle to prove this direction.

Thanks!

Best Answer

(1) $\mathrm{Alt}$ produces a form that is alternating in its $k+l$ vector arguments from $V$, but the wedge product is not alternating as a function of the elements $a\in\Lambda^k(V^*)$, $b\in \Lambda^l(V^*)$. As semiclassical points out in the comments, for $a=u$ and $b=v\wedge w$ we have $a\wedge b = u\wedge v\wedge w = v\wedge w\wedge u = b\wedge a$, since moving $u$ past the two factors $v$ and $w$ flips the sign twice. This holds even though (actually because) $\wedge$ is alternating in $u,v,w\in V$.
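You can check the sign rule numerically by representing a $k$-form on $\mathbb{R}^n$ as an antisymmetric array with $k$ indices. This is a sketch under my own conventions (`alt`, `wedge`, and `perm_sign` are names I chose, not from the course):

```python
import itertools
import math
import numpy as np

def perm_sign(perm):
    """Sign of a permutation, computed by counting inversions."""
    sign = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                sign = -sign
    return sign

def alt(T):
    """Alt: signed average over all axis permutations of the tensor T."""
    n = T.ndim
    out = np.zeros_like(T, dtype=float)
    for perm in itertools.permutations(range(n)):
        out += perm_sign(perm) * np.transpose(T, perm)
    return out / math.factorial(n)

def wedge(a, b):
    """a ∧ b = ((k+l)!/(k! l!)) Alt(a ⊗ b) for a k-form a and l-form b."""
    k, l = a.ndim, b.ndim
    coeff = math.factorial(k + l) / (math.factorial(k) * math.factorial(l))
    return coeff * alt(np.tensordot(a, b, axes=0))

u, v, w = np.eye(3)                # three coordinate 1-forms on R^3
print(np.allclose(wedge(u, v), -wedge(v, u)))   # True: sign (-1)^{1*1} = -1
b2 = wedge(v, w)                   # a 2-form
print(np.allclose(wedge(u, b2), wedge(b2, u)))  # True: sign (-1)^{1*2} = +1
```

The last line is exactly semiclassical's example: the 1-form $u$ commutes with the 2-form $v\wedge w$.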

(2) Antisymmetric means "swapping two inputs reverses the sign of the output," whereas alternating means "if two inputs are the same then the output is zero." For multilinear forms, these properties are equivalent ... unless you're working over a field of scalars where $2$ is not invertible (that is, a field of characteristic two).

Let's work this out for two inputs. Suppose $f$ is alternating. So $f(x,x)=0$ for all $x$. In particular this means $f(x+y,x+y)=0$ for all $x$ and $y$. We can "FOIL" this out with the distributive property ($f$ is multilinear), cross out $f(x,x)$ and $f(y,y)$, then rearrange in order to obtain the identity $f(x,y)=-f(y,x)$. Thus, alternating implies antisymmetric.
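Written out, the expansion is (using bilinearity in each slot):

$$0 = f(x+y,\,x+y) = \underbrace{f(x,x)}_{=0} + f(x,y) + f(y,x) + \underbrace{f(y,y)}_{=0},$$

so $f(x,y) = -f(y,x)$.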

Does the converse hold? Suppose $f(x,y)=-f(y,x)$ for all $x,y$. Setting $x=y$ and rearranging terms yields the equation $2f(x,x)=0$. If the characteristic of our field is $\ne2$, then we can divide by $2$ to get $f(x,x)=0$. But if the characteristic is $2$, the equation says nothing: it is just $0=0$. Thus, antisymmetric implies alternating, but only as long as the characteristic is not two.
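Here is a quick Python illustration of the characteristic-two failure; the form $f(x,y)=xy$ on the one-dimensional space over $\mathbb{F}_2$ is my example, not from the answer above:

```python
# Over GF(2) = {0, 1} we have -1 == 1, so every bilinear form satisfies
# f(x, y) = -f(y, x) automatically. Take f(x, y) = x*y mod 2:
def f(x, y):
    return (x * y) % 2

# Antisymmetry holds for all inputs...
for x in (0, 1):
    for y in (0, 1):
        assert f(x, y) == (-f(y, x)) % 2

# ...but the alternating property fails on the diagonal:
print(f(1, 1))  # prints 1, not 0
```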

(3) Pick a basis for $V$, then compute $b_{ij}:=b(e_i,e_j)$ and form a matrix $B$ out of these numbers. If we write the elements of $V$ as column vectors (using our basis), then $b(x,y)$ is in fact $x^TBy$. Note that scalars, interpreted as $1\times 1$ matrices, are symmetric. Thus in the antisymmetry condition, which reads $x^TBy=-y^TBx$, we can transpose the right side to get $x^TBy=-x^TB^Ty$, which in turn may be rearranged to $x^T(B+B^T)y=0$ for all $x,y$. Letting $x,y$ range over basis vectors shows that every entry of $B+B^T$ is zero, so $B+B^T=0$, i.e. $B^T=-B$ and $B$ is an antisymmetric matrix.
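As a sanity check, here is a short numerical sketch of that argument; the specific form on $\mathbb{R}^3$ is made up for illustration:

```python
import numpy as np

# A hypothetical alternating bilinear form on R^3 (b(x, x) = 0 by construction):
def b(x, y):
    return x[0]*y[1] - x[1]*y[0] + 2*(x[1]*y[2] - x[2]*y[1])

# Build the Gram matrix B with entries B[i][j] = b(e_i, e_j):
e = np.eye(3)
B = np.array([[b(e[i], e[j]) for j in range(3)] for i in range(3)])

print(np.allclose(B, -B.T))               # True: B is antisymmetric
x, y = np.array([1., 2., 3.]), np.array([-1., 0., 4.])
print(np.isclose(b(x, y), x @ B @ y))     # True: b(x, y) = x^T B y
```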