Natural Pairings Between Exterior Powers of a Vector Space and Its Dual

exterior-algebra, linear-algebra

Let $V$ be a finite-dimensional vector space over a field $k$, $v_1, \dotsc, v_n \in V$ a set of vectors, and $f_1, \dotsc, f_n \in V^{\ast}$ a set of covectors. Up to permutation, there seem to be at least two "natural" choices of pairing

$${\bigwedge}^n(V) \otimes {\bigwedge}^n(V^{\ast}) \to k.$$

One is given by extending

$$(v_1 \wedge \dotsb \wedge v_n) \otimes (f_1 \wedge \dotsb \wedge f_n) \mapsto \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n f_i(v_{\sigma(i)}).$$

It can be found, for example, in the notes "Tensor algebras, tensor pairings, and duality" by Brian Conrad, and it has the desirable property that if $e_1, \dotsc, e_n$ are part of a basis of $V$ and $e_1^{\ast}, \dotsc, e_n^{\ast}$ are the corresponding elements of the dual basis, then $e_1 \wedge \dotsb \wedge e_n$ is dual to $e_1^{\ast} \wedge \dotsb \wedge e_n^{\ast}$.
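
For instance (just a sanity check, not something taken from Conrad's notes), for $n = 2$ the sum is the familiar $2 \times 2$ determinant of values:

$$(v_1 \wedge v_2) \otimes (f_1 \wedge f_2) \mapsto f_1(v_1) f_2(v_2) - f_1(v_2) f_2(v_1) = \det\begin{pmatrix} f_1(v_1) & f_1(v_2) \\ f_2(v_1) & f_2(v_2) \end{pmatrix},$$

and in general the first pairing is $\det\bigl(f_i(v_j)\bigr)_{i,j}$, which is why $e_1 \wedge \dotsb \wedge e_n$ and $e_1^{\ast} \wedge \dotsb \wedge e_n^{\ast}$ pair to $1$.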

The other is given by extending

$$(v_1 \wedge \dotsb \wedge v_n) \otimes (f_1 \wedge \dotsb \wedge f_n) \mapsto \frac{1}{n!} \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n f_i(v_{\sigma(i)}).$$

It has the disadvantage of not being defined in positive characteristic at most $n$, but of the two, this is the one I know how to define functorially. One functorial construction is detailed in my answer to Signs in the natural map $\Lambda^k V \otimes \Lambda^k V^* \to \Bbbk$; another starts with the induced pairing $V^{\otimes n} \otimes (V^{\ast})^{\otimes n} \to k$, restricts it to antisymmetric tensors $\operatorname{Alt}^n(V) \otimes \operatorname{Alt}^n(V^{\ast}) \to k$, and then inverts the canonical composite $\operatorname{Alt}^n(V) \hookrightarrow V^{\otimes n} \twoheadrightarrow \Lambda^n(V)$ as well as the corresponding map for the dual space (these composites being isomorphisms only in characteristic $0$ or greater than $n$).
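
For comparison (again just a check for $n = 2$, with the dual bases above), the second pairing gives

$$(e_1 \wedge e_2) \otimes (e_1^{\ast} \wedge e_2^{\ast}) \mapsto \frac{1}{2!}\bigl(e_1^{\ast}(e_1)\,e_2^{\ast}(e_2) - e_1^{\ast}(e_2)\,e_2^{\ast}(e_1)\bigr) = \frac{1}{2},$$

so the two pairings differ exactly by a factor of $n!$, which is why the second one needs $n!$ to be invertible in $k$.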

In another math.SE answer, Aaron indicates that the first pairing is a restriction of a more general functorial action of $\Lambda(V^{\ast})$ on $\Lambda(V)$. I initially thought this construction worked almost entirely on the basis of certain universal properties, but Aaron ends up having to check that certain conditions are met by hand; either way, this construction isn't ideal as it doesn't seem to treat $V$ and $V^{\ast}$ symmetrically. So:

(How) can I functorially construct the first pairing in a way that treats $V$ and $V^{\ast}$ symmetrically? Is it the case that the only functorial constructions of the first pairing treat $V$ and $V^{\ast}$ asymmetrically?

(By "functorially construct" I mean, at a minimum, that you don't ever have to specify what happens to pure tensors and extend.)

I put "how" in parentheses because I'm not convinced that this is possible. One way to do it is to fix a nice embedding $\Lambda^n(V) \to \operatorname{Alt}^n(V)$. Let's suppose that we send $v_1 \wedge \dotsb \wedge v_n$ to

$$c_n \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) v_{\sigma(1)} \otimes \dotsb \otimes v_{\sigma(n)}$$

where $c_n$ depends only on $n$ (in particular, it shouldn't depend on $V$). Then, if I'm not mistaken, the induced pairing $\Lambda^n(V) \otimes \Lambda^n(V^{\ast}) \to k$ is given by extending

$$(v_1 \wedge \dotsb \wedge v_n) \otimes (f_1 \wedge \dotsb \wedge f_n) \mapsto c_n^2 n! \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n f_i(v_{\sigma(i)})$$

so the only way to recover the first pairing is if $c_n = \frac{1}{\sqrt{n!}}$, which is terrible since it depends on this particular element existing in $k$. The above computation reminds me of an issue with the normalization of the discrete Fourier transform on $\mathbb{Z}/n\mathbb{Z}$, the problem being that there are at least two natural measures (counting measure and the unique invariant probability measure) which are Fourier dual under the discrete Fourier transform, and the only one which is Fourier self-dual assigns each point the measure $\frac{1}{\sqrt{n}}$.
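
To spell out the bookkeeping behind the factor $c_n^2 n!$ (my own computation, so modulo errors): applying the induced pairing on $V^{\otimes n} \otimes (V^{\ast})^{\otimes n}$ to the two antisymmetrized tensors gives

$$c_n^2 \sum_{\sigma, \tau \in S_n} \operatorname{sgn}(\sigma) \operatorname{sgn}(\tau) \prod_{i=1}^n f_{\tau(i)}(v_{\sigma(i)}),$$

and substituting $\rho = \sigma \tau^{-1}$ (so that $\operatorname{sgn}(\sigma)\operatorname{sgn}(\tau) = \operatorname{sgn}(\rho)$ and $\prod_i f_{\tau(i)}(v_{\sigma(i)}) = \prod_j f_j(v_{\rho(j)})$) makes the inner sum independent of $\tau$; summing over the $n!$ choices of $\tau$ produces the extra factorial.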

The discussion at Is there a preferable convention for defining the wedge product? seems related, although it doesn't seem to immediately answer my question. In the geometric context, I am asking what the natural pairing is between differential forms and polyvector fields. People seem to agree that there is one, but I don't know where to find an authoritative opinion on what it is. In geometric language, I guess I want to think of polyvector fields as "infinitesimal cubes" on a manifold, so the correct pairing should be given by "integration."

Can this be made precise, and does it give the first pairing or the second one?

Best Answer

I would like to give some details in order to make clear that one can give a proof with hardly any computations at all. (I have never looked at the Bourbaki presentation, but I guess they make the same point; though, since they want all proofs to depend only on previous material, they might make a few more computations.)

To begin with, we are considering supercommutative algebras (over a commutative base ring $R$) which are strictly commutative, i.e., $\mathbb Z/2$-graded algebras with $xy=(-1)^{mn}yx$, where $m$ and $n$ are the degrees of $x$ and $y$, and with $x^2=0$ whenever $x$ has odd degree. Note that a base extension of such an algebra is of the same type (the most computational part of this verification concerns strictness, which uses that on odd elements $x \mapsto x^2$ is a quadratic form with trivial associated bilinear form). The exterior algebra $\Lambda^\ast U$ is then the free strictly supercommutative algebra on the module $U$, placed in odd degree. Furthermore, the (graded) tensor product of two sca's is an sca, and from that it follows formally that $\Lambda^\ast(U\oplus V)=\Lambda^\ast U\otimes\Lambda^\ast V$.
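
To be explicit about why the direct sum formula is formal (this is just the usual adjunction juggling, with $A_{\mathrm{odd}}$ denoting the odd part of an sca $A$, and using that the graded tensor product is the coproduct of sca's):

$$\operatorname{Hom}_{\mathrm{sca}}(\Lambda^{\ast}(U \oplus V), A) \cong \operatorname{Hom}_R(U \oplus V, A_{\mathrm{odd}}) \cong \operatorname{Hom}_R(U, A_{\mathrm{odd}}) \times \operatorname{Hom}_R(V, A_{\mathrm{odd}}) \cong \operatorname{Hom}_{\mathrm{sca}}(\Lambda^{\ast} U \otimes \Lambda^{\ast} V, A),$$

naturally in $A$, so the two algebras agree by Yoneda.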

Now, the diagonal map $U\to U\oplus U$ induces a coproduct on $\Lambda^\ast U$ which, by functoriality, is cocommutative, making the exterior algebra into a superbialgebra. I also want to remark that if $U$ is f.g. projective then so is $\Lambda^\ast U$. Indeed, by presenting $U$ as a direct factor in a free f.g. module one reduces to the case when $U$ is free, and then, by the fact that direct sums are taken to tensor products, to the case when $U=R$; but in that case it is clear that $\Lambda^\ast U=R\oplus R\delta$, where $\delta$ is an odd element of square $0$.
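
Concretely, on generators the coproduct is forced by multiplicativity and the Koszul sign rule in the graded tensor product: for $u, v \in U$,

$$\Delta(u) = u \otimes 1 + 1 \otimes u, \qquad \Delta(u \wedge v) = \Delta(u)\Delta(v) = u \wedge v \otimes 1 + u \otimes v - v \otimes u + 1 \otimes u \wedge v.$$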

If now $U$ is still f.g. projective, then $(\Lambda^\ast U^\ast)^\ast$ is also a supercommutative and supercocommutative superbialgebra. We need to know that it is strictly supercommutative. This can be done by a calculation, but we want to get by with as much handwaving as possible. We can again reduce to the case when $U$ is free. After that we can reduce to the case when $U=R$ (using that the tensor product of sca's is an sca), and there it is clear (and one can even in that case avoid computations). Another way, once we are in the free case, is to use the base change property to reduce to the case when $R=\mathbb Z$; then the exterior algebra of a free $\mathbb Z$-module is torsion free, and a torsion-free supercommutative algebra is an sca.
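
For instance, in the case $U = R$ the check is a one-liner: with $\Lambda^{\ast} R^{\ast} = R \oplus R\delta$ (the same shape as before, since $R^{\ast} \cong R$) and $\delta^{\ast}$ the dual basis element, the multiplication on the dual, being dual to the coproduct, gives

$$(\delta^{\ast} \cdot \delta^{\ast})(\delta) = (\delta^{\ast} \otimes \delta^{\ast})(\delta \otimes 1 + 1 \otimes \delta) = 0,$$

and likewise on $1$, so $(\Lambda^{\ast} R^{\ast})^{\ast} = R \oplus R\delta^{\ast}$ with $(\delta^{\ast})^2 = 0$ is indeed strictly supercommutative.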

We also need to know that if $U$ is f.g. projective, then the structure map $U\to\Lambda^\ast U$ has a canonical splitting. This is most easily seen by introducing a $\mathbb Z$-grading extending the $\mathbb Z/2$-grading and showing that its degree $1$ part is exactly $U$ (maybe it would have been better to work from the start with $\mathbb Z$-gradings).
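
In terms of the usual $\mathbb Z$-grading this is just the statement that

$$\Lambda^{\ast} U = \bigoplus_{d \ge 0} \Lambda^{d} U, \qquad \Lambda^{1} U = U,$$

so the canonical splitting is the projection onto the degree $1$ summand.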

This splitting gives us a map $U\to(\Lambda^\ast U^\ast)^\ast$ into the odd part, and as the target is an sca we get a map $\Lambda^\ast U\to(\Lambda^\ast U^\ast)^\ast$ of sca's. This map is an isomorphism. Indeed, we first reduce to the case when $U$ is free (this is a little bit tricky if one wants to present $U$ as a direct factor of a free module, but one may also use base change compatibility to reduce to the case when $R$ is local, in which case $U$ is automatically free). Then, as before, one reduces to $U=R$, where it is clear, as both sides are concentrated in degrees $0$ and $1$ and the map is an iso in degree $0$ by inspection and in degree $1$ by construction.
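
To record what the final case amounts to (at least as I read the reduction): for $U = R$,

$$\Lambda^{\ast} R = R \oplus R\delta, \qquad (\Lambda^{\ast} R^{\ast})^{\ast} = R \oplus R\delta^{\ast},$$

and the map sends $1$ to the counit and the degree $1$ generator to the dual basis element $\delta^{\ast}$, hence is an isomorphism in both degrees.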

Note that everything works exactly the same if one works with ordinary commutativity (so that the exterior algebra is replaced by the symmetric one) up till the verification in the case when $U=R$. In that case the dual algebra is the divided power algebra and we only have an isomorphism in characteristic $0$.
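
To see the divided powers appear concretely for $U = R$ (a standard computation, included here for contrast): $\operatorname{Sym}^{\ast} R = R[x]$ with $\Delta(x^p) = \sum_k \binom{p}{k} x^k \otimes x^{p-k}$, so on the dual basis elements $\gamma_n := (x^n)^{\ast}$ the multiplication dual to this coproduct is

$$\gamma_m \gamma_n = \binom{m+n}{m} \gamma_{m+n}, \qquad \text{in particular} \quad \gamma_1^{\,n} = n!\, \gamma_n,$$

so the natural map $\operatorname{Sym}^{\ast} R \to (\operatorname{Sym}^{\ast} R^{\ast})^{\ast}$ sends $x^n \mapsto n!\,\gamma_n$ and is an isomorphism only when the factorials are invertible.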

Afterthought: The fact that things work for the exterior algebra but introduce divided powers in the symmetric case has an intriguing twist. One can define divided power superalgebras where all higher powers of odd elements are defined to be zero. With that definition the exterior algebra has a canonical divided power structure (for a projective module) which is characterised by being natural and commuting with base change (or by being compatible with tensor products; just as for ordinary divided powers, there is a natural divided power structure on the tensor product of two divided power algebras). Hence, somehow the fact that the exterior algebra is self-dual is connected with the fact that the exterior algebra is also the free divided power superalgebra on odd elements. However, I do not know of any a priori reason why the dual algebra, in either the symmetric or the exterior case, should have a divided power structure.
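
To make the divided powers concrete in the simplest case (for a degree $2$ element; this is the usual formula, not anything beyond what is asserted above): if $\omega = \sum_{i=1}^k x_i \wedge y_i$ with $x_i, y_i \in U$, then the degree $2$ terms commute and square to zero, so

$$\omega^{n} = n! \sum_{\substack{S \subseteq \{1, \dotsc, k\} \\ |S| = n}} \prod_{i \in S} x_i \wedge y_i, \qquad \gamma_n(\omega) := \sum_{|S| = n} \prod_{i \in S} x_i \wedge y_i,$$

and the right-hand expression makes sense with no division by $n!$ at all.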

As a curious aside, the divided power structure on the exterior algebra is related to the Riemann–Roch formula for abelian varieties: if $L$ is a line bundle on an abelian variety of dimension $g$, the RR formula says that $$ \chi(L) = \frac{L^g}{g!} $$ and this also gives (as usual for Atiyah–Singer index formulas) the integrality statement that $L^g$ is divisible by $g!$. However, that follows directly from the fact that the cohomology of an abelian variety is the exterior algebra on $H^1$ and the fact that the exterior algebra has a divided power structure.
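
Restating that last step in symbols (just to spell out the divisibility): in $H^{\ast}(A,\mathbb Z) = \Lambda^{\ast} H^{1}(A,\mathbb Z)$ the class $c_1(L)$ is even, hence admits divided powers, and

$$c_1(L)^{g} = g!\,\gamma_{g}\bigl(c_1(L)\bigr) \in H^{2g}(A,\mathbb Z) \cong \mathbb Z,$$

so $L^{g}$ is divisible by $g!$, as the formula $\chi(L) = L^{g}/g!$ requires.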
