[Math] Determinant bundle of a tensor product

algebraic-geometry, commutative-algebra, determinant, exterior-algebra, vector-bundles

Let $X$ be a ringed space (for example, a scheme or a manifold). If $V$ is a locally free $\mathcal{O}_X$-module of rank $n$, then $\mathrm{det}(V) := \Lambda^n V$ is a locally free $\mathcal{O}_X$-module of rank $1$, called the determinant of $V$. Actually $\mathrm{det}$ is a functor. Now I wonder how to give a slick proof of the well-known (?) fact that there is a natural isomorphism
$$\mathrm{det}(V \otimes W) \cong \mathrm{det}(V)^{\otimes m} \otimes \mathrm{det}(W)^{\otimes n},$$
where $V$ is locally free of rank $n$ and $W$ is locally free of rank $m$. For this I would like to construct a map globally, in a basis-free way, and then check locally that it is an isomorphism, hence an isomorphism. A typical local generator of $\mathrm{det}(V \otimes W) = \Lambda^{nm}(V \otimes W)$ is
$$(v_{11} \otimes w_{11}) \wedge \dotsc \wedge (v_{1m} \otimes w_{1m}) \wedge \dotsc \wedge (v_{n1} \otimes w_{n1}) \wedge \dotsc \wedge (v_{nm} \otimes w_{nm}).$$
To which element of $\Lambda^n(V)^{\otimes m} \otimes \Lambda^m(W)^{\otimes n}$ should we map this?
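For what it's worth, here is a sanity check in the special case $m = 1$ (a small case I worked out, not essential to the question): if $W$ is a line bundle, the standard isomorphism $\Lambda^n(V \otimes W) \cong \Lambda^n(V) \otimes W^{\otimes n}$ is given on local generators by
$$(v_1 \otimes w_1) \wedge \dotsc \wedge (v_n \otimes w_n) \longmapsto (v_1 \wedge \dotsc \wedge v_n) \otimes w_1 \otimes \dotsc \otimes w_n,$$
which agrees with the claimed target $\mathrm{det}(V)^{\otimes 1} \otimes \mathrm{det}(W)^{\otimes n}$.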

Note that SE/571839 is a very similar question, but I would like to have an abstract proof as indicated in the last paragraph of the accepted answer. (In fact, I want to prove a similar formula in an arbitrary cocomplete symmetric monoidal $\mathbb{Q}$-linear category, where $V$ is called locally free of rank $n$ if $\Lambda^n V$ is invertible and $\Lambda^{n+1} V = 0$. Here, no local bases are available.)

Best Answer

Incomplete thoughts:

Thinking of the factors $v_{ij} \otimes w_{ij}$ as arranged in an $n \times m$ grid, send this generator to the element

$\alpha = \bigotimes_{j=1}^m (v_{1j} \wedge \cdots \wedge v_{nj}) \otimes \bigotimes_{i=1}^n (w_{i1} \wedge \cdots \wedge w_{im})$.

(so, putting the $v$'s together in columns and the $w$'s in rows.)
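For concreteness, here is the case $n = m = 2$ (just an instance of the formula above): the generator
$$(v_{11} \otimes w_{11}) \wedge (v_{12} \otimes w_{12}) \wedge (v_{21} \otimes w_{21}) \wedge (v_{22} \otimes w_{22})$$
is sent to
$$(v_{11} \wedge v_{21}) \otimes (v_{12} \wedge v_{22}) \otimes (w_{11} \wedge w_{12}) \otimes (w_{21} \wedge w_{22}) \in \Lambda^2(V)^{\otimes 2} \otimes \Lambda^2(W)^{\otimes 2}.$$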

To see that this is well-defined, it suffices to show that it is zero if two adjacent terms in the long wedge product are equal.

For instance, if $v_{11} \otimes w_{11} = v_{12} \otimes w_{12}$, then either both are $0$, in which case we're done, or $v_{11} = \lambda v_{12}$ and $w_{11} = \frac{1}{\lambda} w_{12}$ for some scalar $\lambda \ne 0$. The second equality then forces $\alpha = 0$ via the $i=1$ factor of the $W$ part. This covers all cases where the two equal adjacent entries lie in the same row. (The analogous argument, using the $V$ part, covers equal entries in the same column, so maybe that is already sufficient.)
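Spelled out, under the same assumption $w_{11} = \frac{1}{\lambda} w_{12}$, the $i = 1$ factor of the $W$ part is
$$w_{11} \wedge w_{12} \wedge \dotsc \wedge w_{1m} = \tfrac{1}{\lambda}\, w_{12} \wedge w_{12} \wedge \dotsc \wedge w_{1m} = 0,$$
so the whole tensor $\alpha$ vanishes.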

But there are still the "line break" equalities to consider, and I'm not sure how to complete the argument, sorry. (This feels very reminiscent of Fulton's proof of Sylvester's Lemma in his Young Tableaux book, with a clever recursive argument for this last case.)

Edit: Here's a thought. Rather than treating the "line break" equalities directly, we'll go through the entries of the grid in a back-and-forth order. So, first we consider the equalities along the first row. Then, we consider the equality

$$v_{1m} \otimes w_{1m} = v_{2m} \otimes w_{2m},$$

comparing the "last entries in the first two rows". It's clear that $\alpha = 0$ in this case since it is a "column equality" (we use the $j=m$ factor -- the last one -- in the $V$ part). Then we work backwards along row 2, then forward along row 3, and so on. Thus at every step, we are either using a "row equality" or a "column equality" to conclude that $\alpha = 0$, so in the end we conclude that the expression is alternating in all $nm$ wedge factors.
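To illustrate the order in a small case (an example of my own, for $n = m = 3$): the entries of the grid are visited in the "snake" order
$$\begin{pmatrix} 1 & 2 & 3 \\ 6 & 5 & 4 \\ 7 & 8 & 9 \end{pmatrix},$$
so any two consecutive entries lie either in the same row or in the same column, and each adjacent-equality case is settled by the corresponding "row equality" or "column equality" argument.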
