Exterior Algebra – Quotient vs. Vector Subspace of Tensor Algebra

abstract-algebra, exterior-algebra, graded-algebras, linear-algebra, vector-spaces

I am writing an expository paper, and in it I defined $\Lambda^k(V)$ as the subspace of alternating tensors of order $k$, i.e. as a vector subspace of $V^{\otimes k}$, where $V$ is a vector space over $\mathbb{K}=\mathbb{R}$ or $\mathbb{C}$. I then defined the exterior algebra with the wedge product as the graded algebra:

$$\Lambda(V)=\bigoplus_{k=0}^n\Lambda^k(V)$$

I am now developing the exterior algebra of a vector space as a quotient of the tensor algebra:
$$T(V)=\bigoplus_{n=0}^\infty V^{\otimes n}$$
with the ideal $I\subset T(V)$ generated by:
$$\{v\otimes v|v\in V\}$$
My question is: what is sufficient to show that these are the same algebras? I have already proved that the quotient of a graded algebra by an ideal generated by homogeneous elements is graded, and that $T(V)/I$ satisfies:

$$T(V)/I=\bigoplus T(V)_k/(I\cap T(V)_k)$$
And furthermore that:

$$T(V)_0/(I\cap T(V)_0)=\mathbb{K}\qquad \text{and}\qquad T(V)_1/(I\cap T(V)_1)=V$$

I have also shown that for $k>n$ (where $n=\dim V$) we have $T(V)_k/(I\cap T(V)_k)=\{0\}$, and that multiplication in $T(V)/I$ satisfies:
$$
\begin{align}
[v\otimes v]=&0\\
[v\otimes w]=&-[w\otimes v]
\end{align}
$$

for all $v,w \in V$.
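(For completeness, the second relation follows from the first by the usual polarization trick of expanding a square:

$$0=[(v+w)\otimes(v+w)]=[v\otimes v]+[v\otimes w]+[w\otimes v]+[w\otimes w]=[v\otimes w]+[w\otimes v],$$

so $[v\otimes w]=-[w\otimes v]$.)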

I am, however, really struggling to show that $T(V)_k/(I\cap T(V)_k)=\Lambda^k(V)$. My thought process
was to write the surjective linear map:

$$\phi:T(V)_k\longrightarrow \Lambda^k(V)$$

which on simple tensors is given by:

$$v_1\otimes\cdots\otimes v_k\longmapsto \sum_{\sigma\in S^k}\text{sgn}(\sigma)v_{\sigma(1)}\otimes \cdots \otimes v_{\sigma(k)}$$
and then show that $I\cap T(V)_k=\ker \phi$, so that I could apply a decomposition theorem I had proven earlier. However, showing that $\ker\phi\subset I\cap T(V)_k$ has proven near impossible. I know it should be true, but in the case where $a\in\ker\phi$ is not a simple tensor I can't seem to figure out the argument.
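For what it is worth, the claim $\ker\phi = I\cap T(V)_k$ can at least be sanity-checked numerically in the smallest interesting case, $k=2$ and $V=\mathbb{R}^3$: the degree-$2$ part of $I$ is the span of the tensors $v\otimes v$, and we can compare its dimension with $\dim\ker\phi$. This is a numpy sketch of my own (the matrix `P` below is just $\phi$ written out on vectorized $2$-tensors), not a proof:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# phi on 2-tensors is T -> T - T^T; write it as a 9x9 matrix P acting on vec(T)
P = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        P[i * n + j, i * n + j] += 1.0
        P[i * n + j, j * n + i] -= 1.0

ker_dim = n * n - np.linalg.matrix_rank(P)       # dim ker(phi) on T(V)_2

# The degree-2 part of I is spanned by the tensors v (x) v
S = np.array([np.outer(v, v).ravel() for v in rng.standard_normal((20, n))])
span_dim = np.linalg.matrix_rank(S)

assert np.allclose(P @ S.T, 0.0)                 # span{v (x) v} lies in ker(phi)
assert ker_dim == span_dim == n * (n + 1) // 2   # both equal 6, so they agree
```

Here $\ker\phi$ consists of the symmetric $2$-tensors, and polarization shows these are exactly the span of the $v\otimes v$, which is what the rank computation confirms.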

So my question is this: how do we see that these two constructions are equivalent? Is it necessary to go the route I am going, or am I making life harder for myself? Is it enough to just show that the product in $T(V)/I$ satisfies the same properties as the wedge product in $\Lambda(V)$?

Edit:
So basically this is how I was doing things earlier; it was in order to motivate differential forms, so everything was done with a dual basis. Let $V=\mathbb{R}^n$, let $\{e_i\}$ be the standard basis, and $\{e^i\}$ the dual basis. I defined a simple $k$-form as:

$$e^1\wedge \cdots \wedge e^k=\sum_{\sigma\in S^k}\text{sgn}(\sigma)e^{\sigma(1)}\otimes
\cdots\otimes e^{\sigma(k)} $$

And then I said that the wedge product of a $k$-form $\omega$ and an $l$-form $\eta$ was given by its action on vectors $v_1,\dots, v_{k+l}$:
$$(\omega\wedge\eta)(v_1,\dots,v_{k+l})=\frac{1}{k!l!}\sum_{\sigma\in S^{k+l}}\text{sgn}(\sigma)\,\omega(v_{\sigma(1)},\dots, v_{\sigma(k)})\cdot \eta(v_{\sigma(k+1)},\dots,v_{\sigma(k+l)})$$

I liked these for two reasons, first off, if I let $V=\mathbb{R}^3$, $\omega=e^1\wedge e^2$, then with $v_1=a^ie_i$, $v_2=b^ie_i$, and
$v_3=c^ie_i$:
$$
\begin{align}
(\omega\wedge e^3)(v_1,v_2,v_3)=&\frac{1}{2}\sum_{\sigma\in S^3}\text{sgn}(\sigma)\omega(v_{\sigma(1)},v_{\sigma(2)})e^3(v_{\sigma(3)})\\
=&\frac{1}{2}\left(\omega(v_1,v_2)e^3(v_3)+\omega(v_2,v_3)e^3(v_1)+\omega(v_3,v_1)e^3(v_2)\right.\\
&-\left.\omega(v_2,v_1)e^3(v_3)-\omega(v_1,v_3)e^3(v_2)-\omega(v_3,v_2)e^3(v_1) \right)\\
=&\omega(v_1,v_2)e^3(v_3)+\omega(v_2,v_3)e^3(v_1)+\omega(v_3,v_1)e^3(v_2)\\
=&(a^1b^2-a^2b^1)c^3+(b^1c^2-b^2c^1)a^3+(c^1a^2-c^2a^1)b^3\\
=&\det(v_1,v_2,v_3)
\end{align}
$$

while:
$$
\begin{align}
e^1\wedge e^2\wedge e^3(v_1,v_2,v_3)=&\sum_{\sigma\in S^3}\text{sgn}(\sigma)e^{\sigma(1)}\otimes e^{\sigma(2)}\otimes e^{\sigma(3)}(v_1,v_2,v_3)\\
=&a^1b^2c^3+b^1c^2a^3+c^1a^2b^3-b^1a^2c^3-a^1c^2b^3-c^1b^2a^3\\
=&\det(v_1,v_2,v_3)
\end{align}
$$
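Both computations are easy to check numerically. The following numpy sketch (my own; the helpers `sign` and `ev` are ad hoc) evaluates $(\omega\wedge e^3)$ via the $1/(k!\,l!)$ formula and $e^1\wedge e^2\wedge e^3$ via the full alternating sum, and compares both against the determinant:

```python
import itertools
from math import factorial
import numpy as np

def sign(p):
    # sign of a permutation via its inversion count
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def ev(T, vs):
    # evaluate a k-tensor: T(v_1,...,v_k) = sum T[i_1..i_k] v_1[i_1]...v_k[i_k]
    for v in vs:
        T = np.tensordot(v, T, axes=1)
    return float(T)

rng = np.random.default_rng(1)
e = np.eye(3)                      # e[i] stands for the dual basis covector e^{i+1}
v = rng.standard_normal((3, 3))    # v[0], v[1], v[2] are v_1, v_2, v_3

# omega = e^1 /\ e^2 = e^1 (x) e^2 - e^2 (x) e^1
omega = np.outer(e[0], e[1]) - np.outer(e[1], e[0])

# (omega /\ e^3)(v_1, v_2, v_3) via the 1/(k! l!) formula with k=2, l=1
lhs = sum(sign(s) * ev(omega, [v[s[0]], v[s[1]]]) * e[2] @ v[s[2]]
          for s in itertools.permutations(range(3))) / (factorial(2) * factorial(1))

# e^1 /\ e^2 /\ e^3 as the full alternating sum of tensor products
E = sum(sign(s) * np.einsum('i,j,k->ijk', e[s[0]], e[s[1]], e[s[2]])
        for s in itertools.permutations(range(3)))
rhs = ev(E, [v[0], v[1], v[2]])

det = np.linalg.det(v.T)           # columns of the matrix are v_1, v_2, v_3
assert np.isclose(lhs, det) and np.isclose(rhs, det)
```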

The fact that these two line up felt important to me, since $\omega\wedge e^3=e^1\wedge e^2\wedge e^3$; I am fairly sure that if I add a $1/3!$ then these two things will be different, something I did not want. Secondly, when defining an inner product on $T^{0,k}$ tensors, I wanted the restriction to the subspace $\Lambda^k(V)$ to carry a factor of $k!$, so that I could rescale the inner product by $1/k!$ and obtain the standard formula:
$$\langle \omega,\eta\rangle=\frac{1}{k!}\sum_{i_1\cdots i_k}\omega_{i_1 \cdots i_k}\eta^{i_1\cdots i_k}$$

Edit #2:

I see that what I am doing is primarily used in differential geometry, as my wedge product, written in terms of $\text{Alt}$, carries a
$(k+l)!/(k!\,l!)$ factor. Does this factor make the two algebras non-isomorphic to one another? Is there a way around this? And why does:
$$e^1\wedge \cdots\wedge e^k=\sum_{\sigma\in S^k}\text{sgn}(\sigma)e^{\sigma(1)}\otimes \cdots\otimes e^{\sigma(k)}$$
match what I am doing so well? In Lee's Smooth Manifolds, he has that
$e^1\wedge \cdots \wedge e^k$ applied to $k$ vectors is the determinant of the $k\times k$ submatrix consisting of the first $k$ components of those vectors, and the definition above matches that perfectly, but I don’t quite see how to reconcile this with the wedge product as defined above.
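Lee's description generalizes to arbitrary index sets, and that too is easy to check numerically. A small sketch of my own (again with ad hoc helpers `sign` and `ev`), testing $e^1\wedge e^3$ on $\mathbb{R}^4$ against the determinant of the corresponding $2\times 2$ submatrix:

```python
import itertools
import numpy as np

def sign(p):
    # sign of a permutation via its inversion count
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def ev(T, vs):
    # evaluate a k-tensor on k vectors by successive contraction
    for v in vs:
        T = np.tensordot(v, T, axes=1)
    return float(T)

rng = np.random.default_rng(2)
n, idx = 4, (0, 2)                    # e^1 /\ e^3 on R^4 (0-based row indices)
e = np.eye(n)
v1, v2 = rng.standard_normal((2, n))

# e^{i_1} /\ e^{i_2} as the alternating sum of tensor products
T = sum(sign(s) * np.outer(e[idx[s[0]]], e[idx[s[1]]])
        for s in itertools.permutations(range(2)))

# Lee's description: det of the submatrix picking out components i_1, i_2
M = np.column_stack([v1, v2])         # columns are the two vectors
assert np.isclose(ev(T, [v1, v2]), np.linalg.det(M[list(idx), :]))
```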

I am using this construction as a way to motivate the Clifford algebras as a deformation of the wedge product, so it would be quite sad for me if these two constructions were actually incompatible.

Edit 3:

I’m sorry that this post has gone all over the place; perhaps I will split it into separate questions, if people think that is wise.

However, I do believe I know how to reconcile what I wrote for the simple $k$-form. Let $\{e^i\}$ be the dual basis for $V$; we want to show that:
$$
e^{i_1}\wedge \cdots \wedge e^{i_k}=\sum_{\sigma\in S^k}\text{sgn}(\sigma)e^{\sigma(i_1)}\otimes \cdots \otimes e^{\sigma(i_k)}
$$

We proceed by induction: the case $k=1$ is trivial, so we assume the result for $k$ and apply the definition of the wedge product:
$$
\begin{align}
(e^{i_1}\wedge \cdots \wedge e^{i_k})\wedge e^{i_{k+1}}(v_1,\cdots
,v_{k+1})=&\frac{1}{k!}\sum_{\sigma\in S^{k+1}}\text{sgn}(\sigma)e^{i_1}\wedge \cdots \wedge e^{i_k}(v_{\sigma(1)},\cdots, v_{\sigma(k)})\cdot e^{i_{k+1}}(v_{\sigma(k+1)})\\
=&\frac{1}{k!}\sum_{\sigma\in S^{k+1}}\sum_{\tau\in S^k}\text{sgn}
(\sigma)\text{sgn}(\tau)
e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})
\end{align}
$$

For each $\sigma$ there are $k!$ permutations $\sigma'$ satisfying $\sigma(k+1)=\sigma'(k+1)$, including $\sigma$ itself. We can then split
$S^{k+1}$ into $k+1$ sets, each consisting of the permutations which satisfy the aforementioned property. Denote each set by $A^l$; then our sum can be written as:
$$
\begin{align}
&\frac{1}{k!}\sum_{l=1}^{k+1}\sum_{\sigma \in A^l}\sum_{\tau \in S^k}\text{sgn}
(\sigma)\text{sgn}(\tau)
e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})
\end{align}
$$

Fix an $l$, and let $j$ denote the common value of $\sigma(k+1)$ for $\sigma\in A^l$; then:
$$
\begin{align}
\sum_{\sigma \in A^l}&\sum_{\tau \in S^k}\text{sgn}
(\sigma)\text{sgn}(\tau)
e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})\\
=&e^{i_{k+1}}(v_j)
\sum_{\sigma \in A^l}\sum_{\tau \in S^k}\text{sgn}
(\sigma)\text{sgn}(\tau)
e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})\\
\end{align}
$$

It doesn't matter computationally whether we permute the covectors or the vectors, as summing over either gives us every combination of
$e^{i_j}(v_l)$, where $1\leq j,l\leq k$. Hence, fixing $\tau$ and $\tau'$ in $S^{k}$, we claim that:
$$
\begin{align}
\sum_{\sigma \in A^l}\text{sgn}(\sigma)&\text{sgn}(\tau)
e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})\\
=&\sum_{\sigma'\in A^l}\text{sgn}(\sigma')\text{sgn}(\tau')
e^{\tau'(i_1)}(v_{\sigma'(1)})\cdots e^{\tau'(i_k)}(v_{\sigma'(k)})
\end{align}
$$

In the case where $\text{sgn}(\tau)=\text{sgn}(\tau')$, we have that $\tau$ and $\tau'$ differ by an even number of transpositions, so for every
$\sigma$, the unique $\sigma'$ which satisfies:
$$e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})=e^{\tau'(i_1)}(v_{\sigma'(1)})\cdots e^{\tau'(i_k)}(v_{\sigma'(k)})$$
must also differ from $\sigma$ by an even number of transpositions, implying that each term in the left sum is equal to some term in the right sum, which proves the claim. A similar argument works in the case where $\text{sgn}(\tau)=-\text{sgn}(\tau')$. Since the above equality holds for each $\tau$, we have that:
$$
\begin{align}
\sum_{\sigma \in A^l}&\sum_{\tau \in S^k}\text{sgn}
(\sigma)\text{sgn}(\tau)
e^{\tau(i_1)}(v_{\sigma(1)})\cdots e^{\tau(i_k)}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})\\
=&k!\sum_{\sigma \in A^l}\text{sgn}
(\sigma)e^{i_1}(v_{\sigma(1)})\cdots e^{i_k}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})
\end{align}
$$

implying that:
$$
\begin{align}
(e^{i_1}\wedge \cdots \wedge e^{i_k})\wedge e^{i_{k+1}}(v_1,\cdots
,v_{k+1})=&\sum_{l=1}^{k+1}\sum_{\sigma \in A^l}\text{sgn}
(\sigma)
e^{i_1}(v_{\sigma(1)})\cdots e^{i_k}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})\\
=&\sum_{\sigma\in S^{k+1}}\text{sgn}
(\sigma)
e^{i_1}(v_{\sigma(1)})\cdots e^{i_k}(v_{\sigma(k)})\cdot
e^{i_{k+1}}(v_{\sigma(k+1)})\\
=&\sum_{\sigma\in S^{k+1}}\text{sgn}
(\sigma)
e^{\sigma(i_1)}\otimes \cdots \otimes e^{\sigma(i_k)}\otimes
e^{\sigma(i_{k+1})}(v_1,\cdots,v_{k+1})
\end{align}
$$

implying the original claim, since $v_1,\cdots, v_{k+1}$ were arbitrary vectors.
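If it helps, the inductive step can also be sanity-checked numerically in a concrete case, say $k=3$ on $\mathbb{R}^4$: build $e^1\wedge e^2\wedge e^3$ as the full alternating sum, wedge it with $e^4$ using the $1/(k!\,l!)$ formula, and compare with the full $S^{k+1}$ alternating sum, which here is the determinant. A numpy sketch with ad hoc helpers `sign` and `ev`:

```python
import itertools
from math import factorial
import numpy as np

def sign(p):
    # sign of a permutation via its inversion count
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def ev(T, vs):
    # evaluate a k-tensor on k vectors by successive contraction
    for v in vs:
        T = np.tensordot(v, T, axes=1)
    return float(T)

rng = np.random.default_rng(3)
e = np.eye(4)
v = rng.standard_normal((4, 4))       # four random vectors in R^4

# inductive hypothesis: omega = e^1 /\ e^2 /\ e^3 as the full alternating sum
omega = sum(sign(s) * np.einsum('i,j,k->ijk', e[s[0]], e[s[1]], e[s[2]])
            for s in itertools.permutations(range(3)))

# (omega /\ e^4)(v_1,...,v_4) via the 1/(k! l!) wedge formula with k=3, l=1
lhs = sum(sign(s) * ev(omega, [v[s[0]], v[s[1]], v[s[2]]]) * e[3] @ v[s[3]]
          for s in itertools.permutations(range(4))) / factorial(3)

# the full S^4 alternating sum e^1 /\ e^2 /\ e^3 /\ e^4 evaluates to det
rhs = np.linalg.det(v.T)
assert np.isclose(lhs, rhs)
```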

Best Answer

$ \newcommand\sgn{\mathrm{sgn}} \newcommand\alt{\mathrm{alt}} \newcommand\AltExt{\mathop\Lambda} \newcommand\Ext{\mathop{\textstyle\bigwedge}} $Your $\phi$ as defined will not work; the wedge product you get from this is non-associative. You have to introduce the normalization factor: $$ v_1\otimes\dotsb\otimes v_k \longmapsto \color{red}{\frac1{k!}}\sum_{\sigma\in S^k}\sgn(\sigma)v_{\sigma(1)}\otimes\dotsb\otimes v_{\sigma(k)}. $$ (And it is for this reason that the exterior algebra is not isomorphic to a subalgebra of $T(V)$ when working over a field of nonzero characteristic.)

Let us denote $\Ext V = T(V)/I$ and identify $V$ with its image in $\Ext V$ under the canonical projection. What I think is going to be easiest is to realize that this definition of $\Ext V$ gives us a universal property:

  • Let $A$ be any associative algebra. Then every linear $f : V \to A$ such that $f(v)^2 = 0$ for all $v \in V$ extends uniquely to an algebra homomorphism $f' : \Ext V \to A$ such that $f'(v) = f(v)$ for all $v \in V$.

Then we define a wedge product on $\AltExt(V)$ by defining the alternation map $\alt : T(V) \to \AltExt(V)$ grade-wise $$ \alt(v_1\otimes\dotsb\otimes v_k) = \frac1{k!}\sum_{\sigma\in S^k}\sgn(\sigma)v_{\sigma(1)}\otimes\dotsb\otimes v_{\sigma(k)}, $$ whence $\AltExt(V) = \alt(T(V))$, and we define $$ X\wedge Y = \alt(X\otimes Y) $$ for any $X, Y \in \AltExt(V)$. By linearity it suffices to show this is associative on simple tensors. Once that's done, $\AltExt(V)$ together with $\wedge$ is an associative algebra, and it is easy to see that the map taking $v \in V$ to itself in $\AltExt(V)$ satisfies the premises of the universal property of $\Ext V$; hence there is an algebra homomorphism $\phi : \Ext V \to \AltExt(V)$ preserving vectors. Showing one of surjectivity or injectivity should not be difficult; if you do at least one, then it suffices to argue that $\Ext V$ and $\AltExt(V)$ have the same dimension, and we're done.
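The failure of associativity without the $1/k!$, and its restoration with it, can be seen concretely in a few lines of numpy (a sketch of my own; `alt` and `wedge` are ad hoc helper names). With the unnormalized antisymmetrizer the two groupings of a four-fold product differ by a scalar factor; with the normalization, $\alt$ absorbs inner $\alt$'s and the product associates:

```python
import itertools
from math import factorial
import numpy as np

def sign(p):
    # sign of a permutation via its inversion count
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def alt(T, normalized=True):
    # antisymmetrize a k-tensor over all permutations of its axes
    k = T.ndim
    S = sum(sign(p) * np.transpose(T, p)
            for p in itertools.permutations(range(k)))
    return S / factorial(k) if normalized else S

def wedge(X, Y, normalized=True):
    return alt(np.multiply.outer(X, Y), normalized)

a, b, c, d = np.eye(4)

# Without the 1/k! factor, the two groupings disagree:
L = wedge(wedge(wedge(a, b, False), c, False), d, False)
R = wedge(wedge(a, b, False), wedge(c, d, False), False)
assert not np.allclose(L, R)

# With the normalization, the product is associative (and nonzero):
Ln = wedge(wedge(wedge(a, b), c), d)
Rn = wedge(wedge(a, b), wedge(c, d))
assert np.allclose(Ln, Rn) and not np.allclose(Ln, 0)
```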
