[Math] Tensor product of two vector spaces (finite dimension)

linear-transformations, multilinear-algebra, tensor-products

I'm studying the tensor product from the Wikipedia article.

If $V$ has a basis $e_1,\dots,e_m$ and $W$ has a basis $f_1,\dots,f_n$, then the tensor product $V\otimes W$ can be taken to be a vector space spanned by a basis consisting of all pair-wise products of elements from the two bases; each such basis element of $V\otimes W$ is denoted $e_i\otimes f_j$. For any vectors $v = \sum\nolimits_i v_i e_i \in V$ and $w = \sum\nolimits_j w_j f_j \in W,$ there is a corresponding product vector $v\otimes w$ in $V\otimes W$ given by $\sum\nolimits_{ij} v_i w_j (e_i\otimes f_j) \in V\otimes W.$ This product operation $\otimes : V \times W \rightarrow V\otimes W$ is quickly verified to be bilinear.
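In coordinates, $v\otimes w$ is just the outer product of the two coefficient vectors. A minimal pure-Python sketch (identifying $V\otimes W$ with $m\times n$ coefficient matrices relative to the basis $e_i\otimes f_j$; the function name `tensor` is my own choice):

```python
def tensor(v, w):
    # Coefficients of v (x) w relative to the basis e_i (x) f_j:
    # the (i, j) entry is v_i * w_j (an outer product).
    return [[vi * wj for wj in w] for vi in v]

v = [1, 2]           # v = 1*e1 + 2*e2
w = [3, 4, 5]        # w = 3*f1 + 4*f2 + 5*f3
assert tensor(v, w) == [[3, 4, 5], [6, 8, 10]]

# Bilinearity in the first argument: (v + v') (x) w = v (x) w + v' (x) w
v2 = [10, 20]
lhs = tensor([a + b for a, b in zip(v, v2)], w)
rhs = [[a + b for a, b in zip(r1, r2)]
       for r1, r2 in zip(tensor(v, w), tensor(v2, w))]
assert lhs == rhs
```

The analogous check in the second argument, and homogeneity $r(v\otimes w) = (rv)\otimes w = v\otimes(rw)$, go through the same way.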

As an example, taking $V = W = \mathbb{R}^3$, we get $\dim(V\oplus W)=3+3=6$ and $\dim(V\otimes W)=3\cdot 3=9$.
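The dimension count can be seen directly in coordinates; a small sketch (pure Python, working only with coefficient lists):

```python
v = [1, 2, 3]   # an element of V = R^3
w = [4, 5, 6]   # an element of W = R^3

# Direct sum: coordinates concatenate, so dimensions add (3 + 3 = 6).
direct_sum = v + w
assert len(direct_sum) == 6

# Tensor product: coordinates form an outer product, so dimensions
# multiply (3 * 3 = 9).
tensor_coords = [vi * wj for vi in v for wj in w]
assert len(tensor_coords) == 9
```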

Questions (in general, but only for finite dimension):

a) Can the map $f: V \times W \rightarrow V\oplus W$ be bijective?

b) The map $g: V \times W \rightarrow V\otimes W$, $g(v,w) = v\otimes w$, is not bijective, since (as far as I can see) it is not surjective. Is it injective?

c) The image $g[V \times W]$ is a subset of $V\otimes W$. Can we say anything more about it? Is it a subspace? How can we picture which vectors of $V\otimes W$ belong to $g[V \times W]$ and which do not?

Best Answer

For (a), you actually have more: for any pair of vector spaces, $V \times W$ has a natural vector space structure and there is a natural isomorphism $V \times W \cong V \oplus W$. In fact, one often defines $\oplus$ to mean $\times$.

For (b), the map is almost never injective: for any vectors $v, w$ and any scalar $r$, you have $(rv) \otimes w = v \otimes (rw)$, so the distinct pairs $(rv, w)$ and $(v, rw)$ have the same image. The only exception is when $V$ and $W$ are both the zero vector space.
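This failure of injectivity is easy to check numerically; a sketch in pure Python, again representing tensors as coefficient matrices:

```python
def tensor(v, w):
    # (i, j) coefficient of v (x) w is v_i * w_j.
    return [[vi * wj for wj in w] for vi in v]

v, w, r = [1, 2], [3, 4], 5
scaled_v = [r * vi for vi in v]   # r*v = [5, 10]
scaled_w = [r * wj for wj in w]   # r*w = [15, 20]

# (r*v, w) and (v, r*w) are different pairs ...
assert (scaled_v, w) != (v, scaled_w)
# ... but they map to the same tensor, so g is not injective.
assert tensor(scaled_v, w) == tensor(v, scaled_w)
```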

For (c), the elements of the image of $g$ are called the pure tensors (also: simple or decomposable tensors). The image is generally not a subspace, since it is not closed under addition, but it spans all of $V \otimes W$: every element of $V \otimes W$ can be written in the form

$$ \sum_{i=1}^n v_i \otimes w_i $$

for some natural number $n$ and sequence of vectors $v_i \in V$ and $w_i \in W$. So, the general tensors are simply linear combinations of the pure ones.
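For intuition about which tensors are pure, identify $V \otimes W$ with $m\times n$ coefficient matrices as above; the pure tensors are then exactly the outer products, i.e. the matrices of rank at most $1$. In the $2\times 2$ case this has a one-line test: a matrix is an outer product iff its determinant vanishes. A sketch:

```python
def tensor(v, w):
    return [[vi * wj for wj in w] for vi in v]

def is_pure_2x2(t):
    # A 2x2 matrix [[a, b], [c, d]] is an outer product
    # [[v1*w1, v1*w2], [v2*w1, v2*w2]] iff a*d - b*c == 0.
    return t[0][0] * t[1][1] - t[0][1] * t[1][0] == 0

# Every outer product passes the test:
assert is_pure_2x2(tensor([1, 2], [3, 4]))

# e1 (x) f1 + e2 (x) f2 is the 2x2 identity matrix: a sum of two
# pure tensors that is itself NOT pure (determinant 1 != 0).
s = [[1, 0], [0, 1]]
assert not is_pure_2x2(s)
```

So $e_1\otimes f_1 + e_2\otimes f_2$ is a concrete element of $V\otimes W$ that is not in $g[V\times W]$, which also shows the image is not closed under addition.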

The smallest value of $n$ for which this is possible is called the rank of the tensor. (But be careful: in this context, "rank" is also commonly used with a completely different meaning, namely the number of tensor factors, also called the order.)

This agrees with the notion of rank for matrices, when you view the space of matrices as the tensor product of the space of column vectors with the space of row vectors.
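To see the agreement concretely, here is a sketch that computes matrix rank by Gaussian elimination (pure Python, exact arithmetic via the standard-library `fractions` module) and checks it against the tensor decompositions above:

```python
from fractions import Fraction

def matrix_rank(m):
    # Rank via Gaussian elimination over the rationals.
    m = [[Fraction(x) for x in row] for row in m]
    rows, cols = len(m), len(m[0])
    rank = 0
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(rows):
            if r != rank and m[r][col] != 0:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

def tensor(v, w):
    return [[vi * wj for wj in w] for vi in v]

# A nonzero pure tensor is a rank-1 matrix:
assert matrix_rank(tensor([1, 2, 3], [4, 5, 6])) == 1

# e1 (x) f1 + e2 (x) f2 in R^3 (x) R^3 has matrix rank 2,
# matching its tensor rank (a sum of two pure tensors, not one).
s = [[a + b for a, b in zip(r1, r2)]
     for r1, r2 in zip(tensor([1, 0, 0], [1, 0, 0]),
                       tensor([0, 1, 0], [0, 1, 0]))]
assert matrix_rank(s) == 2
```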
