[Math] Basis of tensor product of vector spaces

abstract-algebra, tensor-products

Suppose $V,W$ are vector spaces with bases $\{e_1,e_2,\dots ,e_m\}$ and $\{f_1,f_2,\dots ,f_n\}$. I know that $V\otimes W=F(V\times W)/H$, where $F(V\times W)$ is the free vector space on $V\times W$ and $H$ is the subspace generated by the relations needed so that bilinearity holds in the quotient space. I'm writing $v\otimes w$ for the congruence class of the pair $(v,w)\in V\times W$. I can easily show that the elements $e_i\otimes f_j$ span $V\otimes W$, but I'm having trouble showing linear independence, i.e. that if $\sum_{i,j}\lambda_{ij}e_i\otimes f_j=0$ then $\lambda_{ij}=0$ for all $i,j$. Could anybody point me in the right direction? Thanks for any replies!
Related Solutions
$\newcommand\P{\mathbb{P}}$Let $V$ and $W$ be complex vector spaces, and let $\P(V)$, $\P(W)$ and $\P(V\otimes W)$ be the projective spaces attached to $V$, $W$ and $V\otimes W$, respectively. If $v\in V$ is non-zero, I'll denote by $[v]$ the point of $\P(V)$ corresponding to it; it is the equivalence class of $v$ in $V\setminus0$ for the equivalence relation of linear dependence.
Since decomposability of a tensor does not change when we multiply it by a non-zero scalar, we can talk about the indecomposable elements of $\P(V\otimes W)$. Your question is therefore more or less equivalent to
how can we describe the set of indecomposable elements of $\P(V\otimes W)$?
Now, there is a map $f:\P(V)\times\P(W)\to\P(V\otimes W)$ which maps $([v],[w])$ to $[v\otimes w]$. This is a map of projective varieties (in the sense of algebraic geometry) and its image is precisely the set of indecomposable tensors. The image is in fact a subvariety of $\P(V\otimes W)$, which means that it is the common zero set of a finite set of polynomials. Finding these polynomials is a classical problem solved long ago; see Segre embedding for more information (most introductions to algebraic geometry will say something as well).
In the special case where $\dim V=3$ and $\dim W=2$, with bases $\{x_1,x_2,x_3\}$ and $\{y_1,y_2\}$, we want to know when a tensor $$\sum_{\substack{1\leq i\leq 3\\1\leq j\leq2}}f_{i,j}x_i\otimes y_j$$ is equal to a product $$\Bigl(\sum_{1\leq i\leq 3}v_ix_i\Bigr)\otimes\Bigl(\sum_{1\leq j\leq 2}w_jy_j\Bigr).$$
It is easy to see that we must have $$f_{i,j}f_{k,l}=f_{i,l}f_{k,j}$$ for all $i,k\in\{1,2,3\}$ and all $j,l\in\{1,2\}$ for that to happen, and some work will show that these conditions are in fact sufficient. We can express all these conditions by saying that the matrix $$\begin{pmatrix}f_{1,1}&f_{1,2}\\f_{2,1}&f_{2,2}\\f_{3,1}&f_{3,2}\end{pmatrix}$$ has rank $1$. Proving this is «just» linear algebra.
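These vanishing-minor conditions are easy to check numerically. Here is a small sketch in Python with NumPy (the vectors and the random seed are arbitrary choices for illustration): the coefficient matrix of a decomposable tensor is an outer product, all its $2\times 2$ minors vanish, and its rank is $1$, while a generic coefficient matrix has rank $2$ and so a generic tensor in $\mathbb{R}^3\otimes\mathbb{R}^2$ is not decomposable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coefficients of a decomposable tensor: f_{i,j} = v_i * w_j,
# i.e. the outer product of v (dim 3) and w (dim 2).
v = rng.standard_normal(3)
w = rng.standard_normal(2)
F = np.outer(v, w)  # 3x2 matrix of coefficients f_{i,j}

# All 2x2 minors vanish: f_{i,j} f_{k,l} - f_{i,l} f_{k,j} = 0 ...
for i in range(3):
    for k in range(3):
        for j in range(2):
            for l in range(2):
                assert abs(F[i, j] * F[k, l] - F[i, l] * F[k, j]) < 1e-12

# ... which is the same as saying F has rank 1.
assert np.linalg.matrix_rank(F) == 1

# By contrast, a generic 3x2 coefficient matrix has rank 2, so a
# generic tensor in R^3 (x) R^2 is not decomposable.
G = rng.standard_normal((3, 2))
assert np.linalg.matrix_rank(G) == 2
```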
The answer in the general case where the dimensions are arbitrary is of the same spirit.
N.B.: it is interesting to know that the question «which tensors have rank $k$?» when $k\geq2$ and there are more than two factors is much, much harder, and very important—I think this is unsolved in general. Someone who knows algebraic geometry might be able to tell us.
Tensor product distributes over addition. That is, $$(a + b)\otimes c = a\otimes c + b\otimes c\quad \text{and}\quad a\otimes (b + c) = a\otimes b + a\otimes c.$$ Furthermore, for any real scalar $\lambda$, we have $$(\lambda a)\otimes b = a\otimes (\lambda b) = \lambda(a\otimes b).$$ These rules will allow you to write $x\otimes y$ as a linear combination of $e_i\otimes f_j$.
You can consider this type of calculation in a more general setting. If $x \in \mathbb{R}^m$ and $y \in \mathbb{R}^n$, their tensor product $x\otimes y$ is sometimes called their outer product. If $\{e_i\otimes f_j\}$ is the basis for $\mathbb{R}^m\otimes\mathbb{R}^n$ obtained from the standard bases of $\mathbb{R}^m$ and $\mathbb{R}^n$, then we have the following expression for the outer product:
$$x\otimes y = \sum_{i=1}^m\sum_{j=1}^na_{ij}e_i\otimes f_j$$
where $a_{ij}$ is the $(i, j)^{\text{th}}$ entry of the $m\times n$ matrix $xy^T$.
I suggest you simplify using the rules I gave at the beginning. Once you have the solution using that method, compute the matrix $xy^T$ and check that you get the same coefficients.
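As a concrete sanity check, here is a small NumPy sketch (the particular vectors are made up for illustration): expanding $x\otimes y$ by bilinearity gives coefficients $a_{ij} = x_i y_j$, which are exactly the entries of $xy^T$.

```python
import numpy as np

# x in R^3, y in R^2; their tensor (outer) product, expanded in the
# basis e_i (x) f_j, has coefficients a_{ij} = x_i * y_j.
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0])

# The matrix x y^T.
A = np.outer(x, y)

# Expand using bilinearity directly: x = sum_i x_i e_i and
# y = sum_j y_j f_j, so x (x) y = sum_{i,j} x_i y_j (e_i (x) f_j).
coeffs = np.array([[x[i] * y[j] for j in range(2)] for i in range(3)])

# The two computations agree entry by entry.
assert np.allclose(A, coeffs)
```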
Best Answer
$\def\Hom{\mathrm{Hom}}$ So, since you already know that the elements $e_i \otimes f_j$ span $V \otimes W$, you just need to show that the dimension of $V \otimes W$, which equals the dimension of its dual, is the product of the dimensions of $V$ and $W$.
We first reduce the problem to showing that $$(V \otimes W)^{*} = \Hom_{k}(V \otimes W, k) \cong \Hom_{k}(V, \Hom_{k}(W, k)) = \Hom_{k}(V, W^{*}).$$
Note that if the above relationship holds, then the dimension of $V \otimes W$ is the dimension of the space of linear maps from $V$ to $W^{*}$, which is just the product of the dimensions of $V$ and $W^{*}$, and is hence the product of the dimensions of $V$ and $W$.
So, to show the above equalities, note that the leftmost and rightmost equalities are by definition of the dual. So we need to show that
$$\Hom_{k}(V \otimes W, k) \cong \Hom_{k}(V, \Hom_{k}(W, k)).$$
This is actually true if $k$ is any ring and $V$ and $W$ are any $k$-modules. This is the adjunction between $\Hom$ and tensor product. But let's go through the details for the vector space case.
Suppose $\phi$ is a linear map from $V \otimes W$ to $k$. We want to define $f(\phi)$, which takes a vector $v \in V$ and spits out a linear map from $W$ to $k$. To do so, we define
$$(f(\phi) (v)) (w) = \phi(v \otimes w).$$
You should check that this map is linear and well-defined; it is not too hard. We next construct an inverse map. Given $\psi$ which to each $v$ assigns a linear map from $W$ to $k$, we define $g(\psi)$ as a linear map from $V \otimes W$ to $k$ by first defining it on the simple tensors as $$g(\psi)(v \otimes w) = \psi(v)(w),$$ noting that this is bilinear in $v$ and $w$, and then noting that bilinear maps on $V \times W$ extend to linear maps on all of $V \otimes W$.
Again, $g$ can be checked to be linear and you can also easily see that $f$ and $g$ are inverses.
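In coordinates the maps $f$ and $g$ are easy to make concrete. Here is a sketch in NumPy for $V = \mathbb{R}^3$, $W = \mathbb{R}^2$, $k = \mathbb{R}$ (the matrix, vectors, and helper names are illustrative choices, not part of the answer above): a functional $\phi$ on $V \otimes W$ is stored as the $3\times 2$ matrix of its values $\phi(e_i \otimes f_j)$.

```python
import numpy as np

# A functional phi on V (x) W is determined by the numbers
# A[i, j] = phi(e_i (x) f_j), so phi "is" a 3x2 matrix A, with
# phi(v (x) w) = v^T A w on simple tensors.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))

def phi(v, w):
    """phi evaluated on the simple tensor v (x) w."""
    return v @ A @ w

def f_phi(v):
    """f(phi): sends v in V to the functional w |-> phi(v (x) w),
    represented as the row vector v^T A, an element of W*."""
    return v @ A

def g(psi, v, w):
    """g(psi) on the simple tensor v (x) w, where psi assigns to
    each v a functional on W (here: a vector in R^2)."""
    return psi(v) @ w

# Check that g inverts f on simple tensors: g(f(phi)) = phi.
v = rng.standard_normal(3)
w = rng.standard_normal(2)
assert np.isclose(phi(v, w), g(f_phi, v, w))
```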