Is a higher-rank tensor always the product of lower-rank tensors?

differential-geometry, metric-spaces, tensor-products, tensor-rank

I remember seeing the definition of a tensor somewhere as an object in $E\otimes F$ for some vector spaces $E$ and $F$. (Here I use a rank-two example.)

But most of the time in physics, people say a tensor is an object that obeys the tensor transformation law, i.e. $T'^{ab}=\Lambda^a_\alpha\Lambda^b_\beta T^{\alpha\beta}$.
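To make the transformation law concrete, here is a small numpy sketch of how I would check it numerically; the specific boost $\Lambda$ and the $(-,+,+,+)$ signature are just illustrative choices on my part.

```python
import numpy as np

# Numerical check of the transformation law T'^{ab} = Λ^a_α Λ^b_β T^{αβ}.
# The boost Λ (along x, rapidity 0.5) and the (-,+,+,+) signature are
# arbitrary illustrative choices.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])        # Minkowski metric
phi = 0.5
L = np.eye(4)
L[0, 0] = L[1, 1] = np.cosh(phi)
L[0, 1] = L[1, 0] = -np.sinh(phi)           # Lorentz boost Λ^a_α

T = np.random.rand(4, 4)                    # components of some rank-2 tensor
T_new = np.einsum('aA,bB,AB->ab', L, L, T)  # contract Λ with each index

# Sanity check: the contravariant metric components η^{αβ} are invariant,
# since Λ^T η Λ = η for a Lorentz transformation.
eta_inv = np.linalg.inv(eta)
print(np.allclose(np.einsum('aA,bB,AB->ab', L, L, eta_inv), eta_inv))  # True
```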

Question 1:

Are those two definitions equivalent? How does one show that every tensor of higher rank is a product of tensors of lower rank?

I think so, because there seemed to be a theorem about it. But this raises another question. Consider the usual Minkowski metric/Minkowski tensor, or in fact almost any physically realized metric. It can usually be represented by a symmetric $4\times 4$ matrix with nonzero determinant, which means it has (at least) $4$ eigenvectors. Using the eigenvalue decomposition of a matrix, $T=\sum_i \lambda_i |v_i\rangle\langle v_i|$, where $|v_i\rangle\langle v_i|$ denotes the outer product.
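Here is a quick numpy sketch of what I mean (the $(-,+,+,+)$ signature is just my convention here):

```python
import numpy as np

# The Minkowski metric as a sum of rank-1 outer products, T = Σ_i λ_i |v_i><v_i|
# (signature (-,+,+,+) assumed).
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

eigvals, eigvecs = np.linalg.eigh(eta)   # real symmetric => orthonormal eigenvectors

# Rebuild the metric from its four rank-1 pieces λ_i v_i v_i^T
reconstruction = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, eigvecs.T))
print(np.allclose(reconstruction, eta))  # True: four terms, not a single product
```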

Question 2:

How can the metric have more than one eigenvector if it is in $E\otimes F$? Does that mean the metric is not a tensor at all? (But then how come it transforms like a tensor?)

Best Answer

As you have observed, tensors come up in many different contexts; in each they wear many different hats and can be defined in many different ways. (More mathematically, there are many concrete constructions which give rise to naturally isomorphic sets of objects, all of which we call tensors.) Wikipedia provides the three most common definitions. Here is what we might use for tensors over a single real vector space $V$ of finite dimension $n$.

  • An object which associates with each basis a collection of numbers $T^{i_1\dots i_k}_{j_1\dots j_l}$ obeying a particular transformation law.
  • A multilinear map $T:V^*\times\dots\times V^*\times V\times\dots\times V\to\mathbb{R}$
  • An object in the abstract tensor product space $V\otimes\dots\otimes V\otimes V^*\otimes\dots\otimes V^*$

To see that these are equivalent, choose a basis $\mathbf{e}_1,\dots,\mathbf{e}_n$ for $V$ and the dual basis $\mathbf{e}^1,\dots,\mathbf{e}^n$ for $V^*$. Then every abstract tensor $\hat{T}$ can be associated with a collection of numbers $T^{i_1\dots i_k}_{j_1\dots j_l}$ (with the summation convention) and a multilinear map $\bar{T}$ by $$ \hat{T}=T^{i_1\dots i_k}_{j_1\dots j_l}\,\mathbf{e}_{i_1}\otimes\dots\otimes\mathbf{e}_{i_k}\otimes\mathbf{e}^{j_1}\otimes\dots\otimes\mathbf{e}^{j_l}, $$ $$ T^{i_1\dots i_k}_{j_1\dots j_l}=\bar{T}\left(\mathbf{e}^{i_1},\dots,\mathbf{e}^{i_k},\mathbf{e}_{j_1},\dots,\mathbf{e}_{j_l}\right), $$ and these formulas give isomorphisms of vector spaces.
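To make this dictionary concrete, here is a small numpy sketch (my own illustration, for a $(1,1)$ tensor over $\mathbb{R}^3$ in the standard basis): we read off components by evaluating the multilinear map on basis (co)vectors, and then recover the map from the component formula.

```python
import numpy as np

# Correspondence between a multilinear map and its components, for a (1,1)
# tensor over R^3 in the standard basis (M is an arbitrary illustrative matrix).
n = 3
M = np.random.rand(n, n)

def T_bar(alpha, v):
    """Bilinear map T(alpha, v) = alpha_i M^i_j v^j on a (covector, vector) pair."""
    return alpha @ M @ v

# Components T^i_j = T_bar(e^i, e_j): evaluate on the dual basis and basis
E = np.eye(n)
components = np.array([[T_bar(E[i], E[j]) for j in range(n)] for i in range(n)])

# Recover the map from the components: T_bar(alpha, v) = T^i_j alpha_i v^j
alpha, v = np.random.rand(n), np.random.rand(n)
print(np.allclose(T_bar(alpha, v), np.einsum('ij,i,j->', components, alpha, v)))  # True
print(np.allclose(components, M))   # and the components are just M itself
```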

The question "can any tensor be written as a single product of lower-rank tensors" (in particular rank 1) is often of interest (e.g. pure vs. mixed states in quantum mechanics), and the answer is generally no.

As a simple counterexample, one can ask whether every linear operator (a $(1,1)$ tensor) can be written as a tensor product of a vector and a covector (a $(1,0)$ and a $(0,1)$ tensor, respectively). This is false in dimension $>1$. Choosing a basis, the question is whether every matrix $A^i_j$ can be written as a product of a column and a row vector, $A^i_j=v^i\omega_j$. Constructing a counterexample should not be too difficult here.
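For instance, the $2\times 2$ identity matrix already works: any single product $v^i\omega_j$ has matrix rank at most one, while the identity has rank two. A quick numpy check of this:

```python
import numpy as np

# A single product v^i ω_j has matrix rank at most 1, so the 2x2 identity
# (rank 2) cannot be written as one such product.
I = np.eye(2)
print(np.linalg.matrix_rank(I))                    # 2

v, omega = np.random.rand(2), np.random.rand(2)
print(np.linalg.matrix_rank(np.outer(v, omega)))   # 1
```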