Tensors – Tensor Product and Linear Dependence of Vectors

Tags: tensor-decomposition, tensor-rank, tensors

Let $V_1, \ldots, V_k$ be complex vector spaces. Given nonzero vectors $v_1 \in V_1, \ldots, v_k \in V_k$, we say that the tensor product $v_1 \otimes \ldots \otimes v_k$ has rank 1. For a tensor $T \in V_1 \otimes \ldots \otimes V_k$, the rank of $T$ is the minimum $r \in \mathbb{N}$ such that $T$ can be written as a sum of $r$ rank-1 tensors. In this case, there are vectors $v_{1,1}, \ldots, v_{r,1} \in V_1$, $\ldots$, $v_{1,k}, \ldots, v_{r,k} \in V_k$ such that

$$T = \sum_{i=1}^r v_{i,1} \otimes \ldots \otimes v_{i,k}.$$
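To make the definition concrete, here is a minimal numerical sketch (assuming NumPy; the vectors are chosen arbitrarily for illustration) that builds a rank-1 tensor as an outer product of three vectors and then a sum of two such terms, i.e. a tensor of rank at most 2:

```python
import numpy as np

# Three vectors in C^2, C^3, C^2 (arbitrary illustrative choices, real entries)
v1 = np.array([1.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([3.0, -1.0])

# A rank-1 tensor: the tensor product v1 (x) v2 (x) v3,
# stored as a 2 x 3 x 2 array with entries v1[a] * v2[b] * v3[c].
rank_one = np.einsum('a,b,c->abc', v1, v2, v3)

# A sum of r = 2 rank-1 terms, as in the decomposition of T above.
w1 = np.array([0.0, 1.0])
w2 = np.array([1.0, 0.0, 2.0])
w3 = np.array([1.0, 1.0])
T = rank_one + np.einsum('a,b,c->abc', w1, w2, w3)
print(T.shape)  # (2, 3, 2)
```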

I have two questions about the relation between this decomposition and the linear dependency of the vectors.

1) Suppose we have linearly independent vectors $v_{1,j}, \ldots, v_{r,j} \in V_j$ for each $j = 1, \ldots, k$, and construct the tensor $T = \sum_{i=1}^r v_{i,1} \otimes \ldots \otimes v_{i,k}$. Is it right to say that the rank of $T$ is $r$? If not, what conditions should be considered instead of just linear independence?

2) On the other hand, suppose we know $T$ has rank $r$ and can be written as $T = \sum_{i=1}^r v_{i,1} \otimes \ldots \otimes v_{i,k}$. Is it right to say that the vectors $v_{1,j}, \ldots, v_{r,j} \in V_j$, for each $j = 1, \ldots, k$, are linearly independent?

I'm aware that tensors are not so simple and that these relations probably don't hold. In that case, I would also welcome suggestions along the following lines:

1) What properties should the vectors have in order for the constructed tensor to have rank $r$?

2) If we already have a tensor of rank $r$ (together with such a decomposition), what properties must the vectors appearing in it have?

Thank you.

Best Answer

These questions can be answered in the case of real vector spaces, which is enough for my purposes. Anyone interested can check the article Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem by Vin de Silva and Lek-Heng Lim.

The first one is Lemma 3.5, which answers the first question positively.

Lemma 3.5. For $\ell = 1, \ldots, k$, let $x_1^{(\ell)}, \ldots, x_r^{(\ell)} \in \mathbb{R}^{d_\ell}$ be linearly independent. Then the tensor defined by $$ \sum_{j=1}^r x_j^{(1)} \otimes x_j^{(2)} \otimes \ldots \otimes x_j^{(k)}$$ has rank $r$.
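For $k = 2$ the lemma reduces to a familiar fact, since the tensor rank of a 2-tensor equals its matrix rank. Below is a small numerical check of that special case (a sketch assuming NumPy; the random factors and dimensions are just illustrative, not part of the lemma):

```python
import numpy as np

rng = np.random.default_rng(0)
r, d1, d2 = 3, 5, 4

# Random factors; generic choices are linearly independent with probability 1,
# but we check explicitly to respect the lemma's hypothesis.
X1 = rng.standard_normal((r, d1))   # rows are x_1^(1), ..., x_r^(1)
X2 = rng.standard_normal((r, d2))   # rows are x_1^(2), ..., x_r^(2)
assert np.linalg.matrix_rank(X1) == r
assert np.linalg.matrix_rank(X2) == r

# T = sum_j x_j^(1) (x) x_j^(2); for k = 2 this is a d1 x d2 matrix,
# and its tensor rank coincides with its matrix rank.
T = sum(np.outer(X1[j], X2[j]) for j in range(r))
print(np.linalg.matrix_rank(T))  # 3, matching the lemma's conclusion for k = 2
```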

The second one is Proposition 4.6, which I partially reproduce below; it answers the second question negatively.

Proposition 4.6. Let $x_1, y_1 \in \mathbb{R}^{d_1}, x_2,y_2 \in \mathbb{R}^{d_2}$, and $x_3, y_3 \in \mathbb{R}^{d_3}$ be vectors such that each pair $x_i,y_i$ is linearly independent. Then the tensor $$x_1 \otimes x_2 \otimes y_3 + x_1 \otimes y_2 \otimes x_3 + y_1 \otimes x_2 \otimes x_3$$ has rank 3.

The tensor has rank 3, but its decomposition uses $x_1$ as the first factor twice, so the first factors $x_1, x_1, y_1$ are linearly dependent: a rank-$r$ decomposition need not have linearly independent factors in $\mathbb{R}^{d_1}$. The same goes for the other positions.
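For concreteness, here is a sketch (assuming NumPy; the specific vectors in $\mathbb{R}^2$ are just one admissible choice satisfying the pairwise-independence hypothesis) that builds the tensor from Proposition 4.6 and checks that the first-mode factors $x_1, x_1, y_1$ of this rank-3 decomposition span only a 2-dimensional subspace:

```python
import numpy as np

# Each pair x_i, y_i is linearly independent (here taken in R^2 for simplicity).
x1, y1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x2, y2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
x3, y3 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# The tensor of Proposition 4.6, built term by term.
T = (np.einsum('a,b,c->abc', x1, x2, y3)
     + np.einsum('a,b,c->abc', x1, y2, x3)
     + np.einsum('a,b,c->abc', y1, x2, x3))

# The first-mode factors of this rank-3 decomposition are x1, x1, y1:
# stacked as rows they have matrix rank 2, so they are linearly dependent.
mode1_factors = np.stack([x1, x1, y1])
print(np.linalg.matrix_rank(mode1_factors))  # 2, not 3
```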