Tensor rank decomposition for some vectors

linear-algebra, tensor-products, tensor-rank

I am having trouble with the following problem.
We have a vector space $V$ over a field $k$ with basis $v_1, v_2$.
The rank of a vector $v$ of $V \otimes V \otimes V$ is the minimal length of a decomposition of $v$ into a sum of rank-$1$ vectors.
I need to solve the following:

  • Prove that the vector $v = v_1 \otimes v_1 \otimes v_1 + v_1 \otimes v_2 \otimes v_2 + v_2 \otimes v_1 \otimes v_2$ has rank $3$.

  • Let $t = v_1 \otimes v_1 \otimes v_1 - v_2 \otimes v_2 \otimes v_1 + v_1 \otimes v_2 \otimes v_2 + v_2 \otimes v_1 \otimes v_2$. Exhibit a decomposition of $t$ into a sum of two rank-$1$ vectors when $k = \Bbb C$, and prove that the tensor rank of $t$ is $3$ when $k = \Bbb R$.

I tried to look for information but found nothing useful. I thought about assuming that we can decompose $v$ as $e_{11} \otimes e_{12} \otimes e_{13} + e_{21} \otimes e_{22} \otimes e_{23}$ and that, for each $j$, the vectors $e_{1j}, e_{2j}$ form a basis, which lets us rewrite $v$ in terms of the $e_{ij}$. After doing this I got a large system with a large number of variables that I am unable to work with. I have been thinking about this problem for several weeks already and do not know how to approach it.

Best Answer

In what follows, I identify $\sum_{ijk}x_{ijk}\, v_{i} \otimes v_j \otimes v_k$ with a three-way array $X$ whose slices are $X_1 = (x_{ij1})$ and $X_2 = (x_{ij2})$. Since your $V$ is two-dimensional, this means $$ X = [X_1 \mid X_2] = \left[ \begin{array}{cc|cc} x_{111} & x_{121} & x_{112} & x_{122}\\ x_{211} & x_{221} & x_{212} & x_{222} \end{array}\right]. $$
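
It may help to record how a rank-$1$ tensor looks in this picture: if $a, b, c \in k^2$ with $c = (c_1, c_2)$ and $x_{ijk} = a_ib_jc_k$, then both slices are scalar multiples of the single rank-$1$ matrix $ab^T$, namely $$ a \otimes b \otimes c \;\longleftrightarrow\; \left[\, c_1\, ab^T \mid c_2\, ab^T \,\right]. $$ So writing $X$ as a sum of $r$ rank-$1$ tensors amounts to writing $X_1$ and $X_2$ simultaneously as $\sum_{j=1}^r c_{1,j}\,a_jb_j^T$ and $\sum_{j=1}^r c_{2,j}\,a_jb_j^T$.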


Question 1:

For your first problem, we have $$ X_1 = \pmatrix{1&0\\0&0}, \quad X_2 = \pmatrix{0&1\\1&0}. $$ We will simply apply the result explained below. Since exchanging the slices of a tensor does not change its rank, we exchange the roles of $X_1$ and $X_2$, because $X_2$ (rather than $X_1$) is the invertible slice; note also that $X_2^{-1} = X_2$. We find that $$ X_1X_2^{-1} = \pmatrix{0&1\\0&0}. $$ Since this matrix fails to be diagonalizable (see the short check below), statement (iii) of the claim tells us that $X$ has rank at least $3$. On the other hand, the given expression for $v$ already writes it as a sum of three rank-$1$ tensors, so $X$ is a tensor of rank exactly $3$.
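
To spell out why $\pmatrix{0&1\\0&0}$ is not diagonalizable: it is nonzero, yet $$ \pmatrix{0&1\\0&0}^2 = \pmatrix{0&0\\0&0}, $$ so its only eigenvalue is $0$, and a diagonalizable matrix whose only eigenvalue is $0$ would have to be the zero matrix.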


Question 2:

Your second problem can be approached similarly. We now have $$ X_1 = \pmatrix{1&0\\0&-1},\quad X_2 = \pmatrix{0&1\\1&0}. $$ This time $X_1$ is invertible (indeed $X_1^{-1} = X_1$), and we find that $$ X_2 X_1^{-1} = \pmatrix{0&-1\\1&0}. $$ Because this matrix is diagonalizable with non-real eigenvalues $\pm i$ (see the computation below), statement (ii) of the claim tells us that $X$ has rank at least $3$ if we restrict ourselves to real coefficients, and rank exactly $2$ if we allow complex coefficients.
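
For completeness, the eigenvalues can be read off from the characteristic polynomial: $$ \det\pmatrix{-\lambda & -1\\ 1 & -\lambda} = \lambda^2 + 1, $$ so the eigenvalues are $\pm i$. They are distinct, so the matrix is diagonalizable over $\Bbb C$, but neither eigenvalue is real.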

It remains to be shown, however, that the rank of $X$ over $\Bbb R$ is not more than $3$. Note that $t$ differs from the tensor of the first question by a single rank-$1$ term, but on its own that observation only bounds the rank by $3 + 1 = 4$; instead, we can write down a real decomposition of $t$ into three rank-$1$ tensors directly, as in the display below.
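
One explicit choice (one of many; it can be verified by expanding all of the tensor products) is $$ t = (v_1+v_2)\otimes v_1\otimes(v_1+v_2) \;+\; v_1\otimes(v_2-v_1)\otimes v_2 \;-\; v_2\otimes(v_1+v_2)\otimes v_1. $$ This is essentially Gauss's three-multiplication trick for complex products: in the basis $v_1 \mapsto 1$, $v_2 \mapsto i$, the tensor $t$ encodes multiplication of complex numbers.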

Regarding the presentation of $t$ as a rank-2 tensor: if we follow the construction from the proof I present below, then we note that $X_2 X_1^{-1} = K\Lambda K^{-1}$, where $$ \Lambda = \pmatrix{i\\&-i}, \quad K = \pmatrix{i&1\\1&i}. $$ So, we find that $$ X = a_1 \otimes b_1 \otimes c_1 + a_2 \otimes b_2 \otimes c_2 $$ where $a_1,a_2$ are the columns of $K$, $b_1,b_2$ are the rows of $K^{-1}X_1$, and we have $c_1 = (1,i), c_2 = (1,-i)$.
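
Carrying this recipe out and rescaling the factors (a routine computation, which one can double-check by expanding the products), one arrives for example at $$ t = \tfrac{1}{2}\,(v_1 + iv_2)\otimes(v_1 + iv_2)\otimes(v_1 - iv_2) \;+\; \tfrac{1}{2}\,(v_1 - iv_2)\otimes(v_1 - iv_2)\otimes(v_1 + iv_2), $$ a sum of two rank-$1$ tensors, as desired.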


Now, here is an adaptation of the statement and proof of Lemma 1 of this paper.

Claim: Let $X$ be a real-valued $p \times p \times 2$ array with $p \times p$ slices $X_1$ and $X_2$. Suppose that $X_{1}^{-1}$ exists. The following statements hold:

  • (i) If $X_2X_1^{-1}$ is diagonalizable with $p$ real eigenvalues (counted with multiplicity), then $X$ has rank $p$ over $\Bbb R$.
  • (ii) If $X_2X_1^{-1}$ is diagonalizable but has at least one pair of non-real complex-conjugate eigenvalues, then $X$ has rank $p$ over $\Bbb C$ and rank at least $p+1$ over $\Bbb R$.
  • (iii) If $X_2X_1^{-1}$ is not diagonalizable, then $X$ has rank at least $p+1$ over $\Bbb C$ (and hence also over $\Bbb R$).

Proof: First, note that since $X_1$ is invertible, the rank of $X$ must be at least $p$: if $X$ were a sum of $r$ rank-$1$ tensors, then each slice would be a sum of $r$ rank-$1$ matrices, so $r \geq \operatorname{rank}(X_1) = p$.

Proof of i: Suppose that $X_2 X_1^{-1} = K \Lambda K^{-1}$, where $\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_p)$ (when the eigenvalues are real and the matrix is diagonalizable, $K$ and $\Lambda$ may be chosen real). If we take $$ A = K, \quad B^T = K^{-1}X_1, \quad C_1 = I_p, \quad C_2 = \Lambda, $$ then we find that $$ X_1 = AC_1B^T, \quad X_2 = AC_2B^T. $$ This corresponds to a rank-$p$ decomposition of the tensor $X$. In particular, if we take $a_j$ to denote the $j$th column of $A$, $b_j$ the $j$th column of $B$, and $c_{k,j}$ the $j$th diagonal entry of $C_k$, then we have $$ X_1 = AC_1B^T = \sum_{j=1}^p c_{1,j} \, a_jb_j^T, \quad X_2 = AC_2B^T = \sum_{j=1}^p c_{2,j} \, a_j b_j^T. $$ Correspondingly, we have $$ X = \left(\sum_{j=1}^p c_{1,j} \, a_j \otimes b_j\right) \otimes e_1 + \left(\sum_{j=1}^p c_{2,j} \, a_j \otimes b_j\right) \otimes e_2\\ = \sum_{j=1}^p a_j \otimes b_j \otimes (c_{1,j} e_1) + \sum_{j=1}^p a_j \otimes b_j \otimes (c_{2,j}e_2)\\ = \sum_{j=1}^p a_j \otimes b_j \otimes (c_{1,j}e_1 + c_{2,j}e_2). $$ In the above, $e_1 = (1,0)$ and $e_2 = (0,1)$, so this exhibits $X$ as a sum of $p$ rank-$1$ tensors.
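
To make the link between the matrix identities and the tensor explicit, one can read off entries: $X_k = AC_kB^T$ says precisely that $$ x_{mnk} = \left(AC_kB^T\right)_{mn} = \sum_{j=1}^p a_j(m)\, c_{k,j}\, b_j(n), \qquad k = 1,2, $$ which is the componentwise form of the decomposition $X = \sum_{j=1}^p a_j \otimes b_j \otimes (c_{1,j}e_1 + c_{2,j}e_2)$ displayed above.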

Proof of ii and iii: It suffices to prove that if $X$ has rank $p$ and $X_1$ is invertible, then $X_2X_1^{-1}$ must be diagonalizable (over $\Bbb R$ if the rank-$p$ decomposition is real). Indeed, if $X = \sum_{j=1}^p a_j \otimes b_j \otimes (c_{1,j}e_1 + c_{2,j}e_2)$, then reversing the above sequence of equations gives $$ X_1 = AC_1B^T, \quad X_2 = AC_2B^T, $$ where $A, B$ have columns $a_j, b_j$ and $C_1, C_2$ are diagonal. Since $X_1 = AC_1B^T$ is invertible, each of $A$, $C_1$, $B$ must be invertible, and it follows that $$ X_2X_1^{-1} = (AC_2B^T)(AC_1B^T)^{-1} = AC_2 B^T B^{-T} C_1^{-1} A^{-1} = A (C_2 C_1^{-1})A^{-1}. $$ So $X_2X_1^{-1}$ is similar to a diagonal matrix, i.e. diagonalizable (and diagonalizable over $\Bbb R$ with real eigenvalues when $A,B,C_1,C_2$ are real). Statement iii follows immediately; for ii, a real rank-$p$ decomposition would force all eigenvalues of $X_2X_1^{-1}$ to be real, so the rank over $\Bbb R$ is at least $p+1$, while the rank over $\Bbb C$ equals $p$ by the argument for i carried out over $\Bbb C$.
