I am guessing that your question is: What is $\bigwedge^k (V \oplus W)$? (This is the wedge product, not the tensor product.)
And also the same question for $S^k = Sym^k$, which is the symmetric product.
Extended hint:
The thing to observe is that there is a natural basis for $V \oplus W$. Namely, you take a basis for $V$ and union it with a basis for $W$.
Let's call our basis for $V$ $\{v_1, \ldots, v_n\}$, and our basis for $W$ $\{w_1, \ldots, w_m\}$; their union is then the natural basis for $V \oplus W$.
Now, given a basis for a vector space $X$, there is a natural basis for $\bigwedge^k X$ and for $Sym^k X$:
Suppose that $\{x_1, \ldots, x_r\}$ is a basis for $X$.
1) Then a basis for $\bigwedge^k X$ is given by all of the wedges $x_{i_1} \wedge \ldots \wedge x_{i_k}$ with $i_1 < \ldots < i_k$, $i_j \in \{1, 2, \ldots, r\}$.
2) A natural basis for $Sym^k X$ is given similarly, only now repeated vectors are allowed (so $<$ is replaced by $\leq$); another description is the set of monomials of degree $k$ in the variables $x_1, \ldots, x_r$.
Finally: given that $\{x_1, \ldots, x_r\} = \{v_1, \ldots, v_n, w_1, \ldots, w_m\}$ is a basis for $V \oplus W = X$, you can now play a combinatorial game to divide up $\bigwedge^k X$ into a direct sum built out of smaller exterior powers of $V$ and $W$, according to how many of the wedge factors come from $V$. Similarly for the symmetric product. Do you see how to proceed? Please feel free to ask if you have questions.
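If it helps to see the counting behind that game, here is a minimal Python sketch (the sizes `n`, `m`, `k` are illustrative choices of mine): it enumerates the basis of $\bigwedge^k(V \oplus W)$ and sorts the basis wedges by how many factors come from $V$, recovering the Vandermonde identity $\binom{n+m}{k} = \sum_i \binom{n}{i}\binom{m}{k-i}$.

```python
from itertools import combinations
from math import comb

n, m, k = 3, 2, 3   # illustrative: dim V = 3, dim W = 2, exterior power k = 3
V = [f'v{i}' for i in range(1, n + 1)]
W = [f'w{i}' for i in range(1, m + 1)]

# Basis of wedge^k(V + W): strictly increasing k-subsets of the combined basis.
basis = list(combinations(V + W, k))

# Group basis wedges by the number i of factors taken from V.
by_V_count = {}
for b in basis:
    i = sum(1 for x in b if x in V)
    by_V_count.setdefault(i, []).append(b)

# Each group has comb(n, i) * comb(m, k - i) elements: choose i factors
# from V and the remaining k - i from W.
for i, group in sorted(by_V_count.items()):
    assert len(group) == comb(n, i) * comb(m, k - i)

assert len(basis) == comb(n + m, k)   # the Vandermonde identity
```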
These questions can be answered in the case of real spaces, which is enough for my purposes. Anyone interested can consult the article "Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem" by Vin de Silva and Lek-Heng Lim.
The first result is Lemma 3.5, which answers the first question positively.
Lemma 3.5. For $\ell = 1, \ldots, k$, let $x_1^{(\ell)}, \ldots, x_r^{(\ell)} \in \mathbb{R}^{d_\ell}$ be linearly independent. Then the tensor defined by
$$ \sum_{j=1}^r x_j^{(1)} \otimes x_j^{(2)} \otimes \ldots \otimes x_j^{(k)}$$
has rank $r$.
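For $k = 3$ the lemma is easy to sanity-check numerically. The sketch below (random data and numpy; my illustration, not the paper's code) builds the tensor from the lemma: the construction shows its rank is at most $r$, while the matrix rank of the mode-1 unfolding, which never exceeds the tensor rank, certifies it is at least $r$.

```python
import numpy as np

rng = np.random.default_rng(0)
r, d1, d2, d3 = 4, 5, 6, 7

# Row j of F_l plays the role of x_j^(l); random rows of a full-rank
# matrix are linearly independent with probability 1.
F1 = rng.standard_normal((r, d1))
F2 = rng.standard_normal((r, d2))
F3 = rng.standard_normal((r, d3))

# T = sum_j x_j^(1) (x) x_j^(2) (x) x_j^(3), so rank(T) <= r by construction.
T = np.einsum('ja,jb,jc->abc', F1, F2, F3)

# The matrix rank of any unfolding lower-bounds the tensor rank; here it
# equals r, so rank(T) = r, as the lemma asserts.
print(np.linalg.matrix_rank(T.reshape(d1, d2 * d3)))   # prints 4
```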
The second result is Proposition 4.6, which I partially reproduce below; it answers the second question negatively.
Proposition 4.6. Let $x_1, y_1 \in \mathbb{R}^{d_1}, x_2,y_2 \in \mathbb{R}^{d_2}$, and $x_3, y_3 \in \mathbb{R}^{d_3}$ be vectors such that each pair $x_i,y_i$ is linearly independent. Then the tensor
$$x_1 \otimes x_2 \otimes y_3 + x_1 \otimes y_2 \otimes x_3 + y_1 \otimes x_2 \otimes x_3$$
has rank 3.
The tensor has rank 3, yet the decomposition uses $x_1$ as the first factor twice; so a minimal decomposition need not have linearly independent factors in $\mathbb{R}^{d_1}$. The same goes for the other positions.
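Incidentally, this tensor is also the paper's basic example of ill-posedness: it has rank 3 but is a limit of rank-2 tensors, so no best rank-2 approximation exists. A small numpy sketch of that limit (the random vectors are my choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
x1, y1 = rng.standard_normal((2, 3))
x2, y2 = rng.standard_normal((2, 4))
x3, y3 = rng.standard_normal((2, 5))

def outer(a, b, c):
    return np.einsum('i,j,k->ijk', a, b, c)

# The rank-3 tensor from Proposition 4.6.
T = outer(x1, x2, y3) + outer(x1, y2, x3) + outer(y1, x2, x3)

for n in [1, 10, 100, 1000]:
    # T_n has rank at most 2 for every n, yet T_n -> T as n grows.
    Tn = n * outer(x1 + y1/n, x2 + y2/n, x3 + y3/n) - n * outer(x1, x2, x3)
    print(n, np.linalg.norm(Tn - T))   # error decays like 1/n
```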
Best Answer
For the purposes of the following, I identify $\sum_{ijk}x_{ijk} v_{i} \otimes v_j \otimes v_k$ with a 3-dimensional array $X$. For your question over a 2-dimensional $V$, we have $$ X = [X_1 |X_2] = \left[ \begin{array}{cc|cc} x_{111} & x_{121} & x_{112} & x_{122}\\ x_{211} & x_{221} & x_{212} & x_{222} \end{array}\right]. $$
Question 1:
For your problem, we have $$ X_1 = \pmatrix{1&0\\0&0}, \quad X_2 = \pmatrix{0&1\\1&0}. $$ We will simply apply the result explained below. Exchanging the slices of a tensor does not change its rank, so we may swap the roles of $X_1$ and $X_2$; this is convenient because $X_2$ is the invertible slice. We find that $$ X_1X_2^{-1} = \pmatrix{0&1\\0&0}. $$ This matrix is nonzero and nilpotent, hence not diagonalizable, so by the result below $X$ has rank greater than $2$; since a $2 \times 2 \times 2$ tensor has rank at most $3$, $X$ must be a tensor of rank $3$.
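As a quick numerical sanity check (a minimal numpy sketch; the arrays mirror the slices above), the quotient of the slices is a nonzero nilpotent matrix, which is never diagonalizable:

```python
import numpy as np

X1 = np.array([[1., 0.], [0., 0.]])
X2 = np.array([[0., 1.], [1., 0.]])

M = X1 @ np.linalg.inv(X2)   # roles exchanged: X2 is the invertible slice
print(M)                     # [[0. 1.], [0. 0.]]
print(M @ M)                 # the zero matrix: M is nonzero and nilpotent,
                             # hence not diagonalizable, so rank(X) = 3
```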
Question 2:
Your second problem can be approached similarly. We now have $$ X_1 = \pmatrix{1&0\\0&-1},\quad X_2 = \pmatrix{0&1\\1&0}. $$ Applying the result below, we find that $$ X_2 X_1^{-1} = \pmatrix{0&-1\\1&0}. $$ Because this matrix is diagonalizable over $\Bbb C$ but has the non-real eigenvalues $\pm i$, we can conclude that $X$ has rank at least $3$ if we restrict ourselves to real coefficients, and rank $2$ if we allow complex coefficients.
It remains to be shown, however, that the rank of $X$ over $\Bbb R$ is not more than $3$. There is an explicit real rank-3 decomposition, essentially Gauss's three-multiplication trick for multiplying complex numbers: $$ X = e_1 \otimes e_1 \otimes (e_1 - e_2) + e_2 \otimes e_2 \otimes (-e_1 - e_2) + (e_1 + e_2) \otimes (e_1 + e_2) \otimes e_2, $$ as one can verify slice by slice.
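Both the eigenvalue computation and the decomposition just given are easy to check with numpy (a sketch under my conventions; the third tensor index selects the slice):

```python
import numpy as np

X1 = np.array([[1., 0.], [0., -1.]])
X2 = np.array([[0., 1.], [1., 0.]])
X = np.stack([X1, X2], axis=2)           # X[:, :, 0] = X1, X[:, :, 1] = X2

# Non-real eigenvalues +/- i: no real diagonalization, so real rank >= 3.
print(np.linalg.eigvals(X2 @ np.linalg.inv(X1)))   # [0.+1.j 0.-1.j]

# Verify the explicit real rank-3 (Gauss) decomposition term by term.
e1, e2 = np.array([1., 0.]), np.array([0., 1.])
terms = [(e1, e1, e1 - e2), (e2, e2, -e1 - e2), (e1 + e2, e1 + e2, e2)]
T = sum(np.einsum('i,j,k->ijk', a, b, c) for a, b, c in terms)
print(np.allclose(T, X))                 # True
```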
Regarding the presentation of $X$ as a rank-2 tensor over $\Bbb C$: if we follow the construction from the proof I present below, then we note that $X_2 X_1^{-1} = K\Lambda K^{-1}$, where $$ \Lambda = \pmatrix{i\\&-i}, \quad K = \pmatrix{i&1\\1&i}. $$ So, we find that $$ X = a_1 \otimes b_1 \otimes c_1 + a_2 \otimes b_2 \otimes c_2 $$ where $a_1,a_2$ are the columns of $K$, $b_1,b_2$ are the rows of $K^{-1}X_1$, and we have $c_1 = (1,i), c_2 = (1,-i)$.
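Again, this complex rank-2 decomposition can be verified numerically (same conventions as in the previous sketch):

```python
import numpy as np

X1 = np.array([[1., 0.], [0., -1.]])
X2 = np.array([[0., 1.], [1., 0.]])
X = np.stack([X1, X2], axis=2)

K = np.array([[1j, 1.], [1., 1j]])       # a_1, a_2 are the columns of K
BT = np.linalg.inv(K) @ X1               # b_1, b_2 are the rows of K^{-1} X_1
c = np.array([[1., 1j], [1., -1j]])      # c_1 = (1, i), c_2 = (1, -i)

T = sum(np.einsum('i,j,k->ijk', K[:, j], BT[j, :], c[j]) for j in range(2))
print(np.allclose(T, X))                 # True: rank 2 over C
```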
Now, here is an adaptation of the statement and proof of Lemma 1 of this paper.
Lemma. Let $X$ be a $p \times p \times 2$ tensor with slices $X_1, X_2$, where $X_1$ is invertible. Then: i) if $X_2X_1^{-1}$ is diagonalizable over $\Bbb F \in \{\Bbb R, \Bbb C\}$, then $X$ has rank $p$ over $\Bbb F$; ii) if $X$ has rank $p$ over $\Bbb C$, then $X_2X_1^{-1}$ is diagonalizable over $\Bbb C$; iii) if $X$ has rank $p$ over $\Bbb R$, then $X_2X_1^{-1}$ is diagonalizable over $\Bbb R$.
Proof: First, note that since $X_1$ is invertible, the rank of $X$ must be at least $p$: any decomposition of $X$ into $r$ rank-1 tensors expresses the slice $X_1$ as a sum of $r$ rank-1 matrices, so $r \geq \operatorname{rank}(X_1) = p$.
Proof of i: Now, suppose that we have $X_2 X_1^{-1} = K \Lambda K^{-1}$, where $\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_p)$. If we take $$ A = K, \quad B^T = K^{-1}X_1, \quad C_1 = I_p, \quad C_2 = \Lambda, $$ then we find that $$ X_1 = AC_1B^T, \quad X_2 = AC_2B^T. $$ This corresponds to a rank-$p$ decomposition of the tensor $X$. In particular: if we take $a_j$ to denote the $j$th column of $A$ and $b_j$ to denote the $j$th column of $B$, then we have $$ X_1 = AC_1B^T = \sum_{j=1}^p c_{1,j} \, a_jb_j^T, \quad X_2 = AC_2B^T = \sum_{j=1}^p c_{2,j} \, a_j b_j^T. $$ Correspondingly, we have $$ X = \left(\sum_{j=1}^p c_{1,j} \, a_j \otimes b_j\right) \otimes e_1 + \left(\sum_{j=1}^p c_{2,j} \, a_j \otimes b_j\right) \otimes e_2\\ = \sum_{j=1}^p a_j \otimes b_j \otimes (c_{1,j} e_1) + \sum_{j=1}^p a_j \otimes b_j \otimes (c_{2,j}e_2)\\ = \sum_{j=1}^p a_j \otimes b_j \otimes (c_{1,j}e_1 + c_{2,j}e_2). $$ In the above, $e_1 = (1,0)$ and $e_2 = (0,1)$. Together with the lower bound above, $X$ has rank exactly $p$.
Proof of ii and iii: It suffices to prove that if $X$ is a rank-$p$ tensor and $X_1$ is invertible, then $X_2X_1^{-1}$ must be diagonalizable. Indeed, if $X$ is a rank-$p$ tensor, then we can take $$ X_1 = AC_1B^T, \quad X_2 = AC_2B^T $$ with $C_1, C_2$ diagonal, by reversing the above sequence of equations. Since $X_1 = AC_1B^T$ is invertible and $A$, $B$, $C_1$ are $p \times p$, each of these factors is invertible. It follows that $$ X_2X_1^{-1} = (AC_2B^T)(AC_1B^T)^{-1} = AC_2 B^T B^{-T} C_1^{-1} A^{-1} = A (C_2 C_1^{-1})A^{-1}, $$ where $C_2C_1^{-1}$ is diagonal. So, $X_2X_1^{-1}$ is indeed diagonalizable (and diagonalizable over $\Bbb R$ when $A,B,C$ are real). The conclusion follows.
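The proof of i is constructive and translates directly into code. Here is a minimal numpy sketch (the function name `rank_p_factors` is mine, not from the paper): given slices with $X_1$ invertible and $X_2X_1^{-1}$ diagonalizable, it returns the rank-1 factors of a rank-$p$ decomposition.

```python
import numpy as np

def rank_p_factors(X1, X2):
    """Construction from the proof of i: factors (a_j, b_j, c_j) with
    X = sum_j a_j (x) b_j (x) c_j, assuming X1 is invertible and
    X2 @ inv(X1) is diagonalizable."""
    lam, K = np.linalg.eig(X2 @ np.linalg.inv(X1))  # X2 X1^{-1} = K diag(lam) K^{-1}
    BT = np.linalg.inv(K) @ X1                      # B^T = K^{-1} X1
    # c_j = c_{1,j} e_1 + c_{2,j} e_2 = (1, lambda_j), since C_1 = I, C_2 = Lambda.
    return [(K[:, j], BT[j, :], np.array([1.0, lam[j]])) for j in range(len(lam))]

# Usage on arbitrary slices satisfying the hypotheses.
X1 = np.array([[2., 1.], [0., 1.]])
X2 = np.array([[1., 0.], [1., 3.]])
X = np.stack([X1, X2], axis=2)
T = sum(np.einsum('i,j,k->ijk', a, b, c) for a, b, c in rank_p_factors(X1, X2))
print(np.allclose(T, X))   # True
```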