About part e): both the trace and the determinant are coefficients of the characteristic polynomial of $L$, which in this case has degree $3 = \dim V$. There is one coefficient you have not used yet, and perhaps that is the bilinear invariant you are looking for.
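To spell the hint out: for $\dim V = 3$ the characteristic polynomial has exactly one coefficient besides the trace and the determinant, namely the sum of the principal $2 \times 2$ minors,

$$\chi_L(t) = \det(tI - L) = t^3 - \operatorname{tr}(L)\,t^2 + c_2(L)\,t - \det(L), \qquad c_2(L) = \tfrac{1}{2}\!\left[(\operatorname{tr} L)^2 - \operatorname{tr}(L^2)\right].$$

The second expression for $c_2(L)$ makes it manifestly a polynomial in the entries of $L$ that is invariant under conjugation, like the trace and determinant.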
These questions can be answered in the case of real spaces, which is enough for my purposes. Anyone interested can check the article *Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem* by Vin de Silva and Lek-Heng Lim.
The first result is Lemma 3.5, which answers the first question positively.
Lemma 3.5. For $\ell = 1, \ldots, k$, let $x_1^{(\ell)}, \ldots, x_r^{(\ell)} \in \mathbb{R}^{d_\ell}$ be linearly independent. Then the tensor defined by
$$ \sum_{j=1}^r x_j^{(1)} \otimes x_j^{(2)} \otimes \ldots \otimes x_j^{(k)}$$
has rank $r$.
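Although tensor rank is hard to compute in general, a checkable consequence of the lemma is easy to see numerically: the mode-1 unfolding of such a tensor is a matrix whose rank is at most the tensor rank, and with linearly independent factors it already equals $r$. A sketch with numpy (random vectors, which are linearly independent almost surely):

```python
import numpy as np

rng = np.random.default_rng(0)
r, d = 3, 4  # r independent vectors in each R^d (we need r <= d)

X1, X2, X3 = (rng.standard_normal((r, d)) for _ in range(3))

# T[a,b,c] = sum_j x_j^(1)[a] * x_j^(2)[b] * x_j^(3)[c]
T = np.einsum('ja,jb,jc->abc', X1, X2, X3)

# The decomposition shows rank(T) <= r.  For the lower bound, the
# mode-1 unfolding T_(1) is a d x d^2 matrix with matrix rank <= rank(T),
# and here its rank is exactly r because the factors are independent.
T1 = T.reshape(d, d * d)
print(np.linalg.matrix_rank(T1))  # 3
```

This only certifies $\operatorname{rank}(T) \ge r$; combined with the explicit length-$r$ decomposition it pins the rank down to exactly $r$, which is what the lemma asserts in general.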
The second is Proposition 4.6, which I partially reproduce below; it answers the second question negatively.
Proposition 4.6. Let $x_1, y_1 \in \mathbb{R}^{d_1}, x_2,y_2 \in \mathbb{R}^{d_2}$, and $x_3, y_3 \in \mathbb{R}^{d_3}$ be vectors such that each pair $x_i,y_i$ is linearly independent. Then the tensor
$$x_1 \otimes x_2 \otimes y_3 + x_1 \otimes y_2 \otimes x_3 + y_1 \otimes x_2 \otimes x_3$$
has rank 3.
The tensor has rank 3, yet the decomposition uses $x_1$ as the first factor twice, so a minimal decomposition need not have linearly independent factors in $\mathbb{R}^{d_1}$. The same goes for the other positions.
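As an aside, this same tensor is the standard example behind the paper's title: although its rank is 3, it can be approximated arbitrarily well by rank-2 tensors, which is what makes the best low-rank approximation problem ill-posed. A numeric sketch (random vectors; expanding the product shows the error is $O(1/n)$):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3
x1, y1, x2, y2, x3, y3 = (rng.standard_normal(d) for _ in range(6))

def outer3(a, b, c):
    return np.einsum('a,b,c->abc', a, b, c)

# The rank-3 tensor of Proposition 4.6.
T = outer3(x1, x2, y3) + outer3(x1, y2, x3) + outer3(y1, x2, x3)

# A sequence of rank-2 tensors converging to T:
# n * [(x1 + y1/n) ⊗ (x2 + y2/n) ⊗ (x3 + y3/n)  -  x1 ⊗ x2 ⊗ x3]
# equals T plus terms of order 1/n.
for n in (1e2, 1e4, 1e6):
    Tn = n * (outer3(x1 + y1/n, x2 + y2/n, x3 + y3/n) - outer3(x1, x2, x3))
    print(np.linalg.norm(Tn - T))  # shrinks like 1/n
```

Each `Tn` is a difference of two decomposable tensors, hence has rank at most 2, yet the limit has rank 3: the set of tensors of rank $\le 2$ is not closed.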
Best Answer
Let's simplify this a bit and take a $(1,1)$ tensor over a real vector space $V$. We might take this, by definition, to be a bilinear map $V\times V^* \to \mathbb R.$ So we can think of it as taking a vector and a covector argument and looking like $T(v,v^*) = a.$
However, we can also think of it as a linear map $V\to V,$ or as a linear map $V^*\to V^*.$ To see how this works, fix $v^*\in V^*$ and consider $T(v,v^*)$ as a function of $v.$ This is a linear map that takes a vector $v$ and produces a real number; in other words, it is a covector. So we can also define the tensor as the map which takes a covector $v^*$ and returns the covector $T(\_,v^*)$ we just discussed. This is a way of viewing it as a linear map $V^*\to V^*.$ Similarly, we can view it as $v\mapsto T(v,\_),$ a map from $V\to V.$ (Recall that a linear map that takes a covector and produces a real number is just a vector, via the canonical identification $V \cong V^{**}$.)
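In a basis, the three viewpoints are just the three ways of multiplying with the same matrix. A minimal sketch, assuming we represent the $(1,1)$ tensor by a matrix `A`, a vector by a column, and a covector by a row:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))   # the (1,1) tensor in a basis
v = rng.standard_normal(3)        # a vector
w = rng.standard_normal(3)        # components of a covector

bilinear     = w @ A @ v   # T(v, v*): vector and covector in, scalar out
as_map_V     = A @ v       # V  -> V : v  |-> T(v, _), a vector
as_map_Vstar = w @ A       # V* -> V*: v* |-> T(_, v*), a covector

# Pairing either intermediate result with the remaining argument
# recovers the same scalar.
print(np.isclose(bilinear, w @ as_map_V))       # True
print(np.isclose(bilinear, as_map_Vstar @ v))   # True
```

The point is that no information is lost in any of the three readings: they are all encoded by the same array of components.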
This reasoning works for higher ranks as well. An $(n,m)$ tensor can be viewed as a multilinear map $V^m\times (V^*)^n \to \mathbb R,$ or as a map that takes a vector (or another tensor) and contracts it with the tensor to produce a lower-rank tensor. For instance, if you give an $(n,m)$ tensor a vector, it will act linearly on that vector and return an $(n,m-1)$ tensor. In other words, it's equally well viewed as a linear map $$ V\to (V^*)^{\otimes (m-1)}\otimes V^{\otimes n}.$$
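In components, "feeding the tensor a vector" is just a contraction over one index. A sketch with numpy, taking a $(1,2)$ tensor as an example:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 3
# A (1,2) tensor: one contravariant index i, two covariant indices j, k.
T = rng.standard_normal((d, d, d))
v = rng.standard_normal(d)

# Contracting the last covariant slot with v leaves a (1,1) tensor,
# i.e. a linear map V -> V represented by a d x d matrix:
M = np.einsum('ijk,k->ij', T, v)
print(M.shape)  # (3, 3)
```

Feeding `M` a second vector would contract the remaining covariant index and return a vector, exactly the pattern described above: each argument supplied lowers the covariant rank by one.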