If $V$ and $W$ are vector spaces, you can form a third vector space from them called their tensor product $V \otimes W$. The tensor product consists of finite sums of certain vectors called "pure tensors," which are written $v \otimes w$ where $v \in V, w \in W$, subject to certain relations, e.g. $(v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w$. For the complete list of these relations see Wikipedia. In practice you'll do fine if you remember the following:
If $v_1, \dots, v_n$ is a basis of $V$ and $w_1, \dots, w_m$ is a basis of $W$, then the pure tensors $v_i \otimes w_j$, $1 \le i \le n$, $1 \le j \le m$, form a basis of $V \otimes W$. In particular, $\dim(V \otimes W) = (\dim V)(\dim W)$.
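A quick numerical sanity check of this fact (a NumPy sketch; the variable names are mine): in coordinates, the pure tensor $v \otimes w$ corresponds to the Kronecker product `np.kron(v, w)` of coordinate vectors.

```python
import numpy as np

n, m = 2, 3
V_basis = np.eye(n)   # rows are the basis vectors v_1, ..., v_n of V
W_basis = np.eye(m)   # rows are the basis vectors w_1, ..., w_m of W

# The pure tensor v_i (x) w_j corresponds to np.kron(v_i, w_j) in coordinates.
pure_tensors = np.array([np.kron(v, w) for v in V_basis for w in W_basis])

# The n*m pure tensors are linearly independent, hence a basis of V (x) W,
# confirming dim(V (x) W) = dim V * dim W.
assert np.linalg.matrix_rank(pure_tensors) == n * m
```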
If $T : V_1 \to V_2$ and $S : W_1 \to W_2$ are two linear maps, you can form a third linear map from them which is also called their tensor product
$$T \otimes S : V_1 \otimes W_1 \to V_2 \otimes W_2.$$
It is completely determined by its behavior on pure tensors, namely
$$(T \otimes S)(v \otimes w) = T(v) \otimes S(w).$$
The relationship between these two uses of the term "tensor product" is given formally by the notion of a functor.
Tensor product notation for linear maps is compatible with the notation $v \otimes w$ for pure tensors in the following sense. A vector $v \in V$ in a vector space is the same thing as a linear map $v : 1 \to V$ from the one-dimensional vector space $1$ given by the underlying field to $V$, and if $v : 1 \to V$ and $w : 1 \to W$ are two vectors in $V, W$, then their tensor product as linear maps $v \otimes w : 1 \otimes 1 \to V \otimes W$ corresponds to the pure tensor $v \otimes w$, where we use that there's a canonical isomorphism $1 \otimes 1 \cong 1$.
The Kronecker product is a description of the tensor product of linear maps with respect to a choice of basis for all of the vector spaces involved. Formally, with notation as above, if
- $B_1, B_2$ are bases for $V_1, V_2$,
- $C_1, C_2$ are bases for $W_1, W_2$,
- given bases $B_i, C_i$ of $V_i, W_i$, we write $B_i \otimes C_i$ for the corresponding basis of $V_i \otimes W_i$ of pure tensors as described above, and
- we write $_{B_2}[T]_{B_1}$ to refer to the matrix of a linear transformation $T : V_1 \to V_2$ with respect to a basis $B_1$ of $V_1$ and a basis $B_2$ of $V_2$,
then we have
$$_{B_2 \otimes C_2}[T \otimes S]_{B_1 \otimes C_1} = \, _{B_2}[T]_{B_1} \otimes \, _{C_2}[S]_{C_1}$$
where on the LHS $\otimes$ means the tensor product of linear maps and on the RHS $\otimes$ means the Kronecker product.
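This compatibility is easy to check numerically. A sketch using NumPy (`np.kron` computes the Kronecker product; the dimensions and variable names here are arbitrary choices of mine): applying the Kronecker product of the matrices to the Kronecker product of coordinate vectors agrees with $(T \otimes S)(v \otimes w) = T(v) \otimes S(w)$.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((2, 3))   # matrix of T : V1 -> V2 (dim 3 -> dim 2)
S = rng.standard_normal((4, 5))   # matrix of S : W1 -> W2 (dim 5 -> dim 4)
v = rng.standard_normal(3)        # coordinates of v in V1
w = rng.standard_normal(5)        # coordinates of w in W1

# (T (x) S)(v (x) w) = T(v) (x) S(w), expressed via Kronecker products:
lhs = np.kron(T, S) @ np.kron(v, w)
rhs = np.kron(T @ v, S @ w)
assert np.allclose(lhs, rhs)
```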
One final remark: the definition of spaces of tensors you give in 2) is a terrible definition that I've only seen in some textbooks on differential geometry. It is absolutely the wrong way to think about tensors.
Arrange the vectors as columns:
$$
\left(
\begin{array}{rrrr}
1&-1&-1&-1 \\
1& 1&-1&-1 \\
1&0 &2&-1 \\
1&0&0&3
\end{array}
\right)
$$
The pattern, done correctly, works in any dimension:
$$
\left( \begin{array}{rrrrrrrrrr}
1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 \\
1 & 1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 \\
1 & 0 & 2 & -1 & -1 & -1 & -1 & -1 & -1 & -1 \\
1 & 0 & 0 & 3 & -1 & -1 & -1 & -1 & -1 & -1 \\
1 & 0 & 0 & 0 & 4 & -1 & -1 & -1 & -1 & -1 \\
1 & 0 & 0 & 0 & 0 & 5 & -1 & -1 & -1 & -1 \\
1 & 0 & 0 & 0 & 0 & 0 & 6 & -1 & -1 & -1 \\
1 & 0 & 0 & 0 & 0 & 0 & 0 & 7 & -1 & -1 \\
1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 8 & -1 \\
1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 9
\end{array}
\right).
$$
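One can check numerically that the columns of this pattern are mutually orthogonal in any dimension. A NumPy sketch (the function name `orthogonal_pattern` is mine): column $1$ is all ones, and column $j$ (for $j \ge 2$) has $-1$ in the first $j-1$ rows, the value $j-1$ in row $j$, and zeros below.

```python
import numpy as np

def orthogonal_pattern(n):
    # Column 1 is all ones; column j (j >= 2, 1-indexed) has -1 in
    # rows 1..j-1, the value j-1 in row j, and zeros below.
    A = np.zeros((n, n))
    A[:, 0] = 1
    for j in range(1, n):
        A[:j, j] = -1
        A[j, j] = j
    return A

A = orthogonal_pattern(10)
# The columns are mutually orthogonal, so A^T A is a diagonal matrix.
G = A.T @ A
assert np.allclose(G, np.diag(np.diag(G)))
```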
The trace of a type $(1,1)$ tensor $T^i_j\varepsilon^j\otimes e_i$ is by definition the contraction $T^i_i$. I interpret the question as "Show that $T^i_i$ is a scalar". This is also true by definition since the quantities obtained by contraction of a type $(r,s)$-tensor constitute the components of a tensor of type $(r-1,s-1)$. But we can at least verify it for clarity.
To verify that the trace is a scalar we can use the trick "If it looks like a tensor, swims like a tensor, and quacks like a tensor, then it is a tensor."
The components of a type $(1,1)$-tensor transform like
$$\bar{T}^u_w=\frac{\partial \bar{x}^u}{\partial x^i}\frac{\partial x^j}{\partial \bar{x}^w}T^i_j.$$
Setting $w=u$ and summing, the trace $\bar{T}^u_u$ is
$$\bar{T}^u_u=\frac{\partial \bar{x}^u}{\partial x^i}\frac{\partial x^j}{\partial \bar{x}^u}T^i_j=\delta^j_iT^i_j=T^i_i.$$
So the trace is indeed a $(0,0)$-tensor, i.e. a scalar, sometimes referred to as an invariant (it is invariant with respect to a change of basis). Notice that this concept of a scalar might be different from what you are used to in linear algebra, where pretty much any real number is considered a "scalar".
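For a $(1,1)$-tensor the transformation law above is matrix conjugation, $\bar{T} = P^{-1} T P$, so the invariance of the trace can be checked numerically. A NumPy sketch (random matrices; a random $P$ is invertible with probability one, which I assume here):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))      # components T^i_j in the old basis
P = rng.standard_normal((3, 3))      # change-of-basis (Jacobian) matrix, assumed invertible
T_bar = np.linalg.inv(P) @ T @ P     # components in the new basis

# The trace is invariant under the change of basis:
assert np.isclose(np.trace(T_bar), np.trace(T))
```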
Now with the components $T^i_j=u^iv_j$ in 3 dimensions, obviously $T^k_k=u^1v_1+u^2 v_2+u^3v_3$, which may be interpreted as the dot product $\mathbf{u}\cdot \mathbf{v}=g_{ij}u^iv^j$, since $v_j=g_{ij}v^i$.
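As a final sanity check (a NumPy sketch with sample vectors of my choosing, in Euclidean coordinates where $g_{ij}=\delta_{ij}$): the trace of the outer product $T^i_j = u^i v_j$ equals the dot product.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

T = np.outer(u, v)                     # components T^i_j = u^i v_j
# The contraction T^k_k is u.v = 4 + 10 + 18 = 32:
assert np.isclose(np.trace(T), u @ v)
```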