Your proof looks correct to me, and I don't see anything wrong in the rest of what you said either.
I prefer not to think of $T(V)$ in terms of sequences like this. For me, an element of $T(V)$ is a linear combination of tensors of possibly mixed degree; to convert a sequence into this, just add up the terms, which you can do because all but finitely many are zero.
Then the tensor product can essentially be defined to be distributive. You already know how to take the tensor product of two pure tensors, so to take the product of two combinations, just expand by the distributive law and then use the definition for pure tensors.
As an analogy, this is kind of like thinking of the polynomial ring $\mathbb{C}[x]$ as:
$$\mathbb{C}[x]=\bigoplus_{i=0}^\infty\mathbb{C}\langle x^i\rangle$$
where $\mathbb{C}\langle x^i\rangle$ is the one-dimensional vector space spanned by $x^i$. Then we can define $x^i\cdot x^j=x^{i+j}$ and extend this product to $\mathbb{C}[x]$ uniquely by insisting it is multilinear.
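The analogy can be made concrete in a few lines. Here is a minimal sketch (the representation and function name are my own choices, not anything from the discussion above): a polynomial is stored as a finitely supported sequence of coefficients, one per degree, and the product is obtained exactly as described, by expanding with the distributive law and using $x^i\cdot x^j=x^{i+j}$ on the "pure" terms.

```python
# Represent an element of C[x] as a dict {degree: coefficient}, i.e. a
# finitely supported element of the direct sum of the spans of x^i.
def poly_mul(p, q):
    """Multiply two polynomials: expand by distributivity, then use
    x^i * x^j = x^(i+j) on each pair of pure terms."""
    result = {}
    for i, a in p.items():
        for j, b in q.items():
            result[i + j] = result.get(i + j, 0) + a * b
    return {d: c for d, c in result.items() if c != 0}

# (1 + 2x) * (3 + x^2) = 3 + 6x + x^2 + 2x^3
p = {0: 1, 1: 2}
q = {0: 3, 2: 1}
print(poly_mul(p, q))  # {0: 3, 1: 6, 2: 1, 3: 2}
```

The same scheme works for $T(V)$ with tuples of basis indices in place of degrees, concatenation in place of addition of exponents.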
That's all a question of indexing. Recall that $\mathbb C^{n,n}$ is just a vector space of dimension $n \cdot n = n^2.$ So $v_i\otimes w_j$ can perfectly well be multiplied by a matrix from $\mathbb C^{n^2,n^2}.$
More technically, it works as follows. Let $B := \{e_j\ |\ 1 \leq j \leq n \}$ be a basis of $\mathbb C^n,$ the standard basis, if you want. Then $C:= \{e_j\otimes e_k\ |\ 1 \leq j,k \leq n \}$ is a basis of $\mathbb C^n\otimes\mathbb C^n.$ Then the general element $x$ of $\mathbb C^n\otimes\mathbb C^n$ can be written as $x = \sum_{j,k}\xi_{jk}e_j\otimes e_k,$ so its coefficient vector relative to the basis $C$ is $(\xi_{jk}).$ Note that we have two indices, not just one. You can view it as a matrix, as you remarked above.
Now, let $E_{jk} \in \mathbb C^{n,n}$ be the matrix with $1$ at the intersection of the $j$-th row and the $k$-th column and $0$ elsewhere. Then $D:= \{E_{jk}\ |\ 1\leq j,k\leq n\}$ is a basis of $\mathbb C^{n,n}.$ As above, $E:= \{E_{jk}\otimes E_{pq}\ |\ 1\leq j,k,p,q\leq n\}$ is a basis of $\mathbb C^{n,n}\otimes \mathbb C^{n,n}.$ The general element $M$ of $\mathbb C^{n,n}\otimes \mathbb C^{n,n}$ can be written as $M = \sum_{j,k,p,q}\mu_{jkpq}E_{jk}\otimes E_{pq}$ and has coefficient "matrix" $(\mu_{jkpq}).$ Here we have four indices instead of the usual two.
With all this notation, the coefficient vector of $Mx$ is
$$
\left(\sum_{k,q}\mu_{jkpq} \xi_{kq}\right)
$$
with free indices $j$ and $p$. This is very similar to the usual matrix-vector multiplication, except that each dimension carries a pair of indices rather than a single index. However, note the "intertwining" of indices: the indices of $\xi$ are contracted with the second and fourth indices of $\mu,$ not the third and fourth.
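This bookkeeping is easy to check numerically. Here is a sketch with numpy (variable names are mine): the four-index coefficient array $\mu_{jkpq}$ acts on the two-index array $\xi_{kq}$ by contracting exactly the second and fourth indices, and flattening each index pair recovers an ordinary matrix-vector product in $\mathbb C^{n^2}$.

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
mu = rng.standard_normal((n, n, n, n))  # coefficients mu_{jkpq} of M
xi = rng.standard_normal((n, n))        # coefficients xi_{kq} of x

# (Mx)_{jp} = sum_{k,q} mu_{jkpq} xi_{kq}: xi's indices contract with the
# SECOND and FOURTH indices of mu, not the third and fourth.
Mx = np.einsum('jkpq,kq->jp', mu, xi)

# Flattening each index pair gives ordinary matrix-vector multiplication:
# rows indexed by (j,p), columns by (k,q).
M_flat = mu.transpose(0, 2, 1, 3).reshape(n * n, n * n)
x_flat = xi.reshape(n * n)
assert np.allclose(M_flat @ x_flat, Mx.reshape(n * n))
```

The `transpose(0, 2, 1, 3)` is precisely the "intertwining": to get a genuine $n^2\times n^2$ matrix, the row pair must be $(j,p)$ and the column pair $(k,q)$.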
Finally, let's consider the Kronecker products of $S = (\sigma_{jk}) \in \mathbb C^{n,n},$ $T = (\tau_{pq}) \in \mathbb C^{n,n},$ $x = (\xi_k) \in \mathbb C^n,$ and $y = (\eta_q) \in \mathbb C^n.$ We have
$$
S\otimes T = \left(\sigma_{jk}\tau_{pq}\right) \in \mathbb C^{n,n}\otimes \mathbb C^{n,n}
$$
and
$$
x\otimes y = \left(\xi_k\eta_q\right) \in \mathbb C^n \otimes \mathbb C^n.
$$
From this, we get
$$
\begin{align}
(S\otimes T)(x\otimes y) & = \left(\sum_{k,q}\sigma_{jk}\tau_{pq}\xi_k\eta_q\right) \\
& = \left(\left(\sum_k\sigma_{jk}\xi_k\right)\left(\sum_q\tau_{pq}\eta_q\right)\right) \\
& = (Sx)\otimes(Ty) \in \mathbb C^n \otimes \mathbb C^n.
\end{align}
$$
We find that everything fits nicely together.
Fix a basis $\{e_1, \ldots, e_n\}$ of $V$, and consider the dual basis $\{f_1, \ldots, f_n \}$ of $V^\ast$. Then we have a basis $$\{e_1\otimes f_1,\ldots, e_i \otimes f_j, \ldots, e_n \otimes f_n\}$$ for $V \otimes V^\ast$, and the matrix $$A = (a_{ij})$$ is just a way of representing the element $$\sum_{i=1}^n \sum_{j=1}^n a_{ij} \; e_i \otimes f_j \in V \otimes V^\ast.$$
Of course an element of $V \otimes V^\ast$ gives a linear map $V \to V$ by
$$(w \otimes f)(v) := f(v) w$$
and extending by linearity. Given two such elements, we can compose the corresponding functions:
$$(w' \otimes f')(w \otimes f)(v) = (w' \otimes f')(f(v) w) = f(v) f'(w) w' = f'(w) \; (w' \otimes f)(v)$$
so composition of linear maps is given by
$$(w' \otimes f') \circ (w \otimes f) = f'(w) \; (w' \otimes f)$$
extended by linearity. If you write your elements in the $e_i \otimes f_j$ basis and apply this operation to them, you'll see that the usual definition of matrix multiplication pops right out.
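To see the claim in coordinates, one can identify $w \otimes f$ with the rank-one matrix whose $(i,j)$ entry is $w_i f_j$ (in numpy, `np.outer(w, f)`). The composition rule $(w' \otimes f') \circ (w \otimes f) = f'(w)\,(w' \otimes f)$ then becomes a statement about products of rank-one matrices; a quick sketch (names are mine):

```python
import numpy as np

n = 3
rng = np.random.default_rng(2)
w, f = rng.standard_normal(n), rng.standard_normal(n)
wp, fp = rng.standard_normal(n), rng.standard_normal(n)

# (w' ⊗ f') ∘ (w ⊗ f) as a product of rank-one matrices...
lhs = np.outer(wp, fp) @ np.outer(w, f)
# ...equals the scalar f'(w) times the rank-one matrix w' ⊗ f.
rhs = np.dot(fp, w) * np.outer(wp, f)
assert np.allclose(lhs, rhs)
```

Extending both sides bilinearly over sums $\sum_{ij} a_{ij}\, e_i \otimes f_j$ is exactly where the matrix multiplication formula pops out.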
Of course all the calculations with explicit tensors above can be rephrased in terms of the universal property of the tensor product if you like.
This is all assuming you want the matrix to represent an element of $V \otimes V^\ast$ rather than an element of $V \otimes V$ or $V^\ast \otimes V^\ast$. But you can work out what should happen in those cases in the same way.