Yes, it is true independently of the cardinality of the bases of the vector spaces. Use the universal property: let $T$ be any vector space over the given field.
A bilinear map $V \times W \to T$ is uniquely determined by the images of the pairs $(v_i,w_j)_{i \in I,j \in J}$, so we get
$$\operatorname{Bil}(V \times W, T) = \operatorname{Map}(\{(v_i,w_j) : i \in I, j \in J\},T)= \operatorname{Map}(\{v_i \otimes w_j : i \in I, j \in J\},T).$$
By the universal property, $V \otimes W$ is the vector space with the property
$$\operatorname{Hom}(V \otimes W,T) = \operatorname{Bil}(V \times W, T),$$
so we obtain
$$\operatorname{Hom}(V \otimes W,T) = \operatorname{Map}(\{v_i \otimes w_j : i \in I, j \in J\},T),$$
which states precisely that $\{v_i \otimes w_j : i \in I, j \in J\}$ is a basis of $V \otimes W$.
Let us make this precise: given a map $f \in \operatorname{Map}(\{v_i \otimes w_j : i \in I, j \in J\},T)$, i.e. given the values $f(v_i \otimes w_j)$ for all $i \in I, j \in J$, we have to show that $f$ extends uniquely to a linear map $F:V \otimes W \to T$.
Define the bilinear map $\hat f: V \times W \to T$ on the basis by $\hat f(v_i,w_j) := f(v_i \otimes w_j)$. By the universal property, this yields a unique linear map $F:V \otimes W \to T$ with $F(v_i \otimes w_j)=\hat f(v_i,w_j)=f(v_i \otimes w_j)$. Hence this is the desired unique extension.
The uniqueness of $F$ is clear anyway, because $\{v_i \otimes w_j : i \in I, j \in J\}$ is obviously a set of generators; only the existence of $F$ needs proof (and this corresponds to the linear independence).
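In finite dimensions one can also see this concretely. The following sketch (an illustration only, not part of the proof, and it assumes the standard Kronecker-product model of the tensor product) checks numerically that the vectors $v_i \otimes w_j$ are linearly independent:

```python
import numpy as np

# Illustration in finite dimensions, assuming the Kronecker-product model
# of the tensor product: for bases (v_i) of V and (w_j) of W, the vectors
# v_i ⊗ w_j = kron(v_i, w_j) are linearly independent, hence a basis of
# the dim(V)·dim(W)-dimensional space V ⊗ W.
V_basis = np.eye(3)   # v_1, v_2, v_3
W_basis = np.eye(2)   # w_1, w_2

products = np.array([np.kron(v, w) for v in V_basis for w in W_basis])

# 6 vectors in R^6; full rank means linear independence.
assert np.linalg.matrix_rank(products) == 6
```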
Thanks for the hint in the comments. It's clearer now what the answer would be:
Applied to the case in the question, the change-of-basis matrix is $\small\begin{bmatrix}3&4&-1\\0&3&7\\1&3&0.5\end{bmatrix}$, and its inverse (rounded to one decimal place) is $\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}$. The vectors $v$ and $w$ in the new coordinate system are
$v =\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\begin{bmatrix}1\\2\\3\end{bmatrix} =\begin{bmatrix}-2.3\\1.9\\-0.5\end{bmatrix}$ and $w=\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\begin{bmatrix}1\\0\\0\end{bmatrix}=\begin{bmatrix}0.7\\-0.3\\0.1\end{bmatrix}$ (computed with the exact inverse, results rounded to one decimal place).
Therefore,
$$\begin{align}\large v\otimes w&=\left(-2.3\tilde x + 1.9\tilde y -0.5 \tilde z\right)\otimes \left(0.7\tilde x -0.3\tilde y + 0.1\tilde z\right)\\[2ex]&=-1.61\;\tilde x\otimes \tilde x + 0.69\;\tilde x\otimes \tilde y -0.23 \;\tilde x\otimes \tilde z + 1.33\;\tilde y\otimes \tilde x -0.57\;\tilde y\otimes \tilde y+ 0.19\;\tilde y\otimes \tilde z -0.35\;\tilde z\otimes \tilde x +0.15 \;\tilde z\otimes \tilde y-0.05\;\tilde z\otimes \tilde z\end{align}$$
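The change-of-basis arithmetic above can be verified with a short numerical check (`numpy` is used purely for verification; the values printed in the text are rounded to one decimal place):

```python
import numpy as np

# Verify the change-of-basis computation: invert the matrix and apply it
# to the coordinate vectors of v and w. The values quoted in the text
# are these results rounded to one decimal place.
M = np.array([[3, 4, -1],
              [0, 3, 7],
              [1, 3, 0.5]])
M_inv = np.linalg.inv(M)

v = M_inv @ np.array([1, 2, 3])   # coordinates of v in the new basis
w = M_inv @ np.array([1, 0, 0])   # coordinates of w in the new basis

assert np.allclose(np.round(M_inv, 1),
                   [[0.7, 0.2, -1.1], [-0.3, -0.1, 0.8], [0.1, 0.2, -0.3]])
assert np.allclose(np.round(v, 1), [-2.3, 1.9, -0.5])
assert np.allclose(np.round(w, 1), [0.7, -0.3, 0.1])
```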
So what's the point?
Having started from the definition of the tensor product of two vector spaces ($V\otimes W$) expressed in the same basis, we end up calculating the outer product of the two coordinate vectors:
$$\large v\otimes_o w=\small \begin{bmatrix}-2.3\\1.9\\-0.5\end{bmatrix}\begin{bmatrix}0.7&-0.3&0.1\end{bmatrix}=\begin{bmatrix}-1.61&0.69&-0.23\\1.33&-0.57&0.19\\-0.35&0.15&-0.05\end{bmatrix}$$
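The outer-product matrix above can be checked directly; it is the same data as the Kronecker product, reshaped into a $3\times 3$ matrix:

```python
import numpy as np

# The outer product of the two coordinate vectors; flattening it row by
# row gives the Kronecker product of the vectors.
v = np.array([-2.3, 1.9, -0.5])
w = np.array([0.7, -0.3, 0.1])

outer = np.outer(v, w)

expected = np.array([[-1.61,  0.69, -0.23],
                     [ 1.33, -0.57,  0.19],
                     [-0.35,  0.15, -0.05]])
assert np.allclose(outer, expected)
assert np.allclose(outer.ravel(), np.kron(v, w))
```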
This connects this post to this more general question.
This is a response to the OP's comment; it is too long to post as a comment.
Conventions
The two objects denoted by $\oplus$ are sometimes called the "internal" and "external" direct sums. External direct sums always make sense for any two vector spaces, and their elements are literally ordered pairs whose components come from the summands. Internal direct sums require that the summands both live in a common ambient vector space, and their elements are literally vectors in that ambient space. In case the internal sum makes sense, one can prove that the map $V\oplus_{ext} W \to V\oplus_{int} W$ sending $(v,w)$ to $v+w$ is an isomorphism (and is the "best" kind of isomorphism in any sense you might mean that, e.g. functorial), so they are for all intents and purposes the same object.
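A tiny concrete sketch of the two constructions (assuming, purely for illustration, that $V$ and $W$ are realized as the $x$- and $y$-axes inside $\Bbb R^2$, so that the internal sum makes sense):

```python
import numpy as np

# Assumed setup for illustration: V = x-axis and W = y-axis inside R^2,
# so both the external and the internal direct sum make sense.
v = np.array([2.0, 0.0])    # an element of V
w = np.array([0.0, -3.0])   # an element of W

external = (v, w)           # element of V ⊕_ext W: literally an ordered pair
internal = v + w            # its image under (v, w) -> v + w in V ⊕_int W

assert np.allclose(internal, [2.0, -3.0])
```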
In vector space decompositions such as yours, it is extremely common to use the symbol "=" to mean "isomorphic (in the best needed way)". This abuse of notation is very well-justified in practice, e.g. I may want to construct the tensor product as a set of matrices instead of writing down an abstract basis as you've done, and it's silly to let this "linguistic" difference get in the way.
But for the purposes of this question, it's clear that you mean we should both agree that $V\otimes W$ means $\text{span}_{\Bbb R} \{v\otimes w:v\in V, w\in W\}$, and that you mean "=" to mean "literally equal as sets". In this case, we must use the internal direct sum, since the left-hand side is not constructed set-theoretically as a direct sum (unless we have a very strange construction of the external direct sum).
Since you have constructed the ambient vector space $V_{AB}$, in which both $V_I$ and $V_{I\!I}$ live, this is not a problem. We simply need to find two subspaces of $V_{AB}$ with trivial intersection that span the space.
Construction
Literally speaking, $$ V_{AB} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) + b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes \begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a,b\in \Bbb{R}\right\}$$
Thus, one possible choice for $V_I$ and $V_{I\!I}$ would be $$ V_{I} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) : a\in \Bbb{R}\right\}$$ $$ V_{I\!I} = \left\{ b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : b\in \Bbb{R}\right\}.$$
The natural bases for these spaces are the obvious ones: just drop the coefficients $a$ and $b$.
This is of course not the only choice*, but to address the other question in your comment, it is not even necessary that $V_I$ has dimension 1. It could just as easily be the zero subspace or the full $V_{AB}$ (leaving $V_{I\!I}$ to be the other one). However, because this is "boring", it is sometimes called the trivial direct sum decomposition. So in that sense, the answer to your question is yes: in your example, all nontrivial decompositions will have both summands of dimension 1.
* I say "of course" in the sense that there is the usual freedom that one has in (direct) sum constructions. For instance, a different choice would be $V_{I\!I}$ as before, but $$ V_{I} = \left\{ 3a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) - 2a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a\in \Bbb{R}\right\},$$ and other such things.
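The freedom described in this footnote can be checked numerically. The sketch below assumes (as a modeling choice, not forced by the text) that elementary tensors are represented as Kronecker products in $\Bbb R^9$; it verifies that the alternative $V_I$ and the original $V_{I\!I}$ still intersect trivially and together span $V_{AB}$:

```python
import numpy as np

# Model elementary tensors as Kronecker products in R^9 (an assumption
# made for this check).
e1, e2, e3 = np.eye(3)

t1 = np.kron(e1, e2)          # (1,0,0) ⊗ (0,1,0)
t2 = np.kron(e1, e3)          # (1,0,0) ⊗ (0,0,1)

V_I_gen  = 3 * t1 - 2 * t2    # generator of the alternative V_I
V_II_gen = t2                 # generator of V_II

# Linear independence of the two generators gives V_I ∩ V_II = {0} and
# dim(V_I + V_II) = 2 = dim(V_AB), i.e. a valid direct sum decomposition.
assert np.linalg.matrix_rank(np.array([V_I_gen, V_II_gen])) == 2
```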