Regarding the Background:
The first two bullets are fine. The "copy paste" metaphor is interesting.
Bullet 3: I'm not quite sure what you're getting at with this excerpt:

> Any number of different matrices could represent the same vector, for instance the same 1 by 3 column vector could be represented as a 2 by 2 matrix with one slot being 0 and the elements switched around. However, you usually choose the representation that makes computations appear as they would for basic linear algebra with geometric vectors.
but of course the gist, i.e. that row/column vectors and matrices can be used to flexibly represent vector spaces, is correct.
Bullet 4: Really not sure what you're trying to get at here. I'm not sure how to interpret the sentence "multiplication of $b$ by $a$ can loosely be interpreted as $a$ as a function of $b$".
Bullet 5: Not sure what this is supposed to say. I think you're just explaining the notation $f:X \to Y$, but your wording and choices of notation are awkward.
Bullet 6: Mostly correct, but subtly wrong. The phrasing of "vector spaces imply the existence of elements of a field with it" bothers me. It's not clear what you mean by "these maps must exist in order for it to have been called a vector space". In the end, it seems like you're trying to say something like "The dual space $V^*$ of $V$ is the set of linear maps from $V$ to its underlying field. The elements of a dual space are called covectors". I think you have the right idea, it's just not very readable right now.
Regarding Tensors:
Bullet 1: I haven't heard "tense" used as a verb in this sense. The sentence "Additionally, since the linear transformations from the tensor on each vector can be encoded a vector, tensors should also be able to be vectors, which means they have to be able to be part of a vector space" is unclear.
Bullet 2: "A multilinear transformation contains multiple sets of linear transformation information, each of which can be considered a vector": Not clear what "contain" means here. I'm really not sure what exactly you're trying to convey in the rest of this paragraph.
Bullet 3: A tensor is defined as an element of the tensor product of any number of vector spaces. Otherwise fine.
Your last two bullets are fine.
What makes all of this really confusing is that in some contexts it is convenient to think of tensors as multilinear maps, while in other contexts it is convenient to think of tensors as elements of the fancy vector space that we call the "tensor product" of the input spaces. It is common in expositions of the relevant fields to completely ignore the alternate point of view.
I have found that in differential geometry, the multilinear map point of view is more common. I think that the "multidimensional array" point of view is most directly connected to this multilinear map definition of a tensor product.
The advantage of the more abstract definition via tensor products of spaces is that all of the maps that we care about are simply linear maps (or in the greater algebraic context, module homomorphisms).
Given any two vector spaces $U,V$ over the same field $\Bbb{F}$, we can define their tensor product $U\otimes V$ via the universal property you mention in the first paragraph. In the finite-dimensional case, we have the following canonical isomorphism:
\begin{align}
U\otimes V \cong (U^{**})\otimes (V^{**})\cong (U^*\otimes V^*)^*\cong \text{Hom}^2(U^*\times V^*;\Bbb{F}),
\end{align}
where this last space is the space of bilinear maps $U^*\times V^*\to \Bbb{F}$, and the last isomorphism is literally by the universal property (replace each of $U,V, W$ respectively with $U^*,V^*,\Bbb{F}$ in your first diagram). The full isomorphism is such that an element $u\otimes v$ of the LHS is identified with the bilinear map $(\xi,\eta)\mapsto \xi(u)\cdot \eta(v)$.
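For a concrete sanity check of this identification in coordinates, here is a minimal NumPy sketch (my own illustration, not part of the original argument), identifying $U\otimes V\cong \Bbb{R}^{m\times n}$ via outer products and covectors with row vectors, so that $u\otimes v$ acts on $(\xi,\eta)$ as $\xi^T(uv^T)\eta$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4

u, v = rng.standard_normal(m), rng.standard_normal(n)     # vectors in U, V
xi, eta = rng.standard_normal(m), rng.standard_normal(n)  # covectors in U*, V*

# Identify the elementary tensor u (x) v with the outer-product matrix u v^T.
T = np.outer(u, v)

# The bilinear map it induces on U* x V* is (xi, eta) |-> xi(u) * eta(v),
# which in coordinates is xi^T T eta.
lhs = xi @ T @ eta
rhs = (xi @ u) * (eta @ v)
assert np.isclose(lhs, rhs)
```

The choice of outer-product matrices here corresponds to fixing the standard bases; the abstract isomorphism is of course basis-free.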
Now, if you fix a single vector space $V$ over a field $\Bbb{F}$, and integers $r,s\geq 0$, then we define the $(r,s)$-tensor space over $V$ to be
\begin{align}
T^r_s(V)&:= \underbrace{V\otimes \cdots \otimes V}_{\text{$r$ times}}
\otimes \underbrace{V^*\otimes \cdots \otimes V^*}_{\text{$s$ times}}.
\end{align}
By reasoning similar to the above, if $V$ is finite-dimensional we see that (invoking double duality as needed):
\begin{align}
V^{\otimes r}\otimes (V^*)^{\otimes s} &\cong \text{Hom}^{r+s}\bigg((V^*)^r\times V^s\,\,;\,\,\Bbb{F}\bigg),
\end{align}
and the isomorphism is such that $v_1\otimes \cdots \otimes v_r\otimes \alpha^1\otimes \cdots \otimes\alpha^s$ on the LHS is identified with the multilinear map
\begin{align}
(\beta^1,\dots, \beta^r, w_1,\dots, w_s)\mapsto \beta^1(v_1)\cdots \beta^r(v_r)\cdot\alpha^1(w_1)\cdots \alpha^s(w_s).
\end{align}
Hopefully this clears up the issue of what the isomorphisms are (I leave it to you to verify that these really are isomorphisms). As for your final question of whether tensors are multilinear maps: I feel we're now getting into a terminology issue. To me, a tensor is an element of a tensor product of vector spaces. What is a tensor product? From the universal property, we know such a construction is unique up to a unique isomorphism (as always). Therefore, the question of what a tensor *is* strikes me as rather unimportant.
If you use the realization of a tensor product as a quotient of some free module, then your tensors are elements of that space. If on the other hand you're dealing with finite-dimensional spaces, then you can equivalently define tensors as certain multilinear maps, so to you a tensor could be a multilinear map.
My Introduction to Tensors and Opinion:
The first definition of tensor I encountered was the one with multilinear maps (as introduced in Spivak), and I found it straightforward. I'm glad this is the first definition I learnt, because I don't think I would have been able to digest the abstractness of the universal-property definition at the time. Also, this definition is rather "concrete", and very easy to state given only basic knowledge of linear algebra (i.e. just knowing what a vector space, a dual space, and multilinearity are), so it made it easy for me to see what tensors are and to start working with them, even though that's not strictly relevant (it's just one of those things which gives one some comfort when first learning a new topic). This concrete realization of tensors meant I didn't have to take a long abstract-algebraic detour, in the middle of my advanced calculus and physics courses, into universal properties, quotient vector spaces and so on. This context hopefully puts into perspective why people introduce tensors on finite-dimensional spaces as multilinear maps, and hence say that "tensors are multilinear maps".
Also, on a real-vector space, such as $\Bbb{R}^n$, two very common and important examples of tensors are the inner product (a $(0,2)$ tensor) and the determinant/volume form (a $(0,n)$ tensor). These are already conveniently given to us in the form of multilinear maps. So on top of wanting to avoid excessive algebraic detours, the fact that very important tensors (and by extension, tensor fields on manifolds) are already presented in multilinear fashion further motivates the multilinear definition one usually encounters in differential geometry/physics, hence the frequently encountered statement "tensors are multilinear maps".
Best Answer
What distinguishes a tensor from a vector depends on what, exactly, you mean by each word. Strictly speaking, tensors of a fixed rank form a vector space (over $\mathbf R$, say), and thus "tensors are vectors" for pure mathematicians who don't work in anything related to physics or differential geometry.
But nobody means anything like that when they bring up the issues of tensors vs. vectors. This is why "tensors are not vectors" for physicists, and it explains why you are getting the answers "yes" and "no" in the comments.
Look at how tensors are built. You start with a vector space $V$ and take a tensor product of tensor powers of $V$ and the dual space $V^*$, say $V^{\otimes n} \otimes (V^*)^{\otimes m}$. Elements of $V^{\otimes n} \otimes (V^*)^{\otimes m}$ are called tensors, elements of $V$ are called vectors, and elements of $V^*$ are called covectors. In this setting, where the word vector means "element of $V$", the objects in $V^{\otimes n} \otimes (V^*)^{\otimes m}$ are not vectors (unless $n = 1$ and $m = 0$). Vector is just a synonym for "element of $V$". That's all there is to it.
You ask in a comment why tensors can't be given an inner product. In fact they can, provided $V$ is given an inner product first. If $V$ and $W$ are both real vector spaces and each is given an inner product $\langle \cdot,\cdot\rangle_V$ and $\langle \cdot,\cdot\rangle_W$, then there is a unique inner product on $V \otimes W$ for which $$ \langle v \otimes w,v'\otimes w'\rangle = \langle v,v'\rangle_V\langle w,w'\rangle_W $$ on elementary tensors $v \otimes w$ and $v' \otimes w'$ in $V \otimes W$. Check that: that is, figure out why there exists exactly one inner product on $V \otimes W$ having the above behavior on pairs of elementary tensors in $V \otimes W$.
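With the standard inner products on $\Bbb{R}^m$ and $\Bbb{R}^n$, this factorization is easy to witness numerically: identifying $v\otimes w$ with the Kronecker product `np.kron(v, w)` in $\Bbb{R}^{mn}$, the standard dot product there does exactly this on elementary tensors. A small sketch of my own:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 4

v, w = rng.standard_normal(m), rng.standard_normal(n)
vp, wp = rng.standard_normal(m), rng.standard_normal(n)

# <v (x) w, v' (x) w'> computed in R^{mn} via Kronecker products...
lhs = np.kron(v, w) @ np.kron(vp, wp)
# ...equals the product <v,v'>_V <w,w'>_W of the factor inner products.
rhs = (v @ vp) * (w @ wp)
assert np.isclose(lhs, rhs)
```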
By iterating this construction, when $V$ has an inner product $\langle \cdot,\cdot\rangle_V$, each tensor power $V^{\otimes n}$ has a unique inner product that looks like $$ \langle v_1 \otimes \cdots \otimes v_n,v_1'\otimes \cdots \otimes v_n'\rangle = \langle v_1,v_1'\rangle_V\cdots \langle v_n,v_n'\rangle_V $$ on pairs of elementary tensors $v_1 \otimes \cdots \otimes v_n$ and $v_1'\otimes \cdots \otimes v_n'$.
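The iterated construction is the same check with more factors; for instance, on $V^{\otimes 3}$ with the standard inner product on $V=\Bbb{R}^n$ (again my own sketch, using the Kronecker-product model):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

v1, v2, v3 = (rng.standard_normal(n) for _ in range(3))
u1, u2, u3 = (rng.standard_normal(n) for _ in range(3))

# Model v1 (x) v2 (x) v3 in V^{(x)3} = R^{n^3} by iterated Kronecker products.
t  = np.kron(np.kron(v1, v2), v3)
tp = np.kron(np.kron(u1, u2), u3)

# The inner product of elementary tensors is the product of pairwise inner products.
assert np.isclose(t @ tp, (v1 @ u1) * (v2 @ u2) * (v3 @ u3))
```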
When $V$ is finite-dimensional, an inner product $\langle \cdot,\cdot\rangle_V$ on $V$ gives us an isomorphism $V \to V^*$ by $v \mapsto \langle\cdot,v\rangle_V$. Using this, we can transport the inner product on $V$ over to an inner product on $V^*$: for $\varphi$ and $\varphi'$ in $V^*$, simply declare $$ \langle \varphi,\varphi'\rangle_{V^*} := \langle v,v'\rangle_V $$ where $\varphi = \langle \cdot,v\rangle_V$ and $\varphi' = \langle \cdot,v'\rangle_V$.

Now you can use this inner product on $V^*$ to define an inner product on $(V^*)^{\otimes m}$ by the method described above for creating an inner product on tensor powers of $V$ from an inner product on $V$ (just replace $V$ by $V^*$ everywhere). And then you can use the inner products on $V^{\otimes n}$ and $(V^*)^{\otimes m}$ to define an inner product on $V^{\otimes n} \otimes (V^*)^{\otimes m}$ by the method described up above for putting an inner product on $V \otimes W$ when $V$ and $W$ each have an inner product. Thus, when a finite-dimensional real vector space $V$ has an inner product, every space of tensors $V^{\otimes n} \otimes (V^*)^{\otimes m}$ for fixed $m$ and $n$ gets an inner product from the inner product on $V$.

Concretely, for elementary tensors $$ t = v_1 \otimes \cdots \otimes v_n \otimes \varphi_1 \otimes \cdots \otimes \varphi_m $$ and $$ t' = v_1' \otimes \cdots \otimes v_n' \otimes \varphi_1' \otimes \cdots \otimes \varphi_m' $$ in $V^{\otimes n} \otimes (V^*)^{\otimes m}$, $$ \langle t,t'\rangle = \prod_{i=1}^n \langle v_i,v_i'\rangle_V \prod_{j=1}^m \langle w_j,w_j'\rangle_V, $$ where $\varphi_j = \langle \cdot,w_j\rangle_V$ and $\varphi_j' = \langle \cdot,w_j'\rangle_V$.
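Here is a numerical sketch of the transport step, under the assumption (mine, for illustration) that the inner product on $V=\Bbb{R}^n$ is $\langle x,y\rangle_V = x^T G y$ for some symmetric positive-definite matrix $G$. Then $v\mapsto \langle\cdot,v\rangle_V$ sends $v$ to the covector with coordinates $Gv$, and the transported inner product on $V^*$ works out to $\langle\varphi,\varphi'\rangle_{V^*} = \varphi^T G^{-1}\varphi'$:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3

# An illustrative non-standard inner product <x,y>_V = x^T G y, with G SPD.
A = rng.standard_normal((n, n))
G = A @ A.T + n * np.eye(n)

v, vp = rng.standard_normal(n), rng.standard_normal(n)

# The isomorphism V -> V* sends v to the covector phi = G v,
# since <x, v>_V = x^T (G v) for every x.
phi, phip = G @ v, G @ vp

# Transporting the inner product to V* gives <phi, phi'>_{V*} = phi^T G^{-1} phi',
# which agrees with <v, v'>_V by construction.
assert np.isclose(phi @ np.linalg.solve(G, phip), v @ G @ vp)
```

In particular, when $G$ is the identity (the standard inner product), the inner product on $V^*$ is again the standard one, which is why the earlier Kronecker-product checks never had to mention it.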