Any alternating $(r,s)$ tensor $T$ has a corresponding linear map $\Lambda^r V \to \Lambda^s V$. Suppose $R \in \Lambda^r V$ and $\Sigma \in \Lambda^s V^*$. Then define $\underline T:\Lambda^r V \to \Lambda^s V$ by requiring
$$T(R, \Sigma) = \Sigma[ \underline T(R)]$$
for all such $R$ and $\Sigma$. Uniqueness of $\underline T$ follows because $\Sigma$ ranges over all of $\Lambda^s V^*$, which separates points of $\Lambda^s V$; concretely, evaluating against a basis of covectors (taking a "gradient" with respect to the vector space $\Lambda^s V^*$) extracts each component of $\underline T(R)$.
Geometrically, $\underline T$ maps an $r$-vector (a simple $r$-vector corresponds to an $r$-dimensional subspace of $V$) to an $s$-vector, and the $s$-covector $\Sigma$ lets us extract the components of $\underline T(R)$.
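To make the correspondence concrete, here is a minimal sketch in plain Python for the simplest case $r = s = 1$: a $(1,1)$-tensor $T(R, \Sigma) = \Sigma[\underline T(R)]$ determines the linear map $\underline T$ uniquely, because feeding in basis vectors and basis covectors reads off each matrix entry. The matrix `A` and all names here are made up for illustration.

```python
# The underlying linear map (what the text calls T-underline), as a matrix.
A = [[2.0, 1.0],
     [0.0, 3.0]]

def apply(M, v):
    """Matrix-vector product: the map v -> M v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def T(v, sigma):
    """The tensor as a bilinear map: T(v, sigma) = sigma[A v]."""
    Av = apply(A, v)
    return sum(sigma[i] * Av[i] for i in range(len(sigma)))

# Recover A's entries as A[i][j] = T(e_j, e^i): this is the "gradient"
# (evaluate against every basis covector) made concrete.
e = [[1.0, 0.0], [0.0, 1.0]]
recovered = [[T(e[j], e[i]) for j in range(2)] for i in range(2)]
print(recovered)  # [[2.0, 1.0], [0.0, 3.0]]
```

Since the basis covectors separate points, no other matrix could produce the same values of $T$, which is the uniqueness claim above.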
Regarding the Background:
The first two bullets are fine. The "copy paste" metaphor is interesting.
Bullet 3: I'm not quite sure what you're getting at with this excerpt:

> Any number of different matrices could represent the same vector, for instance the same 1 by 3 column vector could be represented as a 2 by 2 matrix with one slot being 0 and the elements switched around. However, you usually choose the representation that makes computations appear as they would for basic linear algebra with geometric vectors.
but of course the gist, i.e. that row/column vectors and matrices can be used to flexibly represent vector spaces, is correct.
Bullet 4: Really not sure what you're trying to get at here. I'm not sure how to interpret the sentence "multiplication of b by a can loosely be interpreted as a as a function of b".
Bullet 5: Not sure what this is supposed to say. I think you're just explaining the notation $f:X \to Y$, but your wording and choices of notation are awkward.
Bullet 6: Mostly correct, but subtly wrong. The phrasing of "vector spaces imply the existence of elements of a field with it" bothers me. It's not clear what you mean by "these maps must exist in order for it to have been called a vector space". In the end, it seems like you're trying to say something like "The dual space $V^*$ of $V$ is the set of linear maps from $V$ to its underlying field. The elements of a dual space are called covectors". I think you have the right idea, it's just not very readable right now.
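To illustrate the reading I'm suggesting for that bullet, here is a quick sketch of "a covector is a linear map from $V$ to its underlying field": over $\mathbb{R}^3$, every such map is "dot against a fixed list of coefficients". The function names and numbers are my own, just for illustration.

```python
def covector(coeffs):
    """Build an element of (R^n)*: a linear map R^n -> R."""
    def phi(v):
        return sum(c * x for c, x in zip(coeffs, v))
    return phi

phi = covector([1.0, -2.0, 0.5])   # a covector on R^3

v = [4.0, 1.0, 2.0]
print(phi(v))  # 1*4 - 2*1 + 0.5*2 = 3.0

# Linearity check, which is what makes phi a member of the dual space:
w = [1.0, 1.0, 1.0]
print(phi([vi + wi for vi, wi in zip(v, w)]) == phi(v) + phi(w))  # True
```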
Regarding Tensors:
Bullet 1: I haven't heard "tense" used as a verb in this sense. The sentence "Additionally, since the linear transformations from the tensor on each vector can be encoded a vector, tensors should also be able to be vectors, which means they have to be able to be part of a vector space" is unclear.
Bullet 2: "A multilinear transformation contains multiple sets of linear transformation information, each of which can be considered a vector": Not clear what "contain" means here. I'm really not sure what exactly you're trying to convey in the rest of this paragraph.
Bullet 3: A tensor is defined as an element of the tensor product of any number of vector spaces. Otherwise fine.
Your last two bullets are fine.
What makes all of this really confusing is that in some contexts, it is convenient to think of tensors as multilinear maps, while in other contexts it is convenient to think of tensors as being elements of the fancy vector space that we call the "tensor product" of the input spaces. It is common in the exposition of the relevant fields to completely ignore the alternate point of view.
I have found that in differential geometry, the multilinear map point of view is more common. I think that the "multidimensional array" point of view is most directly connected to this multilinear map definition of a tensor product.
The advantage of the more abstract definition via tensor products of spaces is that all of the maps that we care about are simply linear maps (or in the greater algebraic context, module homomorphisms).
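The connection between the "multidimensional array" picture and the multilinear-map picture mentioned above can be sketched in a few lines: a $(0,2)$-tensor on $\mathbb{R}^2$ is both a $2\times 2$ array of components and the bilinear map $(u,v) \mapsto \sum_{i,j} T_{ij}\,u_i v_j$. The particular array below is made up for illustration.

```python
# Component array of a (0,2)-tensor on R^2, relative to the standard basis.
T = [[1.0, 2.0],
     [3.0, 4.0]]

def as_multilinear_map(T, u, v):
    """Evaluate the component array T as a bilinear map on (u, v)."""
    return sum(T[i][j] * u[i] * v[j]
               for i in range(len(T)) for j in range(len(T[0])))

# Feeding in basis vectors picks out a single component, so the array
# and the map carry exactly the same information.
u, v = [1.0, 0.0], [0.0, 1.0]
print(as_multilinear_map(T, u, v))  # 2.0, i.e. the component T[0][1]
```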
Best Answer
It'll help to first discuss inner products. Do we denote them $\langle u,\,v\rangle$ or $\langle u|v\rangle$? It's an important notational distinction. Let's start with the first one, which just requires a sesquilinear positive-definite map from vector pairs to scalars. Once we have this in place, we can define $\langle u|$ as the linear map from $|v\rangle$ to $\langle u,\,v\rangle$, then write this map's evaluation $\langle u,\,v\rangle$ at $|v\rangle$ as $\langle u|v\rangle$. Doing this, we note the set of linear maps considered here is a vector space, and we call it the dual space of the original one. So now we identify each "vanilla" vector with a specific function, and that function gets called a covector.
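Here is a small sketch of that construction over $\mathbb{C}^2$, using the physicists' convention that $\langle u,\,v\rangle$ is conjugate-linear in the first slot; the function names are mine, purely for illustration. The point is the order of operations: the inner product comes first, and only then is $\langle u|$ defined as a function of $|v\rangle$.

```python
def inner(u, v):
    """Sesquilinear, positive-definite: conjugate-linear in the first slot."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def bra(u):
    """The covector <u|: the linear map sending |v> to <u, v>."""
    return lambda v: inner(u, v)

u = [1 + 1j, 2 + 0j]
v = [3 + 0j, 0 + 1j]

# Evaluating the map bra(u) at v is exactly the bracket <u|v>.
print(bra(u)(v))  # (1-1j)*3 + 2*1j = (3-1j)
```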
Now we can turn to outer products: just as $\langle u|$ is a linear map, $|u\rangle\langle v|$ is a symbol for a sesquilinear map from a covector $\langle a|$ and vector $|b\rangle$ to the scalar $\langle a,\,u\rangle\cdot\langle v,\,b\rangle$, where $\cdot$ denotes multiplication of scalars.
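Continuing the sketch from above, the outer product $|u\rangle\langle v|$ can be written as a function of a (covector, vector) pair, sending $(\langle a|,\,|b\rangle)$ to $\langle a,\,u\rangle\cdot\langle v,\,b\rangle$. All the specific vectors below are made-up illustrations.

```python
def inner(x, y):
    """Sesquilinear inner product on C^n, conjugate-linear in the first slot."""
    return sum(xi.conjugate() * yi for xi, yi in zip(x, y))

def outer(u, v):
    """|u><v| viewed as the map (a, b) -> <a, u> * <v, b>."""
    return lambda a, b: inner(a, u) * inner(v, b)

u, v = [1 + 0j, 1j], [2 + 0j, 0j]
a, b = [2 + 0j, 0j], [1 + 0j, 1 + 0j]

# <a, u> = 2 and <v, b> = 2, so the outer product evaluates to 4.
print(outer(u, v)(a, b))  # (4+0j)
```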