Regarding the Background:
The first two bullets are fine. The "copy paste" metaphor is interesting.
Bullet 3: I'm not quite sure what you're getting at with this excerpt:
Any number of different matrices could represent the same vector, for instance the same 1 by 3 column vector could be represented as a 2 by 2 matrix with one slot being 0 and the elements switched around. However, you usually choose the representation that makes computations appear as they would for basic linear algebra with geometric vectors.
but of course the gist, i.e. that row/column vectors and matrices can be used to flexibly represent vector spaces, is correct.
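To make the point about "choosing the representation that makes computations appear as they would for basic linear algebra" concrete, here is a small sketch (my own illustration, not from the original post, using numpy): with the standard column-vector representation, a linear map acts by ordinary matrix multiplication, whereas an arbitrary repacking of the same three numbers does not interact with matrix multiplication in the geometric way.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])           # abstract vector, stored as a 1-D array
v_col = v.reshape(3, 1)                 # the usual column-vector representation

# The same three numbers could be packed into other shapes, e.g. a 2x2
# matrix with a padding zero, as in the quoted excerpt -- but then matrix
# multiplication no longer matches the geometric picture.
v_padded = np.array([[1.0, 2.0],
                     [3.0, 0.0]])

A = np.array([[0.0, -1.0, 0.0],         # a linear map: rotation in the xy-plane
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

print(A @ v_col)                        # A applied to v: (-2, 1, 3) as a column
```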
Bullet 4: Really not sure what you're trying to get at here. I'm not sure how to interpret the sentence "multiplication of b by a can loosely be interpreted as a as a function of b".
Bullet 5: Not sure what this is supposed to say. I think you're just explaining the notation $f:X \to Y$, but your wording and choices of notation are awkward.
Bullet 6: Mostly correct, but subtly wrong. The phrasing of "vector spaces imply the existence of elements of a field with it" bothers me. It's not clear what you mean by "these maps must exist in order for it to have been called a vector space". In the end, it seems like you're trying to say something like "The dual space $V^*$ of $V$ is the set of linear maps from $V$ to its underlying field. The elements of a dual space are called covectors". I think you have the right idea, it's just not very readable right now.
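The cleaned-up statement of bullet 6 ("the dual space $V^*$ of $V$ is the set of linear maps from $V$ to its underlying field") can be sketched in coordinates; this is my own illustration, assuming $V = \mathbb{R}^3$ over the reals, where a covector is naturally a row vector acting on column vectors by matrix multiplication.

```python
import numpy as np

v = np.array([[1.0], [2.0], [3.0]])    # element of V, as a column vector
phi = np.array([[4.0, 5.0, 6.0]])      # element of V*, as a row vector

# phi(v) is a scalar in the underlying field:
result = (phi @ v).item()
print(result)                           # 4*1 + 5*2 + 6*3 = 32.0

# Linearity check: phi(2u + v) == 2*phi(u) + phi(v)
u = np.array([[1.0], [0.0], [1.0]])
assert np.isclose((phi @ (2*u + v)).item(),
                  2*(phi @ u).item() + (phi @ v).item())
```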
Regarding Tensors:
Bullet 1: I haven't heard "tense" used as a verb in this sense. The sentence "Additionally, since the linear transformations from the tensor on each vector can be encoded a vector, tensors should also be able to be vectors, which means they have to be able to be part of a vector space" is unclear.
Bullet 2: "A multilinear transformation contains multiple sets of linear transformation information, each of which can be considered a vector": Not clear what "contain" means here. I'm really not sure what exactly you're trying to convey in the rest of this paragraph.
Bullet 3: A tensor is defined as an element of the tensor product of any number of vector spaces. Otherwise fine.
Your last two bullets are fine.
What makes all of this really confusing is that in some contexts, it is convenient to think of tensors as multilinear maps, while in other contexts it is convenient to think of tensors as elements of the fancy vector space that we call the "tensor product" of the input spaces. It is common in expositions of the relevant fields to completely ignore the alternate point of view.
I have found that in differential geometry, the multilinear map point of view is more common. I think that the "multidimensional array" point of view is most directly connected to this multilinear map definition of a tensor product.
The advantage of the more abstract definition via tensor products of spaces is that all of the maps that we care about are simply linear maps (or in the greater algebraic context, module homomorphisms).
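The correspondence mentioned above between the "multidimensional array" picture and the multilinear-map definition can be sketched as follows (my own example, assuming a $(0,2)$-tensor on $\mathbb{R}^2$): the tensor is just a $2 \times 2$ array $T$, and the multilinear map it encodes is $(u, v) \mapsto \sum_{i,j} T_{ij} u_i v_j$.

```python
import numpy as np

T = np.array([[1.0, 2.0],
              [3.0, 4.0]])             # the "multidimensional array" point of view

def T_as_map(u, v):
    # the multilinear-map point of view, via index contraction
    return np.einsum('ij,i,j->', T, u, v)

u = np.array([1.0, 0.0])               # basis vector e_0
v = np.array([0.0, 1.0])               # basis vector e_1
print(T_as_map(u, v))                  # picks out the entry T[0, 1] = 2.0

# Bilinearity in the first slot:
w = np.array([2.0, 5.0])
assert np.isclose(T_as_map(3*u + w, v), 3*T_as_map(u, v) + T_as_map(w, v))
```

Feeding basis vectors into the map recovers the array entries, which is exactly why the array representation works.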
Best Answer
What is a tensor? In short, a tensor is a generalization of a vector, needed to express physical quantities that carry more data than we can fit into a single vector field. However, it's more than that: we also need tensors of different transformation types. Ultimately, in physics, we wish to write equations that are independent of the choice of coordinates, yet we compute in coordinates. This brings you to the focus on components which transform inversely; from objects whose transformation properties are mirrored, we can build scalars which are invariant.

In math, a tensor product of vector spaces is a way of multiplying spaces. Or, for specific matrices, the tensor product is the Kronecker product, which is pretty easy to understand calculationally: for $A \otimes B$ we just make a new matrix whose $(i,j)$ block is $a_{ij}B$. A tensor is simply a multilinear mapping on a vector space and its dual. The tensor products of the basis and dual basis of the vector space are used to build a natural basis for the tensors over the given vector space. Of course, from a manifold perspective, this is all just "at a point"; we then wish to consider tensor fields... of course there is much to learn.

I think Lawden is a good book; I used it in a General Relativity course, found it readable, and got a good amount out of it. As a beginning physics student, I used the big black book Gravitation by Misner, Thorne, and Wheeler. There are about 100-200 pages on plain old tensors and forms which are helpful, with lots of exercises. From a physics perspective it was good; from a math perspective, it's not optimal. I thought the math in Sean Carroll's General Relativity text was also quite good, and a bit more modern than MTW.
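The Kronecker product description can be checked in a few lines (my own sketch, using numpy's built-in `np.kron`): under the standard convention, the $(i,j)$ block of $A \otimes B$ is $a_{ij}B$.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 5],
              [6, 7]])

K = np.kron(A, B)                       # 4x4 result for 2x2 inputs

# The (i, j) block of K is A[i, j] * B, e.g. the top-right block is 2*B:
assert np.array_equal(K[0:2, 2:4], A[0, 1] * B)
print(K)
```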
All of this said, I suspect the book that you would enjoy is: Manifolds, Tensors, and Forms: An Introduction for Mathematicians and Physicists by Paul Renteln.
It's not just about tensors, and I think that's a good thing. Tensors are part of a larger story, and this book is written for someone with your general leaning.