The relation between the high school definition of a vector and the abstract algebra definition

linear-algebra, vector-spaces, vectors

I'm studying linear algebra, and I've been getting to grips with the idea of groups and fields and vector spaces. From what I understand, to be a vector is just to be an element of a vector space, and all sorts of unusual (to the beginner) things turn out to be vector spaces. For example, the set of all real-valued functions is a vector space over $\mathbb{R}$. I take it this means that all real-valued functions (e.g. $\sin x$) are vectors. But how does this abstract definition fit with the earlier account we are given of vectors as quantities with magnitude and direction, or as directed line segments?

Best Answer

During one's original study of vectors, while thinking of them as "quantities with magnitude and direction" or as "directed line segments", one slowly learns various properties of vectors involving the operations of arithmetic.

For example, one learns that two vectors in the same vector space (e.g. both in $2$-dimensional space $\mathbb R^2$, or both in $3$-dimensional space $\mathbb R^3$) can be added to produce another vector, using the parallelogram law of addition. Abstracting this slightly, one simply says:

Vector addition is a binary operation.
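
In coordinates this is just componentwise addition, and the sum is the diagonal of the parallelogram spanned by the two summands. For instance, in $\mathbb R^2$:

$$(1, 2) + (3, 1) = (1+3,\ 2+1) = (4, 3).$$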

One also learns that it doesn't matter in which order you add two vectors: the result turns out the same. Abstractly,

Vector addition is commutative.
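
Concretely, reversing the order of the same two vectors produces the same diagonal:

$$(1, 2) + (3, 1) = (4, 3) = (3, 1) + (1, 2).$$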

Just like with numbers, if you want to add three or more vectors, you have to do it two at a time, inserting parentheses to group terms; but it doesn't matter how you insert those parentheses:

Vector addition is associative.
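
For instance, bringing in a third vector $(0, 5)$, either grouping yields the same sum:

$$\bigl((1, 2) + (3, 1)\bigr) + (0, 5) = (4, 3) + (0, 5) = (4, 8) = (1, 2) + \bigl((3, 1) + (0, 5)\bigr).$$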

One next learns that there is actually an exception to the rule that a vector is a "directed line segment" or a "quantity with magnitude and direction", because there is a very special vector $\vec O$ which has no magnitude and points nowhere. You can still add this vector to other vectors, but the outcome is special: $\vec O + \vec V = \vec V$ no matter what $\vec V$ is.

There is an identity element of vector addition.
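
In coordinates, $\vec O$ is the vector whose components are all zero; for instance, in $\mathbb R^2$:

$$(0, 0) + (4, 3) = (4, 3).$$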

Now, I could go on in this vein, continuing with a slow explanation of how scalar multiplication works and then writing down the abstract laws of scalar multiplication. But perhaps you have seen the pattern: the abstract laws I am writing down are building up to the abstract definition of a vector space. The above four laws are already about half of that definition; I'm sure you can carry out the rest of this process with scalar multiplication.
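
For reference, the remaining laws are the existence of additive inverses, plus four laws governing scalar multiplication:

$$\vec V + (-\vec V) = \vec O, \qquad c(\vec U + \vec V) = c\vec U + c\vec V,$$
$$(c + d)\vec V = c\vec V + d\vec V, \qquad (cd)\vec V = c(d\vec V), \qquad 1\vec V = \vec V,$$

for all vectors $\vec U, \vec V$ and all scalars $c, d$.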

Abstraction is a central process in the human development of mathematics. We look at familiar objects (e.g. vectors), we study their properties, and, if we're clever enough, we abstract those properties. Then we notice: Hey! There are many other settings where those properties hold! I think I'll call those settings vector spaces!
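
And that is exactly what happens with your example: real-valued functions become a vector space once addition and scalar multiplication are defined pointwise,

$$(f + g)(x) = f(x) + g(x), \qquad (cf)(x) = c\,f(x),$$

with $\vec O$ taken to be the function that is $0$ everywhere. Every law above then follows from the corresponding law for real numbers, checked one point $x$ at a time. That is the precise sense in which $\sin x$ is a vector: not an arrow with a magnitude and a direction, but an element of a set whose operations obey the same laws.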
