Linear Algebra – What Is a Basis

linear algebra, matrices, vectors

I have a rough understanding of bases, but I don't know whether it is right. So I just need someone to correct me if it's not.

When we look for a basis of the image of a matrix, we simply remove all the redundant column vectors from the matrix and keep the linearly independent ones. When we look for a basis of the kernel of a matrix, we likewise remove all the redundant vectors from the kernel and keep the linearly independent ones.
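
For instance, just to illustrate what I mean, here is a small sketch using SymPy's `columnspace` and `nullspace` on a made-up matrix:

```python
from sympy import Matrix

# Made-up 3x3 matrix whose third column is the sum of the first two,
# so the three columns are not linearly independent.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])

# Basis of the image (column space): the two independent columns survive,
# namely (1, 0, 1)^T and (0, 1, 1)^T.
print(A.columnspace())

# Basis of the kernel (null space): a single vector, (-1, -1, 1)^T,
# since exactly one column was redundant.
print(A.nullspace())
```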

Therefore, a basis is just the collection of all the linearly independent vectors.

By the way, is "basis" just the plural form of "base"?

Let me know if I am right.

Best Answer

What is a basis?

Informally, we say:

A basis is a set of vectors that generates all elements of the vector space, and the vectors in the set are linearly independent.

This is what we mean when we create the definition of a basis. It is useful for understanding the relationship between all the vectors of the space. They all have something in common: each of them can be written as a linear combination of vectors from some set that lies in the space. That set of vectors is called a basis of the vector space.

How do we make this notion formal?

For that, we use the theory of linear algebra. We define what a vector is and what we mean by a vector being generated by other vectors. We say that a vector is generated if it is some linear combination of other vectors, with coefficients taken from some field (a vector space must come with a field in its definition; usually this field is $\mathbb{R}$ or $\mathbb{C}$). In some sense, we first find a set of vectors that generates all the vectors in the space (this set can be infinite or finite).
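
Spelled out, saying that a vector $v$ is generated by $v_1, \dots, v_n$ just means that there are scalars in the field $\mathbb{F}$ (here $\mathbb{F}$ stands for $\mathbb{R}$, $\mathbb{C}$, or whichever field the space is defined over) with
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n, \qquad c_1, \dots, c_n \in \mathbb{F}.$$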

Then linear independence plays the key role: vectors can both generate and be generated by other vectors. So we talk about linear independence when we want the set that generates the space to become

The smallest set of vectors that generates the space

So if I have a set of vectors that generates the space and one, or more, of these vectors is generated by the others, then I take that vector out of the set. And in some sense, once I already have a basis of my space, if I take any vector out of my basis then I can no longer generate the whole vector space! For example, take $\mathbb{R}^2$ and the basis vectors $(0,1)$ and $(1,0)$; we cannot generate $(0,1)$ by a linear combination of $(1,0)$. But of course we can generate all vectors $(a,0)$ for $a \in \mathbb{R}$ using the vector $(1,0)$.
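
Concretely, these two vectors reach every point of $\mathbb{R}^2$, while neither one alone does:
$$(a,b) = a\,(1,0) + b\,(0,1), \qquad \text{but } (0,1) \neq c\,(1,0) \text{ for every } c \in \mathbb{R}.$$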

But this is not a unique notion!

It is not! A vector space can have many different bases. For example, for $\mathbb{R}^2$ we have that $\{(1,0),(0,1)\}$ is a basis, and $\{(3,0),(0,5)\}$ is also a basis. But the important point is that we can create all vectors (points) of the space $\mathbb{R}^2$ using either of these sets. So in some sense the basis tells us important things about the space: it gives a relation between all the vectors, it tells us how to build any vector, and it lets us introduce deeper notions about the space, such as linear transformations between different vector spaces.
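
For instance, with the second set every vector of $\mathbb{R}^2$ is still reachable, only the coefficients change:
$$(a,b) = \frac{a}{3}\,(3,0) + \frac{b}{5}\,(0,5).$$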
