[Math] How to determine whether a set forms a basis for a vector space

linear algebra

In the book I am studying, the definition of a basis is as follows:

If $V$ is any vector space and $S= \{ \textbf{v}_1,…,\textbf{v}_n \}$ is a finite set of vectors in $V$, then $S$ is called a basis for $V$ if the following two conditions hold:
(a) $S$ is linearly independent
(b) $S$ spans $V$

I am currently taking my first course in linear algebra and something about the notion of a basis bugs me. The reason is that I feel like one would need a basis to investigate whether conditions (a) and (b) hold, but how does one find this first basis?

As a dummy example from the book, the author shows that the standard basis for $\mathbb{R}^3$ is in fact a basis for $\mathbb{R}^3$ in (roughly) the following way:

"We need to show that
$\textbf{i} = (1,0,0),\quad \textbf{j} = (0,1,0), \quad \textbf{k} = (0,0,1)$
are lineary independent and that they span $R^3$.

To show that they are linearly independent, we note that if they were linearly dependent, then the equation
$t_1(1,0,0) + t_2(0,1,0) + t_3(0,0,1) = (0,0,0)$
would be satisfied with some $t_r \neq 0$. But equating corresponding components shows that the equation forces every $t_r = 0$, so we would have both $t_r = 0$ and $t_r \neq 0$, which is a contradiction.

To show that condition (b) holds, we note that an arbitrary vector $\textbf{v}$ in $\mathbb{R}^3$ can be written as
$\textbf{v} = t_1(1,0,0) + t_2(0,1,0) + t_3(0,0,1)$"
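(Presumably the coefficients here are meant to be the components of $\textbf{v}$ itself: if $\textbf{v} = (v_1, v_2, v_3)$, then

$$\textbf{v} = v_1(1,0,0) + v_2(0,1,0) + v_3(0,0,1),$$

so $t_1 = v_1$, $t_2 = v_2$, $t_3 = v_3$.)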

Ok, so the thing that confuses me is that in this example, to show that the standard basis forms a basis, we have used… the standard basis? I mean, when stating that $\textbf{i} = (1,0,0)$ we are basically saying that $\textbf{i}$ actually $\textbf{is}$ one of the unit vectors which form the standard basis, right? But then we are showing that the standard basis is a basis by assuming that it is a basis from the beginning, which does not make sense.

I am going to try to summarise this rather vague post with some concrete questions, but feel free to address anything that you feel is relevant to my post.

1.) Is it possible to prove that a standard basis does in fact form a basis, or should one take this as a kind of axiom?

2.) If (1.) is true, what do the components of a vector actually mean before you have proven that there exists a "first" basis?

Best Answer

It is possible to prove that the standard basis forms a basis. First of all, the standard basis is the most obvious example of a linearly independent set: $\textbf{i} = (1,0,0),\; \textbf{j} = (0,1,0), \; \textbf{k} = (0,0,1)$ are linearly independent because $t_1(1,0,0) + t_2(0,1,0) + t_3(0,0,1) = (0,0,0)$ iff $t_1=t_2=t_3=0$. (You can think of them as the columns of the identity matrix to see that this is the only possible case.)
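To spell out the identity-matrix remark, the dependence equation is the matrix equation

$$\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} t_1 \\ t_2 \\ t_3 \end{pmatrix}=\begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix},$$

and multiplying out the left-hand side gives $(t_1, t_2, t_3) = (0,0,0)$ directly.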

Additionally,

$$t_1\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}+t_2\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}+t_3\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}=\begin{pmatrix} t_1 \\ t_2 \\ t_3 \end{pmatrix}$$

So linear combinations of these vectors are determined entirely by the values of $t_1,t_2,t_3$, and as $t_1,t_2,t_3$ range over all real numbers these combinations give every vector in $\mathbb R^3$.
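For a concrete instance (the particular vector is chosen arbitrarily for illustration), the vector $(2,-1,5)$ is obtained by taking $t_1 = 2$, $t_2 = -1$, $t_3 = 5$:

$$2\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}-1\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}+5\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}=\begin{pmatrix} 2 \\ -1 \\ 5 \end{pmatrix}.$$

In the same way, any $(a,b,c)$ is reached with $t_1 = a$, $t_2 = b$, $t_3 = c$, which is exactly the spanning condition (b).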
