Linear Algebra – Find a Basis of V Containing v and w

linear-algebra, solution-verification, vector-spaces

Find a basis of $V$ containing $v$ and $w$, where $V=\mathbb{R}^4, v=(0,0,1,1), w=(1,1,1,1)$.

I am not sure how to begin, so a hint would be appreciated. I suspect I must use the following fact: if $V$ is a finite-dimensional vector space and $U$ is a subspace of $V$, then any linearly independent subset of $U$ can be enlarged to a finite basis of $U$.


Alright, here's my attempt at a solution:

$V_{\text{basis}}=\{(0,0,1,1),(1,1,1,1),(0,0,0,1),(1,0,0,0) \}$

Would this be correct? The two vectors I added were two of the standard basis vectors for $\mathbb{R}^4$.

Best Answer

Your solution works. Here's a way to approach the question systematically:

Suppose you have two linearly independent vectors $v_1,v_2$. We know that if we include the vectors $e_1,e_2,e_3,e_4$ (that is, the standard basis vectors), then the set $\{v_1,v_2,e_1,e_2,e_3,e_4\}$ spans $\mathbb R^4$. We would like to extract a basis including $v_1$ and $v_2$ from this spanning set. To do so, put the matrix $$ \begin{bmatrix} | &| &| &| &| &| \\ v_1&v_2&e_1&e_2&e_3&e_4\\ | &| &| &| &| &| \end{bmatrix} $$ into reduced row echelon form.

For your example, this matrix is $$ \left[ \begin{array}{cccccc} 0 & 1 & 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 & 1 & 0 \\ 1 & 1 & 0 & 0 & 0 & 1 \\ \end{array} \right] $$ The resulting rref matrix is $$ \left[ \begin{array}{rrrrrr} 1 & 0 & 0 & -1 & 0 & 1 \\ 0 & 1 & 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & -1 \\ \end{array} \right] $$ Now, we select only the vectors whose columns correspond to pivots in the rref form. That is, we select $\{v,w,e_1,e_3\}$. We now have a basis of $\mathbb R^4$ including $v$ and $w$.
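This pivot-selection procedure is easy to automate. Here is a sketch in Python using sympy's exact row reduction (the helper name `extend_to_basis` is mine, not part of the answer):

```python
from sympy import Matrix

def extend_to_basis(vectors, dim):
    """Extend a linearly independent list of vectors to a basis of R^dim.

    Append the standard basis vectors, row reduce, and keep the
    columns that correspond to pivots in the rref.
    """
    cols = [list(v) for v in vectors]
    # Append the standard basis e_1, ..., e_dim.
    for i in range(dim):
        e = [0] * dim
        e[i] = 1
        cols.append(e)
    # Build the matrix whose columns are these vectors, then row reduce.
    M = Matrix(cols).T
    _, pivots = M.rref()
    return [tuple(M.col(j)) for j in pivots]

basis = extend_to_basis([(0, 0, 1, 1), (1, 1, 1, 1)], 4)
# Selects v, w, e_1, e_3, matching the rref computation above.
```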

Note that the answer you receive changes depending on the order in which you place the vectors in the matrix. However, as long as you place your desired linearly independent subset first, this process is guaranteed to give you a valid basis containing that subset.

Note also that systematic approaches are mainly useful when you don't have a stroke of insight. If you want to check whether a set of $n$ vectors in $\mathbb R^n$ is linearly independent, put the vectors as columns in a matrix. Then take the determinant or row reduce. The vectors are independent iff the resulting rref is the identity matrix, which is true iff the determinant is non-zero.


Verification of asker's solution:

First of all, you could check that the solution $\{v,w,e_1,e_4\}$ is obtained by row reducing the matrix $$ \begin{bmatrix} | &| &| &| &| &| \\ v&w&e_1&e_2&e_4&e_3\\ | &| &| &| &| &| \end{bmatrix} $$ Setting that aside, there are two good methods of checking that a set of vectors is linearly independent, using (in this case) the matrix

$$ M = \begin{bmatrix} | &| &| &| \\ v&w&e_1&e_4\\ | &| &| &| \end{bmatrix} = \left[ \begin{array}{cccc} 0 & 1 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 1 \\ \end{array} \right] $$

Method 1: row reduce

Take the matrix $M$, and find its reduced row echelon form. In this case, that comes out to $$ \begin{bmatrix} 1&0&0&0\\ 0&1&0&0\\ 0&0&1&0\\ 0&0&0&1 \end{bmatrix} $$ This verifies that the matrix has full rank, which means that the set is indeed linearly independent.

Along the same lines, you could row reduce only until you have an upper triangular matrix. If the diagonal entries are all non-zero, then your matrix is invertible.

Method 2: take the determinant

Now, there are several methods for computing the determinant of a matrix, and I won't go too far into that here. Suffice it to say that in this case, the determinant comes out to $-1$. Since this result is non-zero, we conclude that the matrix has full rank, which means that its column vectors are linearly independent.
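The determinant check can likewise be done in one line with sympy (a sketch, same matrix $M$ as above):

```python
from sympy import Matrix

# Columns are v, w, e_1, e_4.
M = Matrix([
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 0, 0],
    [1, 1, 0, 1],
])

d = M.det()
# d is -1, which is non-zero, so M is invertible and its
# columns form a basis of R^4.
```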