Basis of sum of two vector spaces

linear-algebra, vector-spaces

Find a basis of the sum $ V_1 + V_2 $ of two vector spaces, where $V_1$ is given by a set of generators:

\begin{split}
V_1 = \left\langle \begin{bmatrix} 1 \\ 3 \\ 5 \\ -3 \end{bmatrix}, \begin{bmatrix} 1 \\ -1 \\ 1 \\ 5 \end{bmatrix}, \begin{bmatrix} 3 \\ 1 \\ 7 \\ 7 \end{bmatrix} \right\rangle
\end{split}

And $ V_2 $ is the space of solutions in $\mathbb{R}^4$ of the system of equations:

\begin{split}
-2x_1 + x_2 = 0 \\
2x_1 + x_2 - x_3 = 0
\end{split}

I found that a basis of $ V_1 $ (by Gaussian elimination on the matrix formed from the vectors) is:

\begin{split}
B_1 = \left\{ \begin{bmatrix} 1 \\ 3 \\ 5 \\ -3 \end{bmatrix}, \begin{bmatrix} 1 \\ -1 \\ 1 \\ 5 \end{bmatrix} \right\}
\end{split}
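As a sanity check, this elimination step can be reproduced in SymPy (my own choice of tool, not part of the original question): `columnspace` returns exactly the pivot columns of the matrix whose columns are the generators.

```python
import sympy as sp

# Generators of V_1 as the columns of a matrix
M = sp.Matrix([[ 1,  1, 3],
               [ 3, -1, 1],
               [ 5,  1, 7],
               [-3,  5, 7]])

# columnspace() returns the pivot columns of M, i.e. a basis of V_1
basis_V1 = M.columnspace()
print(basis_V1)  # the first two generators; the third is v_1 + 2*v_2
```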

And a basis for $ V_2 $:

\begin{split}
B_2 = \left\{ \begin{bmatrix} \frac{1}{4} \\ \frac{1}{2} \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 0 \\ 1 \end{bmatrix} \right\}
\end{split}
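$B_2$ can be obtained mechanically as a basis of the null space of the coefficient matrix, padded with a zero column for $x_4$ (which does not appear in the equations). A SymPy sketch, again with the library being my own choice:

```python
import sympy as sp

# Coefficient matrix of the system, with a zero column for x_4
C = sp.Matrix([[-2, 1,  0, 0],
               [ 2, 1, -1, 0]])

# nullspace() returns a basis of the solution space V_2
basis_V2 = C.nullspace()
print(basis_V2)  # (1/4, 1/2, 1, 0) and (0, 0, 0, 1), matching B_2
```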

But now I have to find a basis of the sum $ V_1 + V_2 $ and a basis of the intersection $V_1 \cap V_2$, and I don't know how to do that. Could someone explain it? Thanks in advance!

Best Answer

In general, a basis for the sum $V_1 + V_2$ can be found as follows:

  1. Find a basis for $V_1$ and a basis for $V_2$. Say, $B_1$ is a basis for $V_1$ and $B_2$ is a basis for $V_2$.

  2. Put all the vectors in $B_1$ and $B_2$ as the columns of a matrix. Reduce this matrix.

  3. Look at which columns are pivot columns. Then, the corresponding columns in the original matrix will form a basis for $V_1 + V_2$. (For example, if you reduce the matrix, and you find that columns 1 and 3 are the pivot columns, then columns 1 and 3 of the original matrix will form a basis for $V_1 + V_2$).
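For the concrete $B_1$ and $B_2$ above, steps 2-3 can be carried out in SymPy (my choice of tool; any row reducer works the same way):

```python
import sympy as sp

# Columns of B_1 followed by columns of B_2
B1 = sp.Matrix([[ 1,  1],
                [ 3, -1],
                [ 5,  1],
                [-3,  5]])
B2 = sp.Matrix([[sp.Rational(1, 4), 0],
                [sp.Rational(1, 2), 0],
                [1, 0],
                [0, 1]])
M = B1.row_join(B2)

# The pivot columns of M form a basis of V_1 + V_2
basis_sum = M.columnspace()
print(len(basis_sum))  # 3, so dim(V_1 + V_2) = 3
```

The pivots land in columns 1, 2, and 3, so the two vectors of $B_1$ together with the first vector of $B_2$ form a basis of $V_1 + V_2$.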

Why does this work? Well, note that the vectors in $B_1$ and $B_2$ together span $V_1 + V_2$. Steps 2-3 are then just the standard algorithm for extracting a basis from a spanning list: the pivot columns of the reduced matrix identify a maximal linearly independent subset of the original columns.


In general, to find a basis for $V_1 \cap V_2$, you can take a look at some of the other answers on this site. The idea is as follows:

  1. Let $A = ( B_1 \mid -B_2)$, where $B_1$ is a basis of $V_1$ and $B_2$ is a basis of $V_2$. That is, $A$ is the matrix whose columns are the vectors of $B_1$ and minus $B_2$.

  2. Find a basis for the null space of $A$ (there are many answers on this site about how to find the null space of a matrix), which will be a list of vectors $\begin{pmatrix} {x_1} \\ {y_1}\end{pmatrix}, …, \begin{pmatrix} {x_n} \\ {y_n}\end{pmatrix}$ where the ${x_i}$’s and ${y_i}$’s are vectors themselves. The length of each $x_i$ is the number of vectors in $B_1$, and the length of each $y_i$ is the number of vectors in $B_2$.

  3. Next, let $w_i := Ux_i = Vy_i$, where $U$ is the matrix $(B_1)$, and $V$ is the matrix $(B_2)$. (That is, $U$ is the matrix whose columns are the vectors in $B_1$, and $V$ is the matrix whose columns are the vectors in $B_2$.) Then $w_1, …, w_n$ is a basis for $V_1 \cap V_2$.
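Applied to this question's $B_1$ and $B_2$, steps 1-3 look like this in SymPy (again my own tooling choice):

```python
import sympy as sp

B1 = sp.Matrix([[ 1,  1],
                [ 3, -1],
                [ 5,  1],
                [-3,  5]])
B2 = sp.Matrix([[sp.Rational(1, 4), 0],
                [sp.Rational(1, 2), 0],
                [1, 0],
                [0, 1]])

# Step 1: A = (B1 | -B2)
A = B1.row_join(-B2)

# Step 2: a basis of the null space of A; each vector stacks x_i over y_i
null_basis = A.nullspace()

# Step 3: w_i = B1 * x_i (equivalently B2 * y_i)
ws = [B1 * z[:2, :] for z in null_basis]
print(ws)  # a single vector, so dim(V_1 ∩ V_2) = 1
```

The resulting vector is a scalar multiple of $(1, 2, 4, -1)^T$, and the dimensions agree with the formula $\dim(V_1 + V_2) = \dim V_1 + \dim V_2 - \dim(V_1 \cap V_2)$: here $3 = 2 + 2 - 1$.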

Why does this work? Well, first observe that for a vector $z$ to be in the intersection $V_1 \cap V_2$ is equivalent to saying that $z$ is in the intersection $\text{Range}(U) \cap \text{Range}(V)$. So to show that $w_1, …, w_n$ is a basis for $V_1 \cap V_2$, we need to show that (1) $\text{span($w_1, …, w_n$)} = \text{Range}(U) \cap \text{Range}(V)$, and that (2) $w_1, …, w_n$ is linearly independent.

Let’s show (1): Clearly, the span is contained in the intersection, because each $w_i$ is by definition in the intersection, and the intersection of subspaces is itself a subspace, hence closed under linear combinations. So this proves $\subseteq$. To show $\supseteq$, suppose $z \in \text{Range}(U) \cap \text{Range}(V)$. Say, $z = Ux’ = Vy’$ for some $x’, y’$. But $Ux’ = Vy’$ means that $Ux’ - Vy’ = 0$, which means that $\begin{pmatrix}x’ \\ y’ \end{pmatrix}$ is in the null space of $A$. So, $\begin{pmatrix}x’ \\ y’ \end{pmatrix}$ is a linear combination of $\begin{pmatrix} {x_1} \\ {y_1}\end{pmatrix}, …, \begin{pmatrix} {x_n} \\ {y_n}\end{pmatrix}$. In particular, $x’$ is the corresponding linear combination of $x_1, …, x_n$, so by linearity $Ux’$ is the same linear combination of $Ux_1, …, Ux_n$, which are just $w_1, …, w_n$. Therefore $z = Ux’$ is in the span of $w_1, …, w_n$.

Let’s show (2): Suppose $a_1w_1 + … + a_nw_n = 0$. In other words, $U(a_1x_1 + … + a_nx_n) = 0 = V(a_1y_1 + … + a_ny_n)$. Since the columns of $U$ and $V$ were defined to be the vectors of a basis, they are linearly independent, so $U$ and $V$ have trivial null spaces. Therefore, $a_1x_1 + … + a_nx_n = 0 = a_1y_1 + … + a_ny_n$. This implies $a_1\begin{pmatrix} {x_1} \\ {y_1}\end{pmatrix} +…+a_n \begin{pmatrix} {x_n} \\ {y_n}\end{pmatrix} = 0$. But since $\begin{pmatrix} {x_1} \\ {y_1}\end{pmatrix}, …, \begin{pmatrix} {x_n} \\ {y_n}\end{pmatrix}$ is linearly independent, this implies the $a_i$’s are all $0$. Therefore $w_1, …, w_n$ is linearly independent.
