Show that this set is a basis for $S_1+S_2$

linear algebra

Let $$B_1 = \{v_1, \dots, v_n, x_1, \dots, x_r\}$$
$$B_2 = \{v_1, \dots, v_n, y_1, \dots, y_s\}$$
$$B_3 = \{v_1, \dots, v_n\}$$

be bases for the subspaces $S_1$, $S_2$, and $S_1 \cap S_2$ respectively. Show that the set
$$B_4 = \{v_1, \dots, v_n, x_1, \dots, x_r, y_1, \dots, y_s\}$$
is a basis for $S_1 + S_2$.

I have managed to prove that the set spans $S_1 + S_2$; what I'm having trouble with is showing that the set is linearly independent. So here is what I did.

Let $$\underbrace{\sum_{i=1}^n \alpha_iv_i}_{\alpha} + \underbrace{\sum_{j=1}^r \beta_jx_j}_{\beta} + \underbrace{\sum_{k=1}^s \sigma_k y_k}_{\sigma} = 0$$
and so we wish to show that all the $\alpha_i$, $\beta_j$, and $\sigma_k$ are zero.

So I consider the case when $\alpha + \beta = 0$, and since $B_1$ is a basis we get that each $\alpha_i$ and $\beta_j$ is zero, and that $\sigma = 0$. Now since the set $\{y_1, \dots, y_s\}$ is a subset of the basis $B_2$, its vectors must be linearly independent as well, so the $\sigma_k$ are all zero.

For the other case we have $\alpha + \beta \neq 0$, which implies that $\sigma = -\beta - \alpha$. In other words, the linearly independent set $\{y_1, \dots, y_s\}$ spans $B_1$, making it a smaller basis, which is a contradiction.

My question is mainly: is there a more elegant proof of this fact that doesn't rely on casework, and if not, is the proof I provided valid? Thanks in advance!

Best Answer

Let

$$\sum_{i = 1}^n \alpha_i v_i + \sum_{j = 1}^r \beta_j x_j + \sum_{k = 1}^s \sigma_k y_k = 0 \tag{A}$$

Then let

$$\tag{B}v := \sum_{i = 1}^n \alpha_i v_i + \sum_{j = 1}^r \beta_j x_j$$

Then we have $v \in S_1$ and

$$-v = - \sum_{k = 1}^s \sigma_k y_k \in S_2 $$

So $v \in S_2$. This implies $v \in S_1 \cap S_2$. But then, since $B_3$ is a basis of $S_1 \cap S_2$, we have unique $\gamma_1, \cdots, \gamma_n$ such that

$$v = \sum_{i = 1}^n \gamma_i v_i \tag{C}$$

On the other hand, the representation of $v$ in equation $(\mathrm{B})$ as a linear combination of the vectors of $B_1$ is unique as well, because $B_1$ is a basis of $S_1$. Comparing $(\mathrm{B})$ with $(\mathrm{C})$, in which every $x_j$ carries coefficient zero, it follows immediately that $\alpha_i = \gamma_i$ for every $i$ and

$$\beta_1 = \beta_2 = \cdots = \beta_r = 0 \tag{D} $$

Because of $(\mathrm{D})$, equation $(\mathrm{A})$ becomes:

$$\sum_{i = 1}^n \alpha_i v_i + \sum_{k = 1}^s \sigma_k y_k = 0 \tag{E} $$

But because $B_2$ is a basis, it is in particular linearly independent, and it follows that $(\mathrm{E})$ implies

$$\alpha_1 = \alpha_2 = \cdots = \alpha_n = \sigma_1 = \sigma_2 = \cdots = \sigma_s = 0 $$

And this proves that the vectors of $B_4$ are linearly independent. $\blacksquare$
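For a concrete illustration (an example chosen here, not part of the original problem): in $\mathbb{R}^3$, take $S_1 = \operatorname{span}\{e_1, e_2\}$ and $S_2 = \operatorname{span}\{e_1, e_3\}$, so that $S_1 \cap S_2 = \operatorname{span}\{e_1\}$. Then

$$B_1 = \{e_1, e_2\}, \quad B_2 = \{e_1, e_3\}, \quad B_3 = \{e_1\}, \quad B_4 = \{e_1, e_2, e_3\},$$

and $B_4$ is indeed a basis of $S_1 + S_2 = \mathbb{R}^3$.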

Bonus: As an immediate corollary of this problem we have

$$\mathrm{dim}\; (S_1 + S_2) = \mathrm{dim}\; S_1 + \mathrm{dim}\; S_2 - \mathrm{dim}\; (S_1 \cap S_2) $$
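Indeed, since the $n + r + s$ vectors of $B_4$ were just shown to be linearly independent (hence distinct), counting basis vectors gives

$$\mathrm{dim}\;(S_1 + S_2) = n + r + s = (n + r) + (n + s) - n = \mathrm{dim}\; S_1 + \mathrm{dim}\; S_2 - \mathrm{dim}\;(S_1 \cap S_2).$$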
