Your proof that $A+B$ is convex for convex $A$ and $B$ is correct; the general case follows by induction on the number of summands.
If $A_i\subseteq\Bbb R^m$ is convex for each $i=1,\dots,n$, then $A_1\times\dots\times A_n$ is convex in $\Bbb R^{mn}$:
Let $a=(a_1,\dots,a_n)$ and $b=(b_1,\dots,b_n)$ where $a_i,b_i\in A_i$ for each $i$, and let $t\in[0,1]$. Then $tb+(1-t)a=(tb_1+(1-t)a_1,\dots,tb_n+(1-t)a_n)$ is an element of the product since each $A_i$ is convex.
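A quick numerical sanity check of the product argument, using a hypothetical example where each factor $A_i$ is a closed interval in $\Bbb R$ (so the product is a box, the simplest convex product set):

```python
import random

# Hypothetical example: each factor A_i is a closed interval [lo, hi] in R,
# so A_1 x ... x A_n is a box in R^n. Convex combinations of two points of
# the box should land back inside the box, componentwise.
intervals = [(0.0, 1.0), (-2.0, 3.0), (5.0, 7.0)]

def in_box(p):
    # p is in the product iff each coordinate lies in its interval.
    return all(lo <= x <= hi for (lo, hi), x in zip(intervals, p))

random.seed(0)
for _ in range(1000):
    a = [random.uniform(lo, hi) for lo, hi in intervals]
    b = [random.uniform(lo, hi) for lo, hi in intervals]
    t = random.random()
    combo = [t * bi + (1 - t) * ai for ai, bi in zip(a, b)]
    assert in_box(combo)  # the componentwise combination stays in the product
```

This only spot-checks the claim on random samples, of course; the proof above is what makes it hold for every $t\in[0,1]$.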
Let
$$\sum_{i = 1}^n \alpha_i v_i + \sum_{j = 1}^r \beta_j x_j + \sum_{k = 1}^s \sigma_k y_k = 0 \tag{A}$$
Then let
$$\tag{B}v := \sum_{i = 1}^n \alpha_i v_i + \sum_{j = 1}^r \beta_j x_j$$
Then we have $v \in S_1$ and
$$-v = \sum_{k = 1}^s \sigma_k y_k \in S_2 $$
So $v \in S_2$ as well, since $S_2$ is a subspace. This implies $v \in S_1 \cap S_2$. But then we have unique $\gamma_1, \cdots, \gamma_n$ such that
$$v = \sum_{i = 1}^n \gamma_i v_i \tag{C}$$
On the other hand, the representation of $v$ in equation $(\mathrm{B})$ is unique, because $B_1$ is a basis of $S_1$. Since $(\mathrm{C})$ is also a representation of $v$ in the basis $B_1$, with zero coefficients on the $x_j$, comparing coefficients gives $\alpha_i = \gamma_i$ for each $i$ and
$$\beta_1 = \beta_2 = \cdots = \beta_r = 0 \tag{D} $$
Because of $(\mathrm{D})$, equation $(\mathrm{A})$ becomes:
$$\sum_{i = 1}^n \alpha_i v_i + \sum_{k = 1}^s \sigma_k y_k = 0 \tag{E} $$
But $B_2$ is a basis, hence linearly independent, and so $(\mathrm{E})$ implies
$$\alpha_1 = \alpha_2 = \cdots = \alpha_n = \sigma_1 = \sigma_2 = \cdots = \sigma_s = 0 $$
And this proves that the vectors of $B_4$ are linearly independent. $\blacksquare$
Bonus: As an immediate corollary of this problem we have
$$\dim (S_1 + S_2) = \dim S_1 + \dim S_2 - \dim (S_1 \cap S_2) $$
Let $A,B \in S_1 \cap S_2$ and $t\in[0,1]$, where $S_1$ and $S_2$ are convex.
Since $A,B \in S_1$, we have $(1-t)A + tB \in S_1$.
Since $A,B \in S_2$, we have $(1-t)A + tB \in S_2$.
Hence the convex combination $(1-t)A + tB$ belongs to the intersection, so $S_1 \cap S_2$ is convex.
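A numerical sketch of the intersection argument, assuming two concrete convex sets chosen for illustration: the closed unit disk and the half-plane $x \ge 0$ in $\Bbb R^2$.

```python
import random

# Hypothetical example: two convex sets in R^2, the closed unit disk and
# the half-plane x >= 0. Convex combinations of points drawn from the
# intersection should remain in the intersection.
def in_disk(p):
    return p[0] ** 2 + p[1] ** 2 <= 1.0

def in_halfplane(p):
    return p[0] >= 0.0

def in_intersection(p):
    return in_disk(p) and in_halfplane(p)

random.seed(1)
# Rejection-sample points from the intersection.
points = []
while len(points) < 200:
    p = (random.uniform(-1, 1), random.uniform(-1, 1))
    if in_intersection(p):
        points.append(p)

for _ in range(1000):
    A, B = random.choice(points), random.choice(points)
    t = random.random()
    combo = ((1 - t) * A[0] + t * B[0], (1 - t) * A[1] + t * B[1])
    assert in_intersection(combo)  # the segment from A to B never leaves the set
```

The same membership test works for any pair of convex sets given as indicator functions, since the argument above never uses anything specific to the disk or the half-plane.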