[Math] Equivalent systems of linear equations

linear-algebra, self-learning

I've just begun to re-learn linear algebra because it is so important; the book I chose is, naturally, Hoffman's, for a number of reasons.

Well, in the first chapter I'm stuck on the following. Since it is at the very beginning of the book, concepts such as nullity and rank are not defined yet, nor is linear independence. My first idea was to use those anyway and analyze by cases, according to whether the only solution is the trivial one or not, but for that I would need the concept of linear independence, and I'd like to avoid using later knowledge.

Definition: Two systems of linear equations are said to be equivalent if each equation in each system is a linear combination of the equations in the other system.
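For instance (an illustration of my own, not from Hoffman), the systems $\{\,x_1-x_2=0,\ x_1+x_2=0\,\}$ and $\{\,x_1=0,\ x_2=0\,\}$ are equivalent:

$$x_1=\tfrac12(x_1-x_2)+\tfrac12(x_1+x_2),\qquad x_2=-\tfrac12(x_1-x_2)+\tfrac12(x_1+x_2),$$

and conversely

$$x_1-x_2=(1)\,x_1+(-1)\,x_2,\qquad x_1+x_2=(1)\,x_1+(1)\,x_2,$$

so each equation of either system is a linear combination of the equations of the other.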

Prove that if two homogeneous systems of linear equations in two unknowns have the same solutions, then they are equivalent.

Proof: Let us consider (1) $A_{i1}x_1+A_{i2}x_2=0$ ($1\le i \le n$) and (2) $B_{k1}x_1+B_{k2}x_2=0$ ($1\le k \le m$), and suppose that both homogeneous systems have the same solutions. We have to show that (1) is equivalent to (2), i.e., each equation in (1) can be written as a linear combination of the equations in (2), and vice versa; by symmetry, it is enough to prove one direction.

Suppose we select $m$ scalars $c_1,\dots,c_m \in \mathbb{F}$, multiply the $k^{th}$ equation in (2) by $c_k$, and then add. So we have $\sum_{k=1}^m \left(c_k B_{k1} x_1+ c_k B_{k2} x_2\right)=0$, that is, $\left(\sum_{k=1}^m c_k B_{k1}\right)x_1+\left(\sum_{k=1}^m c_k B_{k2}\right)x_2=0$.
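In other words (if I understand the definition correctly), for the $i^{th}$ equation of (1) what I need to find are scalars $c_1,\dots,c_m$ such that

$$\sum_{k=1}^m c_k B_{k1}=A_{i1}\quad\text{and}\quad \sum_{k=1}^m c_k B_{k2}=A_{i2}.$$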

And here is where I'm stuck; I know I don't have much yet. Maybe it's possible to use induction? Any suggestion would help.

PS: I've already searched here and of course there are similar questions, but every one of them uses later knowledge, and at this stage the book has not defined rank, nullity, linear independence, etc. (I know the problem is almost trivial with those tools, but I'd really like to know how the author intended it to be solved.)

Best Answer

Let's work out an easier case to get started. Suppose $ax+by=0$ and $cx+dy=0$ have the same solutions. Think of these calculus III style: the equations say $\langle a,b \rangle \cdot \langle x,y \rangle =0$ and $\langle c,d \rangle \cdot \langle x,y \rangle =0$, so $\langle a,b \rangle$ and $\langle c,d \rangle$ are both perpendicular to any common solution $\langle x,y \rangle$. It is therefore geometrically clear that there exists $k \in \mathbb{R}$ such that $\langle a,b \rangle =k\langle c,d \rangle$, and thus $ax+by = k(cx+dy)$ (which is what you want to show for two equations).
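To make this concrete, here is a worked instance with numbers of my own choosing (not from the original problem): take the one-equation systems $2x+3y=0$ and $4x+6y=0$. Both have solution set $\{\,t\langle 3,-2 \rangle : t\in\mathbb{R}\,\}$, and

$$\langle 2,3 \rangle \cdot \langle 3,-2 \rangle = 0,\qquad \langle 4,6 \rangle \cdot \langle 3,-2 \rangle = 0,\qquad \langle 4,6 \rangle = 2\,\langle 2,3 \rangle,$$

so $4x+6y = 2(2x+3y)$: each equation is a scalar multiple of the other, and the two systems are equivalent.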

Generally, I do agree that time is better spent on structure here. Much more can be gained by using linear independence and matrix structure. That is, after all, the difference between now and 200 years ago: we have the benefit of not just thinking in equations.