I presume your second system was meant to be
$$3x_1 + x_2 = 0\\
x_1 + x_2 = 0\ ,$$
and you want to know a systematic way of determining the coefficients $\frac{1}{3} \mbox{ and }\frac{4}{3}$ in your equation
$$3x_1 + x_2 = \frac{1}{3}(x_1-x_2) + \frac{4}{3}(2x_1 + x_2)\ ,$$
and corresponding coefficients in a similar expression for your fourth equation in terms of the first two. But it's not at all clear to me why you would want to do this, and if that isn't your question, then you'll need to state it more clearly.
To obtain the coefficients in your expression, simply treat them as unknowns $\alpha_1$ and $\alpha_2$ , for which you can obtain two equations from the expression they have to satisfy. You want your third equation to be expressed as $\alpha_1$ times the first plus $\alpha_2$ times the second:
$$3x_1 + x_2 = \alpha_1 (x_1-x_2) + \alpha_2(2x_1 + x_2)\ .$$
Now rearrange the right-hand side so that $x_1$ and $x_2$ are each multiplied by a single coefficient:
$$3x_1 + x_2 = (\alpha_1 + 2\alpha_2)\, x_1 + (-\alpha_1 + \alpha_2)\, x_2\ .$$
For this equation to be satisfied for all $x_1$ and $x_2$, the coefficients of $x_1$ and $x_2$ on the left must have the same values as those on the right. That is:
\begin{eqnarray}
3 &=& \alpha_1 + 2\alpha_2 \ \mbox{, and}\\
1 &=& -\alpha_1 +\alpha_2\ ,
\end{eqnarray}
which you can now solve by the method of (Gaussian) elimination.
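As a quick sanity check, the two equations for $\alpha_1$ and $\alpha_2$ can be solved exactly; here is a small Python sketch using the standard `fractions` module (the variable names are mine):

```python
from fractions import Fraction

# The system for the unknown coefficients:
#    alpha1 + 2*alpha2 = 3
#   -alpha1 +   alpha2 = 1
a11, a12, b1 = 1, 2, 3
a21, a22, b2 = -1, 1, 1

# Cramer's rule on the 2x2 system, kept exact with Fraction
det = a11 * a22 - a12 * a21            # = 3, non-zero, so a unique solution
alpha1 = Fraction(b1 * a22 - a12 * b2, det)
alpha2 = Fraction(a11 * b2 - b1 * a21, det)
print(alpha1, alpha2)  # 1/3 4/3

# Verify: alpha1*(x1 - x2) + alpha2*(2*x1 + x2) == 3*x1 + x2 at sample points
for x1, x2 in [(1, 0), (0, 1), (2, -5)]:
    assert alpha1 * (x1 - x2) + alpha2 * (2 * x1 + x2) == 3 * x1 + x2
```

Using exact rationals rather than floats keeps the recovered coefficients $\frac{1}{3}$ and $\frac{4}{3}$ free of rounding noise.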
Per the discussion in the comments, it seems that your solution is incorrect.
Here is one correct approach to the problem. First of all, we consider $3$ separate cases.
Case 1: If every pair of numbers solves $E_1$, then the equations of $E_1$ must all be of the form $0x_1 + 0x_2 = 0$. If $E_2$ has the same set of solutions, it must also consist of only zero equations, which means that it is equivalent to $E_1$.
Case 2: Suppose that the only solution to $E_1$ is $(0,0)$. Show that $E_1$ is equivalent to the system of equations $E_0$ defined by
$$
1x_1 + 0x_2 = 0, \qquad 0x_1 + 1x_2 = 0.
$$
If $E_1$ and $E_2$ both have $(0,0)$ as their only solution, then we can say that $E_1$ is equivalent to $E_0$, which is equivalent to $E_2$. It follows that $E_1$ is equivalent to $E_2$.
Case 3: Suppose that $E_1$ has at least one non-zero equation, and also has a non-zero solution $x_1 = a, x_2 = b$. Show that $E_1$ is equivalent to the system of equations $E_0$ whose only equation is
$$
-bx_1 + ax_2 = 0.
$$
As before: $E_1$ is equivalent to $E_0$, which is equivalent to $E_2$. It follows that $E_1$ is equivalent to $E_2$.
Filling in the blanks:
Useful facts:
- Any single equation has a non-zero solution
- If two equations share a non-zero solution, then one is a multiple of the other
This is easy to understand if you think of each solution set to a non-zero equation $Ax + By = 0$ as a line through the origin: any single line contains a point outside the origin, and there is exactly one line between any two points.
A third fact:
- If $a,b$ are not both zero, then $cx_1 + dx_2$ is a multiple of $ax_1 + bx_2$ if and only if $ad = bc$.
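This criterion is easy to test mechanically; a minimal Python sketch (the function name is mine):

```python
def is_multiple(a, b, c, d):
    """Return True iff c*x1 + d*x2 is a scalar multiple of a*x1 + b*x2,
    assuming (a, b) != (0, 0).  The criterion is a*d == b*c."""
    return a * d == b * c

# 3*x1 - 3*x2 is 3 times (x1 - x2):
print(is_multiple(1, -1, 3, -3))  # True
# 3*x1 + x2 is not a multiple of x1 - x2:
print(is_multiple(1, -1, 3, 1))   # False
```

Note that $ad = bc$ avoids dividing by a possibly-zero coefficient, which is exactly why it is preferred over comparing ratios.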
Case 3: Because $(a,b)$ is a solution to $E_1$, we have
$$
A_{i1}a + A_{i2}b = 0 \implies A_{i1} \cdot a = A_{i2} \cdot (-b).
$$
By point 3 above, this means that the $i$th equation is a multiple of $-bx_1 + ax_2 = 0$. In other words, every equation in $E_1$ is a multiple of the only equation in $E_0$. Conversely, we know that $E_1$ must have a non-zero equation (say the $i$th equation); since that equation is a non-zero multiple of $-bx_1 + ax_2 = 0$, the equation of $E_0$ is in turn a multiple of the $i$th equation of $E_1$. So, $E_0$ and $E_1$ are equivalent.
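To make Case 3 concrete, here is a small check on a made-up system (the sample rows are mine): every equation that vanishes at $(a,b) = (2,-1)$ is a multiple of $-bx_1 + ax_2 = x_1 + 2x_2$.

```python
a, b = 2, -1                 # a non-zero solution of E_1
E0 = (-b, a)                 # the single equation of E_0: 1*x1 + 2*x2 = 0

# A sample E_1: every row vanishes at (a, b) = (2, -1)
E1 = [(1, 2), (3, 6), (-2, -4), (0, 0)]
for Ai1, Ai2 in E1:
    assert Ai1 * a + Ai2 * b == 0      # (a, b) really is a solution
    # fact 3: (Ai1, Ai2) is a multiple of (-b, a) iff (-b)*Ai2 == a*Ai1
    assert E0[0] * Ai2 == E0[1] * Ai1
```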
Case 2: We want to show that every equation in $E_1$ can be written as a linear combination of the equations in $E_0$, and that every equation in $E_0$ can be written as a linear combination of the equations in $E_1$.
The first statement is easy to show (try it, let me know if you're stuck). For the second statement: we know that $E_1$ has $(0,0)$ as its only solution. Using points 1 and 2 above, we argue that $E_1$ contains at least two non-zero equations that are not multiples of each other. In other words, there must exist $i,j$ such that $A_{i1}A_{j2} \neq A_{i2}A_{j1}$ (a convenient way to rewrite $\frac{A_{i1}}{A_{i2}} \neq \frac{A_{j1}}{A_{j2}}$ that still works if one of the denominators is zero).
Let $D = A_{i1}A_{j2} - A_{i2}A_{j1}$. Verify that adding $\frac{A_{j2}}{D}$ times the $i$th equation and $-\frac{A_{i2}}{D}$ times the $j$th equation produces one equation of $E_0$. Adding $-\frac{A_{j1}}{D}$ times the $i$th equation and $\frac{A_{i1}}{D}$ times the $j$th equation produces the second.
So, $E_1$ is equivalent to $E_0$.
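The coefficients in Case 2 can be verified on a concrete pair of equations, again with exact rationals (the sample rows are mine):

```python
from fractions import Fraction

# Two equations of E_1 with A_i1*A_j2 != A_i2*A_j1 (sample values)
Ai = (1, 2)   # x1 + 2*x2 = 0
Aj = (3, 4)   # 3*x1 + 4*x2 = 0
D = Ai[0] * Aj[1] - Ai[1] * Aj[0]   # = -2, non-zero

def combine(c_i, c_j):
    """Coefficients of c_i*(i-th equation) + c_j*(j-th equation)."""
    return (c_i * Ai[0] + c_j * Aj[0], c_i * Ai[1] + c_j * Aj[1])

# (A_j2/D)*eq_i + (-A_i2/D)*eq_j should give  1*x1 + 0*x2 = 0
assert combine(Fraction(Aj[1], D), Fraction(-Ai[1], D)) == (1, 0)
# (-A_j1/D)*eq_i + (A_i1/D)*eq_j should give  0*x1 + 1*x2 = 0
assert combine(Fraction(-Aj[0], D), Fraction(Ai[0], D)) == (0, 1)
```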
Best Answer
Sure, they are equivalent. But what does it mean to “express each equation in each system as a linear combination of the equations in the other system”? We may express a vector as a linear combination of other vectors, but what about equations? Surely they are not vectors.
Have a look at the left-hand side of the first equation, namely $x_1-x_2$. This may be viewed as a linear form sending a vector $(a,b)$ to the scalar $a-b$, and the solution set of the equation $x_1-x_2=0$ is the kernel of that linear form.
Now the solution set of a homogeneous linear system is the intersection of the kernels of the linear forms involved. So the task is to express the linear forms of each system in terms of those of the other. Since a finite-dimensional vector space is isomorphic to its dual space, you may view the linear form $x_1-x_2$ as the vector $(1,-1)^T$. To express the first linear form of the second system in terms of the forms of the first, solve $$r\begin{pmatrix}1\\ -1\end{pmatrix}+s\begin{pmatrix}2\\ 1\end{pmatrix}= \begin{pmatrix}3\\ 1\end{pmatrix}$$ to find $$\frac{1}{3}\begin{pmatrix}1\\ -1\end{pmatrix}+\frac{4}{3}\begin{pmatrix}2\\ 1\end{pmatrix}= \begin{pmatrix}3\\ 1\end{pmatrix}.$$