Note that you should make the vectors spanning the sum a little more explicit.
Note that it is sometimes easier to think of $L_{1},L_{2}$ as the solution sets of $$\begin{pmatrix}1 & -1 & 0 & -1 \\ 1 & 1 & 1 & 1 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \\\end{pmatrix}\begin{pmatrix}x_{1} \\ x_{2} \\ x_{3} \\ x_{4}\end{pmatrix} = \begin{pmatrix}0 \\ 0 \\ 0 \\ 0\end{pmatrix}$$ and $$\begin{pmatrix}1 & -1 &1 & -1 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0& 0 & 0 & 0\end{pmatrix}\begin{pmatrix}x_{1} \\ x_{2} \\ x_{3} \\ x_{4}\end{pmatrix} = \begin{pmatrix}0 \\ 0 \\ 0 \\ 0\end{pmatrix}$$
This representation is useful for estimating the dimensions of those subspaces a priori, since linear independence can be checked quite easily. It lets us study $L_{1},L_{2}$ as the kernels of the associated matrices, so their dimensions can be computed via known results such as the rank-nullity theorem.
Additionally, I think a good approach is to estimate the dimension of the intersection and determine a basis using Grassmann's identity
$$\dim(L_{1}+L_{2}) = \dim(L_{1})+\dim(L_{2})-\dim({L_{1} \cap L_{2}})$$
From here you can immediately observe that $\dim L_{1} = 1$: the first matrix has an invertible $3 \times 3$ minor, so its rank is $3$, and by the rank-nullity theorem its kernel has dimension $4 - 3 = 1$. Similarly, $\dim L_{2} = 2$: the last two rows are zero, so the rank is at most $2$, and an invertible $2 \times 2$ minor shows it is exactly $2$, giving a kernel of dimension $4 - 2 = 2$. Since our space is $\mathbb{R}^{4}$ and the dimensions sum to $3$, the sum could be direct.
Since the intersection is a subspace of both $L_{1}$ and $L_{2}$, its dimension is $0$ or $1$, because in particular it is a subspace of $L_{1}$.
To determine the exact dimension we can proceed as above: take a basis of $L_{1}$ and of $L_{2}$, form the matrix whose columns are those vectors, and compute its rank. If the rank is $3$, the vectors are linearly independent, hence the intersection is trivial and the sum is direct. Otherwise the intersection is nontrivial and must coincide with $L_{1}$; in that case, to find a basis for the sum you just have to take a vector $w \in L_{2}$ with $w \not\in L_{1}$ and extend a basis of $L_{1}$ with $w$.
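The dimension counts above can also be checked numerically. Here is a minimal sketch using NumPy, applying rank-nullity to the two coefficient matrices written above (the variable names are my own):

```python
import numpy as np

# Coefficient matrices of the two homogeneous systems above
A1 = np.array([[1, -1,  0, -1],
               [1,  1,  1,  1],
               [0,  1, -1,  0],
               [0,  0,  0,  0]], float)
A2 = np.array([[1, -1,  1, -1],
               [1,  1,  0,  0],
               [0,  0,  0,  0],
               [0,  0,  0,  0]], float)

# Rank-nullity: dim ker = number of unknowns - rank
dim_L1 = 4 - np.linalg.matrix_rank(A1)
dim_L2 = 4 - np.linalg.matrix_rank(A2)

# A vector lies in L1 ∩ L2 iff it solves both systems,
# i.e. it is in the kernel of the stacked matrix
dim_inter = 4 - np.linalg.matrix_rank(np.vstack([A1, A2]))

# Grassmann's identity
dim_sum = dim_L1 + dim_L2 - dim_inter
print(dim_L1, dim_L2, dim_inter, dim_sum)
```

If `dim_inter` comes out $0$, the sum is direct, matching the rank test described above.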
If you write your systems of equations as a matrix as follows:
$$A \vec{x} = \begin{bmatrix} 1 & -3 & 2 \\ 1 & 3 & -2 \\ 0 & -6 & 4 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} -2 \\ 5 \\ 3\end{bmatrix} = \vec{b}$$
then here is a (perhaps) quicker way to determine if the picture looks like the triangle. Note: I don't know how comfortable you are with basic linear algebra concepts, but you only need them to understand the proof of why this is correct. You can apply the method without any understanding of them.
$1$. If all three normal vectors of the planes are multiples of the same vector, then you can immediately conclude you have three parallel
planes (and not the triangle).
$2$. If exactly two normal vectors are multiples of the same vector, then you can immediately conclude you don't have the triangle.
Instead, you have one plane that is cut by two parallel planes.
$3$. If none of the normal vectors are multiples of each other, then it's possible you have the triangle. As you noted, the normal vectors
must be in the same plane, i.e. linearly dependent, so it must follow
that $\det(A) = 0$. If this isn't the case, then you can immediately
conclude that the planes intersect in one point.
$4$. If there is a solution, then $\vec{b}$ should be a linear combination of two linearly independent columns of $A$. (This is because $A \vec{x}$ is just a linear combination of $A$'s columns. If there is a
solution to $A \vec{x} = \vec{b}$ and $A$ has two linearly independent
columns, then $\vec{b}$ should be able to be written as a linear
combination of just those two columns.) Thus, if we replace a linearly
dependent column (i.e. one that can be expressed as a linear
combination of the others) of $A$ with the vector $\vec{b}$ to create
the matrix $A'$, for there to be no solution (i.e. the "triangle"
configuration) it must be the case that $\det(A') \neq 0$. If
$\det(A') = 0$, then you can conclude you have three planes
intersecting in one line (the second picture you've posted).
Fortunately, choosing a linearly dependent column is easy. You just need to make sure to a) replace a zero column with $\vec{b}$ if $A$ has a zero column, or b) if two columns are (nonzero) multiples of each other, replace one of them with $\vec{b}$. If neither a) nor b) is the case, then you can choose any column.
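The steps above can be sketched as a small decision procedure. This is only an illustration: the function name `classify_planes` is my own, and instead of cases a) and b) I use an equivalent rank test (a column may be replaced by $\vec{b}$ exactly when the other two columns are linearly independent):

```python
import numpy as np

def classify_planes(A, b, tol=1e-9):
    """Classify three planes given as rows of A x = b.
    Assumes no two rows describe the same plane."""
    n = [A[0], A[1], A[2]]
    # Two normals are parallel iff their cross product is ~0
    par = [np.allclose(np.cross(n[i], n[j]), 0, atol=tol)
           for i, j in [(0, 1), (0, 2), (1, 2)]]
    if all(par):                       # step 1
        return "three parallel planes"
    if any(par):                       # step 2
        return "one plane cut by two parallel planes"
    if abs(np.linalg.det(A)) > tol:    # step 3
        return "single intersection point"
    # step 4: replace a linearly dependent column with b
    # (here rank(A) = 2, so some pair of columns is independent)
    for j in range(3):
        others = np.delete(A, j, axis=1)
        if np.linalg.matrix_rank(others, tol=tol) == 2:
            A2 = A.copy()
            A2[:, j] = b
            return ("triangle" if abs(np.linalg.det(A2)) > tol
                    else "common line")

A = np.array([[1, -3,  2],
              [1,  3, -2],
              [0, -6,  4]], float)
b = np.array([-2, 5, 3], float)
print(classify_planes(A, b))  # the example system from the question
```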
Example: I'll work thru the steps above with the example you've written.
Steps $1$ and $2$. I can immediately notice that none of the normal vectors of the planes are parallel. So we proceed to step $3$.
Step $3$. We can calculate
$$\det(A) = (1)(12 - 12) - (-3)(4 - 0) + 2(-6 - 0) = 0$$
so we proceed to step $4$. Note that if you were able to observe that the third row of $A$ was a linear combination of the first and second row (the third row is simply the first row minus the second row) or that the third column was a multiple of the second column, you could immediately skip to step $4$.
Step $4$. We can notice that none of the columns are zeroes (case a), but in fact the last two columns are multiples of each other. So case b) applies here, and we have to exchange one of the last two columns with $\vec{b}$ for the process to be correct. Let's replace the last column of $A$ with $\vec{b}$ to obtain $A'$:
$$A' = \begin{bmatrix} 1 & -3 & -2 \\ 1 & 3 & 5 \\ 0 & -6 & 3 \end{bmatrix}$$
and we can calculate
$$\det (A') = (1)(9 + 30) - (-3)(3 - 0) + (-2)(-6 - 0) = 39 + 9 + 12 = 60 \neq 0$$
and hence we can conclude we have the "triangle" configuration.
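As a quick arithmetic check, the two determinants in this example can be recomputed numerically; a short sketch with NumPy, using the matrices from the worked example:

```python
import numpy as np

A = np.array([[1, -3,  2],
              [1,  3, -2],
              [0, -6,  4]], float)
b = np.array([-2, 5, 3], float)

print(np.linalg.det(A))        # step 3: zero (up to rounding)

# step 4: the last two columns are multiples of each other,
# so replace the last column with b
A_prime = A.copy()
A_prime[:, 2] = b
print(np.linalg.det(A_prime))  # step 4: 60, hence the "triangle"
```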
Conclusion: I think this method is somewhat easier than calculating the three intersection lines. It requires you to calculate two determinants of $3 \times 3$ matrices instead.
Best Answer
You can't conclude it from $t\ne s$. However, in the third coordinate you require $3 + 0t = 1 + 0s$, which cannot be satisfied, and from that you can conclude it. You say these constants are both $0$, but the first is $3$.