Each linear equation represents a line in the plane. Most of the time two lines intersect in one point, which is the simultaneous solution you seek. If the two lines have exactly the same slope, they either never meet (so there is no solution) or they are the same line (so every point on it is a solution). When you add a third equation into the mix, that is another line. It is unlikely to pass through the point that solves the first two equations, but it might.
That's one of the main reasons why linear algebra was invented!
First we translate the problem into matrices: if
$$
\mathbf{A}=\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 2 \\
1 & 1 & 3
\end{bmatrix}
\qquad
\mathbf{x}=\begin{bmatrix} x \\ y \\ z \end{bmatrix}
\qquad
\mathbf{b}=\begin{bmatrix} 1 \\ 3 \\ -1 \end{bmatrix}
$$
then the system can be rewritten as $\mathbf{A}\mathbf{x}=\mathbf{b}$. This is not in itself a great simplification, but it lets us treat the unknowns as a single object.
A big advance is obtained by interpreting this in terms of linear maps. The matrix $\mathbf{A}$ induces a linear map $f_{\mathbf{A}}\colon\mathbb{R}^3\to\mathbb{R}^3$ defined by
$$
f_{\mathbf{A}}(\mathbf{v})=\mathbf{A}\mathbf{v}
$$
and now solvability of the linear system becomes the question
does the vector $\mathbf{b}$ belong to the image of $f_{\mathbf{A}}$?
The image $\operatorname{Im}(f_{\mathbf{A}})$ is a vector subspace of $\mathbb{R}^3$; if it has dimension $3$, then clearly the system is solvable. But what if the dimension is less than $3$?
This is the “obstruction” for the solvability: when the dimension of the image (the rank of the linear map and of the matrix $\mathbf{A}$) is less than the dimension of the codomain (in your case $3$) the system can be solvable or not, depending on whether $\mathbf{b}$ belongs to the image or not.
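This rank criterion can be checked numerically. The following sketch (assuming NumPy is available; it is not part of the argument above) compares the rank of $\mathbf{A}$ with the rank of the augmented matrix $[\mathbf{A}\mid\mathbf{b}]$; the system is solvable exactly when the two ranks agree:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [1.0, 1.0, 3.0]])
b = np.array([1.0, 3.0, -1.0])

rank_A = np.linalg.matrix_rank(A)                          # dimension of the image of f_A
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))   # rank of the augmented matrix [A | b]

# rank_A == 2 < 3, and rank_Ab == 3, so b does not lie in the image:
# the system is not solvable.
print(rank_A, rank_Ab)
```

Appending $\mathbf{b}$ as an extra column can only raise the rank if $\mathbf{b}$ lies outside the column span, which is why comparing the two ranks answers the solvability question.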
There is no “general answer” that allows just looking at $\mathbf{A}$ and $\mathbf{b}$ and telling whether the system is solvable. Rather, there are efficient techniques that show whether the system has a solution without actually solving it. A very good one is performing elementary row operations, because these correspond to multiplying both sides of the system by an invertible matrix. In the present case, we do
\begin{align}
\left[\begin{array}{ccc|c}
1 & 1 & 1 & 1 \\
1 & 1 & 2 & 3\\
1 & 1 & 3 & -1
\end{array}\right]
&\to
\left[\begin{array}{ccc|c}
1 & 1 & 1 & 1 \\
0 & 0 & 1 & 2\\
0 & 0 & 2 & -2
\end{array}\right]
&&\begin{aligned} R_2&\gets R_2-R_1 \\ R_3&\gets R_3-R_1 \end{aligned}
\\&\to
\left[\begin{array}{ccc|c}
1 & 1 & 1 & 1 \\
0 & 0 & 1 & 2\\
0 & 0 & 0 & -6
\end{array}\right]
&&R_3\gets R_3-2R_2
\end{align}
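The same elimination can be reproduced directly on the augmented matrix; this sketch (assuming NumPy) mirrors the row operations above step by step:

```python
import numpy as np

# Augmented matrix [A | b]
M = np.array([[1.0, 1.0, 1.0,  1.0],
              [1.0, 1.0, 2.0,  3.0],
              [1.0, 1.0, 3.0, -1.0]])

M[1] -= M[0]        # R2 <- R2 - R1
M[2] -= M[0]        # R3 <- R3 - R1
M[2] -= 2 * M[1]    # R3 <- R3 - 2*R2

# The last row reads [0, 0, 0, -6], i.e. 0 = -6: the system is inconsistent.
print(M[2])
```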
At this stage we know that the system is not solvable. We also know that the rank of $\mathbf{A}$ is $2$ and even that the image is spanned by the vectors
$$
\begin{bmatrix}1\\1\\1\end{bmatrix}
\qquad
\begin{bmatrix}1\\2\\3\end{bmatrix}
$$
This is easy for the present situation, but the method can be applied to systems of any size, not necessarily with as many equations as unknowns.
The same row elimination shows that if the vector $\mathbf{b}$ had been
$$
\begin{bmatrix} 1 \\ 3 \\ 5 \end{bmatrix}
$$
then the system would be solvable.
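This can be confirmed with the same rank comparison as before: a sketch (assuming NumPy) showing that with this right-hand side the ranks agree, and that a least-squares solve then returns an exact solution:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [1.0, 1.0, 3.0]])
b_new = np.array([1.0, 3.0, 5.0])

# The ranks agree, so b_new lies in the image and the system is consistent.
same_rank = (np.linalg.matrix_rank(A)
             == np.linalg.matrix_rank(np.column_stack([A, b_new])))

# Since b_new is in the image, the least-squares solution is an exact solution.
x, *_ = np.linalg.lstsq(A, b_new, rcond=None)
exact = np.allclose(A @ x, b_new)
print(same_rank, exact)
```

Note that `np.linalg.solve` would fail here even though the system is consistent, because $\mathbf{A}$ is singular; `lstsq` handles the rank-deficient case.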
Seen in a different way, the system is solvable if and only if
$$
\mathbf{b}=\alpha\begin{bmatrix}1\\1\\1\end{bmatrix}
+\beta\begin{bmatrix}1\\2\\3\end{bmatrix}
$$
for some $\alpha$ and $\beta$.
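For a concrete right-hand side such as the $\begin{bmatrix}1&3&5\end{bmatrix}^{\mathsf T}$ used above, the coefficients $\alpha$ and $\beta$ can be computed by solving the $3\times 2$ system whose columns are the two spanning vectors. A sketch, assuming NumPy:

```python
import numpy as np

# Columns spanning the image of f_A
S = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b_new = np.array([1.0, 3.0, 5.0])

# Solve S @ [alpha, beta] = b_new in the least-squares sense;
# a zero residual confirms b_new lies in the span.
coeffs, *_ = np.linalg.lstsq(S, b_new, rcond=None)
alpha, beta = coeffs

# Check: alpha ≈ -1, beta ≈ 2 reproduces b_new exactly.
print(np.allclose(alpha * S[:, 0] + beta * S[:, 1], b_new))
```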
Best Answer
Each of the equations describes a plane. Generally, three such equations determine the single point that belongs to all three planes. But since, in your case, the third equation adds no new information, what you have describes the intersection of two distinct non-parallel planes, which is a line.