For your example, you have a "nearly" linear system, because only one equation is quadratic. In that case you can express all solutions of the linear part of the system in the form $x_0+\alpha x_h$, where $x_0$ is a particular solution and $x_h$ spans the kernel of the linear part, substitute that expression into the quadratic equation, and solve the resulting equation for $\alpha$.
If you apply this technique to your system, you get $v_e=v_a-\alpha$, $v_s=\frac{m_p}{m_s}\alpha$ and $\omega=\frac{lm_p}{2I}\alpha$. If you substitute this into the quadratic equation
$$m_p (v_a^2-v_e^2) = m_s v_s^2 + I \omega^2$$
you get
$$m_p (2v_a-\alpha)\alpha = \frac{m_p^2}{m_s}\alpha^2 + \frac{l^2m_p^2}{4I}\alpha^2$$
One solution of this quadratic equation is obviously $\alpha=0$, but I guess you are more interested in the other solution. Assuming $m_p\alpha\neq0$, we can divide by $m_p\alpha$ to get
$2v_a-\alpha = \frac{m_p}{m_s}\alpha + \frac{l^2m_p}{4I}\alpha$. This equation is linear in $\alpha$, so
$$\alpha = \frac{2v_a}{1+\frac{m_p}{m_s}+\frac{l^2m_p}{4I}}.$$
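As a quick sanity check, the whole substitution can be verified with exact rational arithmetic. The parameter values below are hypothetical (the original problem states no numbers); the point is only that the computed $\alpha$ makes the quadratic equation hold exactly.

```python
from fractions import Fraction as F

# Hypothetical sample values for the parameters (not from the original problem):
m_p, m_s, I, l, v_a = F(2), F(5), F(3), F(4), F(7)

# Closed-form solution of  2*v_a - alpha = (m_p/m_s)*alpha + (l^2*m_p/(4I))*alpha
alpha = 2 * v_a / (1 + m_p / m_s + l**2 * m_p / (4 * I))

# Back-substitute into the solution of the linear part of the system
v_e = v_a - alpha
v_s = m_p / m_s * alpha
omega = l * m_p / (2 * I) * alpha

# The quadratic equation should now hold exactly
assert m_p * (v_a**2 - v_e**2) == m_s * v_s**2 + I * omega**2
print("alpha =", alpha)
```

With `Fraction` there is no floating-point error, so the assertion confirms the algebra rather than an approximation of it.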
That's one of the main reasons why linear algebra was invented!
First we translate the problem into matrices: if
$$
\mathbf{A}=\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 2 \\
1 & 1 & 3
\end{bmatrix}
\qquad
\mathbf{x}=\begin{bmatrix} x \\ y \\ z \end{bmatrix}
\qquad
\mathbf{b}=\begin{bmatrix} 1 \\ 3 \\ -1 \end{bmatrix}
$$
then the system can be rewritten as $\mathbf{A}\mathbf{x}=\mathbf{b}$. This is not a great simplification by itself, but it allows us to treat the unknowns as a single object.
A big advance is obtained by interpreting this in terms of linear maps. The matrix $\mathbf{A}$ induces a linear map $f_{\mathbf{A}}\colon\mathbb{R}^3\to\mathbb{R}^3$ defined by
$$
f_{\mathbf{A}}(\mathbf{v})=\mathbf{A}\mathbf{v}
$$
and now solvability of the linear system becomes the question: does the vector $\mathbf{b}$ belong to the image of $f_{\mathbf{A}}$?
The image $\operatorname{Im}(f_{\mathbf{A}})$ is a vector subspace of $\mathbb{R}^3$; if it has dimension $3$, then clearly the system is solvable. But what if the dimension is less than $3$?
This is the “obstruction” to solvability: when the dimension of the image (the rank of the linear map, and of the matrix $\mathbf{A}$) is less than the dimension of the codomain (in your case $3$), the system may or may not be solvable, depending on whether $\mathbf{b}$ belongs to the image.
There is no “general answer” that lets you just look at $\mathbf{A}$ and $\mathbf{b}$ and tell whether the system is solvable. Rather, there are efficient techniques that decide whether the system has a solution without actually solving it. A very good one is performing elementary row operations, because these correspond to multiplying both sides of the system by an invertible matrix. In the present case, we do
\begin{align}
\left[\begin{array}{ccc|c}
1 & 1 & 1 & 1 \\
1 & 1 & 2 & 3\\
1 & 1 & 3 & -1
\end{array}\right]
&\to
\left[\begin{array}{ccc|c}
1 & 1 & 1 & 1 \\
0 & 0 & 1 & 2\\
0 & 0 & 2 & -2
\end{array}\right]
&&\begin{aligned} R_2&\gets R_2-R_1 \\ R_3&\gets R_3-R_1 \end{aligned}
\\&\to
\left[\begin{array}{ccc|c}
1 & 1 & 1 & 1 \\
0 & 0 & 1 & 2\\
0 & 0 & 0 & -6
\end{array}\right]
&&R_3\gets R_3-2R_2
\end{align}
At this stage we know that the system is not solvable. We also know that the rank of $\mathbf{A}$ is $2$ and even that the image is spanned by the vectors
$$
\begin{bmatrix}1\\1\\1\end{bmatrix}
\qquad
\begin{bmatrix}1\\2\\3\end{bmatrix}
$$
This is easy for the present situation, but the method can be applied to systems of any size, not necessarily with as many equations as unknowns.
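The same consistency test can be sketched in pure Python, using exact fractions so the row reduction introduces no rounding. The function name `is_solvable` and the overall shape are my own choices, not part of the original answer; it simply row-reduces the augmented matrix and looks for a row of the form $(0\;\cdots\;0\mid c)$ with $c\neq 0$.

```python
from fractions import Fraction as F

def is_solvable(A, b):
    """Row-reduce the augmented matrix [A | b] with exact arithmetic
    and report whether the system A x = b is consistent."""
    M = [[F(x) for x in row] + [F(y)] for row, y in zip(A, b)]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols - 1):          # never pivot on the b-column
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * p for a, p in zip(M[i], M[r])]
        r += 1
    # Inconsistent iff some row reads (0 ... 0 | nonzero)
    return all(any(x != 0 for x in row[:-1]) or row[-1] == 0 for row in M)

A = [[1, 1, 1], [1, 1, 2], [1, 1, 3]]
print(is_solvable(A, [1, 3, -1]))  # original right-hand side -> False
print(is_solvable(A, [1, 3, 5]))   # modified right-hand side -> True
```

Nothing here depends on the matrix being square, matching the remark that the method applies to systems of any size.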
The same row elimination shows that if the vector $\mathbf{b}$ had been
$$\begin{bmatrix} 1 \\ 3 \\ 5 \end{bmatrix}$$
then the system would be solvable.
Seen in a different way, the system is solvable if and only if
$$
\mathbf{b}=\alpha\begin{bmatrix}1\\1\\1\end{bmatrix}
+\beta\begin{bmatrix}1\\2\\3\end{bmatrix}
$$
for some $\alpha$ and $\beta$.
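As a worked check, for the solvable right-hand side $(1,3,5)^T$ considered above, $\alpha$ and $\beta$ can be found from the first two coordinates and then verified on the third:

```latex
\alpha+\beta=1,\qquad \alpha+2\beta=3
\;\Rightarrow\; \beta=2,\ \alpha=-1,
\qquad\text{and indeed}\quad \alpha+3\beta=-1+6=5.
```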
Best Answer
You can get your answer using simple elementary row operations:
$$\left[\begin{array}{cccc|c}
1 & 2 & 3 & -3 & a\\
2 & -5 & -3 & 12 & b \\
7 & 1 & 8 & 5 & c
\end{array}\right]
\xrightarrow{\substack{R_2\gets R_2-2R_1\\ R_3\gets R_3-7R_1}}
\left[\begin{array}{cccc|c}
1 & 2 & 3 & -3 & a\\
0 & -9 & -9 & 18 & b-2a \\
0 & -13 & -13 & 26 & c-7a
\end{array}\right]$$
$$\xrightarrow{R_3\gets R_3-\frac{13}{9}R_2}
\left[\begin{array}{cccc|c}
1 & 2 & 3 & -3 & a\\
0 & -9 & -9 & 18 & b-2a \\
0 & 0 & 0 & 0 & c-7a-\frac{13}{9}(b-2a)
\end{array}\right]$$
Now, notice that the last row of the matrix is a zero row, so the system has a solution only if the corresponding right-hand side entry is also zero. Thus
$$c-7a - \frac{13}{9}(b-2a) =0$$ $$-\frac{37}{9}a-\frac{13}{9}b+c=0$$ $$-37a -13b +9c=0$$ $$37a+13b=9c$$
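The condition $37a+13b=9c$ can be checked numerically by replaying the three row operations on the right-hand side alone. The helper name `last_row_rhs` and the sample values are mine, purely for illustration:

```python
from fractions import Fraction as F

def last_row_rhs(a, b, c):
    """Apply the row operations from the answer to the right-hand side
    and return the entry next to the final zero row."""
    r2 = b - 2 * a               # R2 <- R2 - 2 R1
    r3 = c - 7 * a               # R3 <- R3 - 7 R1
    return r3 - F(13, 9) * r2    # R3 <- R3 - (13/9) R2

# Hypothetical sample values: pick a, b freely, then c so that 37a + 13b = 9c
a, b = F(1), F(2)
c = (37 * a + 13 * b) / 9
print(last_row_rhs(a, b, c))      # 0 -> consistent
print(last_row_rhs(a, b, c + 1))  # nonzero -> inconsistent
```

Any $(a,b,c)$ satisfying the derived condition makes the last entry vanish, and any perturbation of $c$ breaks it, which is exactly what the elimination predicts.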