[Math] Checking if a System of Polynomial Equations is Consistent

algebraic-geometry, polynomials, systems-of-equations

I'm trying to determine whether any solutions exist to a system of $(n+1)$ polynomial equations in $n$ unknowns.
For example:
$$
\begin{align*}
xy&=-2\\
x^2-1&=0\\
y^3-3y^2+2y&=0
\end{align*}
$$
This example reflects the structure of my actual system: the first equation contains mixed terms with all variables represented, the remaining $n$ equations are univariate, and the maximum degree of any term (say, $D$) is in the low single digits.
The actual system is very large ($n$ is $10^3$ to $10^4$), but $D$ is similar.
The first equation also typically has many terms.

In this example, the system is in fact consistent ($x=-1$, $y=2$).
If the first equation were instead $xy=-3$, however, the system would be inconsistent.
My naive approach is to find the solutions of the last $n$ equations (here $x\in\{-1,1\}$, $y\in\{0,1,2\}$) and try all combinations of them until one satisfies the first equation or the combinations are exhausted, but this requires roughly $D^n$ checks, which isn't feasible for large systems; a sketch of this brute-force check is given below.
I also don't need to find a solution, only to prove whether one exists.
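
For concreteness, here is a minimal sketch of that brute-force check in Python using SymPy (the library choice and variable names are my own): it collects the roots of each univariate equation and then tests every combination against the first equation, i.e. exactly the $D^n$ enumeration described above.

```python
from itertools import product
import sympy as sp

x, y = sp.symbols("x y")

# First (mixed) equation and the n univariate equations of the toy system.
mixed_eq = sp.Eq(x * y, -2)
univariate_eqs = {x: sp.Eq(x**2 - 1, 0),
                  y: sp.Eq(y**3 - 3*y**2 + 2*y, 0)}

# Roots of each univariate equation (at most D per variable).
root_sets = {v: sp.solve(eq, v) for v, eq in univariate_eqs.items()}

# Try every combination of roots: this is the D**n enumeration.
variables = list(root_sets)
consistent = False
for combo in product(*(root_sets[v] for v in variables)):
    assignment = dict(zip(variables, combo))
    if sp.simplify(mixed_eq.lhs.subs(assignment) - mixed_eq.rhs) == 0:
        consistent = True
        print("solution found:", assignment)   # here: {x: -1, y: 2}
        break

print("consistent" if consistent else "inconsistent")
```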

I've spent some time looking for appropriate algorithms, and I'll summarise my current understanding. I don't have a very strong background in this kind of math, so please correct any misconceptions I have here.

  • As per Wikipedia, consistency can be checked by computing a Gröbner basis (a small sketch of this check follows this list).
    It seems like I would need to know the monomial order (here $y>x$) a priori to compute the basis, which I do not.
    Also, one of the papers I read on the topic seems to indicate that this method gets exponentially slower as $n$ increases.

  • There is a numerical method called "homotopy continuation" for finding solutions given a similar set of equations with known solutions. It works by varying a parameter $t$ from $0$ to $1$ and formulating the equations so that at $t=0$ you recover the known system and at $t=1$ you recover the desired one, tracking the solutions with Newton's method. Since the number of solutions is unknown (which is the point, after all), I couldn't see how or whether I could apply it; my attempt failed because no solutions may exist at some intermediate $t$ even though the system is consistent at both $t=0$ and $t=1$.

  • It is possible to state the equations as rational functions in a single variable using a technique called "rational univariate representation". I don't properly understand how to get this representation of the system other than by trial and error, although I do understand how it would allow me to determine the consistency of the system if I did find it.
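
Here is a small sketch, with SymPy, of the Gröbner-basis check from the first bullet applied to the toy system above (the helper name `is_inconsistent` is my own). If I understand correctly, the system has no complex solution exactly when the reduced Gröbner basis is $\{1\}$ (by the Nullstellensatz), and any monomial order gives the same yes/no answer; whether this is tractable for my values of $n$ is the part I'm unsure about.

```python
import sympy as sp

x, y = sp.symbols("x y")


def is_inconsistent(polys, *gens):
    """True iff polys == 0 has no complex solution: by the Nullstellensatz
    this happens exactly when the reduced Groebner basis is [1]."""
    G = sp.groebner(polys, *gens, order="grevlex")
    return G.exprs == [sp.S.One]


system = [x*y + 2,                  # xy = -2
          x**2 - 1,
          y**3 - 3*y**2 + 2*y]

print(is_inconsistent(system, x, y))                   # False: consistent
print(is_inconsistent([x*y + 3] + system[1:], x, y))   # True: inconsistent
```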

I believe that due to the form of the system, I won't ever have infinite solutions, but I may have more than one.

Since $D$ never gets large, I can accept methods that are exponential in $D$, but not in $n$.
If a suitable algorithm to check consistency relies upon the assumption of integer solutions, I would consider that an acceptable drawback.
Can any of the above methods serve, or how else might I go about this?
Thank you very much for your patience in reading this, and for any assistance you can provide.

Best Answer

I'm not sure you can hope for a fast algorithm in general. Suppose the first equation has the form

$$a_1x_1+a_2x_2+\cdots+a_nx_n=0$$

where $a_1,a_2,\ldots,a_n$ are positive integers, and the other $n$ equations are all of the form

$$x_k^2-1=0$$

Then, since each $x_k$ is forced to be $\pm 1$, what you're checking for is whether $S=\{a_1,a_2,\ldots,a_n\}$ can be split into two subsets with equal sums, i.e. the partition problem, which is NP-complete (although the Wikipedia article indicates it's an "easy" hard problem that can often be solved in practice via dynamic programming).
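
To make the reduction concrete, here is a rough sketch of the dynamic-programming check alluded to above (the function name `partition_exists` is just for illustration): a subset-sum table over the total decides, in pseudo-polynomial time, whether the $a_k$ split into two subsets of equal sum.

```python
def partition_exists(a):
    """Return True iff the positive integers in `a` split into two subsets
    with equal sums, i.e. sum(a_k * x_k) == 0 has a solution with every
    x_k in {-1, +1}."""
    total = sum(a)
    if total % 2:                # an odd total can never split evenly
        return False
    target = total // 2
    reachable = {0}              # subset sums reachable so far
    for v in a:
        reachable |= {s + v for s in reachable if s + v <= target}
    return target in reachable


# a1*x1 + a2*x2 + a3*x3 = 0 with each x_k in {-1, 1}
print(partition_exists([3, 1, 2]))   # True:  3 = 1 + 2
print(partition_exists([3, 1, 1]))   # False: total is odd
```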
