The following algorithm suggests itself. Let $\alpha$ be an unknown root of a polynomial $f(x)$ of degree $n$, and $\beta$ be an unknown root of a polynomial $g(x)$ of degree $m$.
Using the equation $f(\alpha)=0$ allows us to rewrite any power $\alpha^k$ as
a linear combination of $1,\alpha,\alpha^2,\ldots,\alpha^{n-1}$. Similarly using the equation $g(\beta)=0$ allows us to rewrite any power $\beta^\ell$ as a linear
combination of $1,\beta,\beta^2,\ldots,\beta^{m-1}$. Putting these two pieces together shows that we can write any monomial $\alpha^k\beta^\ell$ as a linear combination of the $mn$ quantities $\alpha^i\beta^j, 0\le i<n, 0\le j<m.$
Denote $c_k=\alpha^i\beta^j$, where $k=mi+j$ ranges from $0$ to $mn-1$.
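Concretely, rewriting a power $\alpha^k$ this way is nothing but taking the remainder of $x^k$ on division by $f$. A tiny sympy illustration (my own, just to make the step tangible; the polynomial is arbitrary):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Poly(x**2 - 1, x)            # say f(alpha) = 0, i.e. alpha^2 = 1
print(sp.Poly(x**5, x).rem(f))      # Poly(x, x, domain='ZZ'): alpha^5 reduces to alpha
```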
Next let us expand the powers $(\alpha\beta)^t$, $0\le t\le mn$, in terms of the $c_k$'s. Let these expansions be
$$
(\alpha\beta)^t=\sum_{k=0}^{mn-1}a_{kt}c_k.
$$
Here the coefficients $a_{kt}$ are integers (at least when $f$ and $g$ are monic with integer coefficients, as in your example; in general they are rational numbers). We seek integers $x_t$, $t=0,1,\ldots,mn$, such that
$$
\sum_{t=0}^{mn}x_t(\alpha\beta)^t=0.\qquad(*)
$$
Let us substitute our formula for the power $(\alpha\beta)^t$ above. The equation $(*)$ becomes
$$
0=\sum_t\sum_k x_ta_{kt}c_k=\sum_k\left(\sum_t a_{kt}x_t\right)c_k.
$$
This certainly holds if the coefficient of every $c_k$ vanishes, that is, if the equation
$$
\sum_{t=0}^{mn}a_{kt}x_t=0 \qquad(**)
$$
holds for all $k=0,1,2,\ldots,mn-1$. This is a system of $mn$ homogeneous linear equations in the $mn+1$ unknowns $x_t$. Therefore linear algebra guarantees that we succeed: there exists a non-trivial vector
$(x_0,x_1,\ldots,x_{mn})$ of rational numbers that is a solution of $(**)$. Furthermore, by multiplying with the least common multiple of the denominators,
we can make all the $x_t$ integers.
The polynomial
$$
F(x)=\sum_{t=0}^{mn}x_tx^t
$$
is then a polynomial with $\alpha\beta$ as a root.
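If you want to mechanize the whole recipe, here is a rough sympy sketch (my own, not part of the argument above; it assumes $f$ and $g$ are monic with integer coefficients, and `product_poly` is just a name I made up). It exploits the fact that $(\alpha\beta)^t=\alpha^t\beta^t$, so each power can be expanded by reducing the two factors separately:

```python
import sympy as sp
from functools import reduce

x = sp.symbols('x')

def product_poly(f, g):
    """A polynomial F(x) with F(alpha*beta) = 0, built by the recipe above."""
    f, g = sp.Poly(f, x), sp.Poly(g, x)
    n, m = f.degree(), g.degree()
    N = n * m
    A = sp.zeros(N, N + 1)                        # the coefficients a_{kt}, with k = m*i + j
    for t in range(N + 1):
        p = sp.Poly(x**t, x).rem(f)               # alpha^t rewritten with degree < n
        q = sp.Poly(x**t, x).rem(g)               # beta^t  rewritten with degree < m
        pc = ([sp.Integer(0)] * n + p.all_coeffs())[::-1][:n]   # pc[i] = coefficient of alpha^i
        qc = ([sp.Integer(0)] * m + q.all_coeffs())[::-1][:m]   # qc[j] = coefficient of beta^j
        for i in range(n):
            for j in range(m):
                A[m * i + j, t] = pc[i] * qc[j]
    v = A.nullspace()[0]                          # a non-trivial rational solution of (**)
    v = v * reduce(sp.lcm, [c.q for c in v], sp.Integer(1))     # clear the denominators
    return sp.Poly([v[t] for t in range(N, -1, -1)], x)

print(product_poly(x**2 - 1, x**2 + 3*x + 2))     # -> x**4 - 5*x**2 + 4
```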
Let's do your example of $f(x)=x^2-1$ and $g(x)=x^2+3x+2$. Here $f(\alpha)=0$
tells us that $\alpha^2=1$. Similarly $g(\beta)=0$ tells us that $\beta^2=-3\beta-2$. This is all we need from the polynomials. Here $c_0=1$, $c_1=\beta$,
$c_2=\alpha$ and $c_3=\alpha\beta$. Let us write the power $(\alpha\beta)^t$,
$t=0,1,2,3,4$, in terms of the $c_k$'s.
$$
(\alpha\beta)^0=1=c_0.
$$
$$
(\alpha\beta)^1=\alpha\beta=c_3.
$$
$$
(\alpha\beta)^2=\alpha^2\beta^2=1\cdot(-3\beta-2)=-2c_0-3c_1.
$$
$$
(\alpha\beta)^3=\alpha^3\beta^3=\alpha\cdot\beta(-3\beta-2)=\alpha(-3\beta^2-2\beta)=\alpha(9\beta+6-2\beta)=\alpha(7\beta+6)=6c_2+7c_3.
$$
$$
(\alpha\beta)^4=\alpha^4\beta^4=(\beta^2)^2=(-3\beta-2)^2=9\beta^2+12\beta+4=(-27\beta-18)+12\beta+4=-14-15\beta=-14c_0-15c_1.
$$
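Since in this example we actually know the roots ($\alpha=\pm1$ and $\beta\in\{-1,-2\}$, as noted further below), these five expansions are easy to spot-check numerically; a small Python sanity check (my own, not part of the solution):

```python
# Verify (alpha*beta)^t = sum_k a_{kt} c_k for t = 0,...,4 and every root pair.
for alpha in (1, -1):                        # roots of x^2 - 1
    for beta in (-1, -2):                    # roots of x^2 + 3x + 2
        c = [1, beta, alpha, alpha * beta]   # c_0, c_1, c_2, c_3
        rows = [[1, 0, 0, 0],                # (alpha*beta)^0
                [0, 0, 0, 1],                # (alpha*beta)^1
                [-2, -3, 0, 0],              # (alpha*beta)^2
                [0, 0, 6, 7],                # (alpha*beta)^3
                [-14, -15, 0, 0]]            # (alpha*beta)^4
        for t, row in enumerate(rows):
            assert (alpha * beta) ** t == sum(a * ck for a, ck in zip(row, c))
print("all five expansions check out")
```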
We are thus looking for solutions of the homogeneous linear system
$$
\left(\begin{array}{ccccc}
1&0&-2&0&-14\\
0&0&-3&0&-15\\
0&0&0&6&0\\
0&1&0&7&0
\end{array}\right)
\left(\begin{array}{c}x_0\\x_1\\x_2\\x_3\\x_4\end{array}\right)=0.
$$
Let us look for a solution with $x_4=1$ (if the polynomials you started with
are monic, then this always works AFAICT). The third equation tells us that we should have $x_3=0$. The second equation then gives $x_2=-5$. The last equation, together with $x_3=0$, forces $x_1=0$. The first equation
then yields $x_0=4$. This gives us the output
$$
x^4-5x^2+4.
$$
The possibilities for $\alpha$ are $\pm1$ and the possibilities for $\beta$ are $-1$ and $-2$. We can easily check that all four possible products, namely $\pm1$ and $\pm2$, are zeros of this quartic polynomial.
My solution to the linear system was ad hoc, but there are known algorithms for that: Gaussian elimination, an explicit solution in terms of minors, and so on.
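For instance, sympy's built-in `nullspace` finds the same vector directly (a quick illustrative check, not part of the ad hoc solution above):

```python
import sympy as sp

A = sp.Matrix([[1, 0, -2, 0, -14],
               [0, 0, -3, 0, -15],
               [0, 0,  0, 6,   0],
               [0, 1,  0, 7,   0]])
print(A.nullspace())    # [Matrix([[4], [0], [-5], [0], [1]])], i.e. x^4 - 5x^2 + 4 again
```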
If $f$ is a degree-$n$ polynomial, then $f'$ is a degree-$(n-1)$ polynomial and has at most $n-1$ roots. That means the function $f$ can have at most $n-1$ local maxima and minima. Likewise, $f''$ has degree $n-2$, so there can be at most $n-2$ changes in concavity.
This really strongly constrains the ripply behavior that you're talking about.
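To make this concrete, here is a small sympy illustration (the polynomial is just an example I picked): a degree-$5$ polynomial, however ripply, has at most $4$ critical points.

```python
import sympy as sp

x = sp.symbols('x')
f = x**5 - 5*x**3 + 4*x                  # vanishes at 0, +-1, +-2, so it wiggles a lot
crit = sp.solve(sp.diff(f, x), x)        # real roots of f' = 5*x**4 - 15*x**2 + 4
print(len(crit))                         # 4, the maximum possible for degree 5
```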
Note: The question has been changed. The following answers the new question: "Why do local extrema of a differentiable function $f(x)$ occur precisely where $f'(x)=0$?"
Suppose we have a differentiable function $f(x)$, and further suppose that $f(x)$ has a local max (or local min) at $x=c$. Then it must be that the slope of the tangent line to $f(x)$ at $x=c$ is $0$; that is, $f(x)$ has a horizontal tangent at $x=c$.
The derivative $f'(x)$ tells us the slope of the tangent line to $f(x)$ at any $x$-value we want. Therefore, it must be that $f'(c)=0$, since we know that $f(x)$ has a tangent slope of $0$ at $x=c$.
In summary, if $f(x)$ has a local extremum at $x=c$, then $f'(c)=0$.
(However, if $f'(c)=0$, this does not necessarily mean that $f(x)$ has a local extremum at $x=c$; consider, for example, $f(x)=x^3$ at $x=0$.)
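A quick sympy check of that last point (again just an illustration):

```python
import sympy as sp

x = sp.symbols('x')
f = x**3
print(sp.diff(f, x).subs(x, 0))       # 0: the derivative vanishes at x = 0
print(f.subs(x, -1), f.subs(x, 1))    # -1 and 1: x^3 is strictly increasing, so x = 0 is no extremum
```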