I came across a question that involved finding all polynomials (with real coefficients) satisfying the divisibility condition stated below. By inspection, it was easy to see that polynomials like $x$, $x^2$, $x^3$, etc. work, so I tried the more general form $f(x)=ax^n$, and it worked too. I thought it would suffice to prove that $f(x)$ cannot have a non-zero root. I have found a solution that uses the 'assumption-contradiction' method (assuming that $f(x)$ has a non-zero root and deriving a contradiction), but I was wondering: is there a technique that would actually allow us to solve the question (or questions of the same type) without guessing the answer first?
Find all polynomials $f(x)$ such that $f(x^2+x+1)$ divides $f(x^3-1)$.
functional-equations, polynomials
Related Solutions
The following algorithm suggests itself. Let $\alpha$ be an unknown root of a polynomial $f(x)$ of degree $n$, and $\beta$ be an unknown root of a polynomial $g(x)$ of degree $m$.
Using the equation $f(\alpha)=0$ allows us to rewrite any power $\alpha^k$ as a linear combination of $1,\alpha,\alpha^2,\ldots,\alpha^{n-1}$. Similarly using the equation $g(\beta)=0$ allows us to rewrite any power $\beta^\ell$ as a linear combination of $1,\beta,\beta^2,\ldots,\beta^{m-1}$. Putting these two pieces together shows that we can write any monomial $\alpha^k\beta^\ell$ as a linear combination of the $mn$ quantities $\alpha^i\beta^j, 0\le i<n, 0\le j<m.$ Denote $c_k=\alpha^i\beta^j$, where $k=mi+j$ ranges from $0$ to $mn-1$.
Next let us use the expansions of $(\alpha\beta)^t$, $0\le t\le mn$, in terms of the $c_k$'s. Let these be $$ (\alpha\beta)^t=\sum_{k=0}^{mn-1}a_{kt}c_k. $$ Here the coefficients $a_{kt}$ are integers. We seek to find integers $x_t$, $t=0,1,\ldots,mn$, such that $$ \sum_{t=0}^{mn}x_t(\alpha\beta)^t=0.\qquad(*) $$ Let us substitute our formula for the power $(\alpha\beta)^t$ above. The equation $(*)$ becomes $$ 0=\sum_t\sum_kx_ta_{kt}c_k=\sum_k\left(\sum_t a_{kt}x_t\right)c_k. $$ This will be trivially true if the coefficients of all the $c_k$'s vanish, that is, if the equation $$ \sum_{t=0}^{mn}a_{kt}x_t=0 \qquad(**) $$ holds for all $k=0,1,2,\ldots,mn-1$. This is a system of $mn$ homogeneous linear equations in the $mn+1$ unknowns $x_t$. Therefore linear algebra says that we are guaranteed to succeed in the sense that there exists a non-trivial vector $(x_0,x_1,\ldots,x_{mn})$ of rational numbers that is a solution of $(**)$. Furthermore, by multiplying by the least common multiple of the denominators, we can make all the $x_t$'s integers.
The polynomial $$ F(x)=\sum_{t=0}^{mn}x_tx^t $$ is then an answer.
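The algorithm above can be sketched in Python with sympy. The representation choices here (coefficient lists with the constant term first, an $n\times m$ array over the basis $\alpha^i\beta^j$, and the helper names `product_annihilator` and `mul_ab`) are mine, not from the original; the block assumes both input polynomials are monic with integer coefficients.

```python
import math
from sympy import Matrix

def product_annihilator(f, g):
    """f, g: integer coefficient lists of monic polynomials, constant
    term first (e.g. x^2 - 1 -> [-1, 0, 1]).  Returns integer
    coefficients x_0..x_{mn} (constant term first) of a polynomial F
    with F(alpha*beta) = 0 for every root alpha of f, beta of g."""
    n, m = len(f) - 1, len(g) - 1
    # monic reductions: alpha^n = -(f_0 + f_1*a + ... + f_{n-1}*a^{n-1})
    fred = [-c for c in f[:-1]]
    gred = [-c for c in g[:-1]]

    def mul_ab(elem):
        """Multiply an element (an n x m coefficient array over the
        basis alpha^i beta^j) by alpha*beta, reducing overflow powers."""
        out = [[0] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                c = elem[i][j]
                if c == 0:
                    continue
                a_terms = [(i + 1, 1)] if i + 1 < n else list(enumerate(fred))
                b_terms = [(j + 1, 1)] if j + 1 < m else list(enumerate(gred))
                for ii, ca in a_terms:
                    for jj, cb in b_terms:
                        out[ii][jj] += c * ca * cb
        return out

    # column t holds the expansion of (alpha*beta)^t over the c_k
    elem = [[0] * m for _ in range(n)]
    elem[0][0] = 1
    cols = []
    for _ in range(m * n + 1):
        cols.append([elem[i][j] for i in range(n) for j in range(m)])
        elem = mul_ab(elem)
    A = Matrix(cols).T                      # the mn x (mn+1) matrix (a_{kt})
    sol = A.nullspace()[0]                  # non-trivial: mn rows, mn+1 columns
    d = math.lcm(*[int(v.q) for v in sol])  # clear denominators
    return [int(v * d) for v in sol]
```

On the worked example below, `product_annihilator([-1, 0, 1], [2, 3, 1])` reproduces the quartic $x^4-5x^2+4$ up to scaling.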
Let's do your example of $f(x)=x^2-1$ and $g(x)=x^2+3x+2$. Here $f(\alpha)=0$ tells us that $\alpha^2=1$. Similarly $g(\beta)=0$ tells us that $\beta^2=-3\beta-2$. This is all we need from the polynomials. Here $c_0=1$, $c_1=\beta$, $c_2=\alpha$ and $c_3=\alpha\beta$. Let us write the powers $(\alpha\beta)^t$, $t=0,1,2,3,4$, in terms of the $c_k$'s. $$ (\alpha\beta)^0=1=c_0. $$ $$ (\alpha\beta)^1=\alpha\beta=c_3. $$ $$ (\alpha\beta)^2=\alpha^2\beta^2=1\cdot(-3\beta-2)=-2c_0-3c_1. $$ $$ (\alpha\beta)^3=\alpha\beta(-3\beta-2)=\alpha(-3\beta^2-2\beta)=\alpha(9\beta+6-2\beta)=\alpha(7\beta+6)=6c_2+7c_3. $$ $$ (\alpha\beta)^4=(-3\beta-2)^2=9\beta^2+12\beta+4=(-27\beta-18)+12\beta+4=-14-15\beta=-14c_0-15c_1. $$ We are thus looking for solutions of the homogeneous linear system $$ \left(\begin{array}{ccccc} 1&0&-2&0&-14\\ 0&0&-3&0&-15\\ 0&0&0&6&0\\ 0&1&0&7&0 \end{array}\right) \left(\begin{array}{c}x_0\\x_1\\x_2\\x_3\\x_4\end{array}\right)=0. $$ Let us look for a solution with $x_4=1$ (if the polynomials you started with are monic, then this always works AFAICT). The third equation tells us that we should have $x_3=0$. The second equation allows us to solve that $x_2=-5$. The last equation and our knowledge of $x_3$ tell us that $x_1=0$. The first equation then tells us that $x_0=4.$ This gives us the output $$ x^4-5x^2+4. $$ The possibilities for $\alpha$ are $\pm1$ and the possibilities for $\beta$ are $-1$ and $-2$. We can easily check that all four possible products, $\pm1$ and $\pm2$, are zeros of this quartic polynomial.
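As a cross-check, the same quartic drops out of a standard resultant computation, which is the classical way to build a polynomial vanishing at all products $\alpha\beta$: eliminate $y$ from $f(y)$ and $y^{\deg g}\,g(x/y)$. A short sympy sketch (the variable names are mine):

```python
from sympy import symbols, resultant, expand

x, y = symbols('x y')
f = y**2 - 1                     # roots alpha = +1, -1
g = x**2 + 3*x + 2               # roots beta = -1, -2

# substitute z -> x/y in g and clear denominators: y^2 * g(x/y)
h = expand(y**2 * g.subs(x, x / y))   # = x**2 + 3*x*y + 2*y**2
F = resultant(f, h, y)                # vanishes at every product alpha*beta
print(expand(F))                      # x**4 - 5*x**2 + 4
```

This agrees with the linear-algebra derivation above, so the two methods corroborate each other on this example.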
My solution to the linear system was ad hoc, but there are known algorithms for that: Gaussian elimination, an explicit solution in terms of minors (Cramer's rule), ...
- The rational roots (or rational zeros) theorem only applies to polynomials with integer coefficients, although a polynomial with rational coefficients can be turned into one with integer coefficients by multiplying through by the LCM of the coefficients' denominators. The simplest way to see why it is true is to notice that if $\frac{p}{q}$ (in lowest terms) is a root of the polynomial, then $qx-p$ is a factor of the polynomial, and so $q$ must be a factor of the leading coefficient and $p$ must be a factor of the constant term.
- There is a simple proof of Descartes' rule of signs here: https://www.math.hmc.edu/funfacts/ffiles/20001.1.shtml. It uses induction on the degree of the polynomial, and compares the sign changes in a polynomial with the sign changes in its derivative.
- There are explicit formulae for finding the roots of cubic and quartic polynomials (similar to the quadratic formula but more complicated). By the Abel–Ruffini theorem there is no such general formula in radicals for degree five and above, so finding the roots of higher degree polynomials usually requires an iterative algorithm such as Newton's method to approximate them.
- I'm not sure I understand this question. To test whether $-4$ is a root of a polynomial you can just substitute $x=-4$ and see whether $f(-4)=0$. If you know $-4$ is a root of the polynomial, then you can find the quotient after removing that root by dividing the polynomial by $x+4$; you know $x+4$ must be a factor of the polynomial because $f(-4)=0$.
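The facts in the bullets above can be sketched as small Python helpers. All function names here are mine, and `rational_root_candidates` assumes an integer polynomial with a non-zero constant term:

```python
from fractions import Fraction

def rational_root_candidates(coeffs):
    """All p/q permitted by the rational root theorem (coefficients
    constant term first): p divides the constant term, q divides the
    leading coefficient.  Assumes a non-zero constant term."""
    def divisors(k):
        k = abs(k)
        return [d for d in range(1, k + 1) if k % d == 0]
    return sorted({Fraction(s * p, q)
                   for p in divisors(coeffs[0])
                   for q in divisors(coeffs[-1])
                   for s in (1, -1)})

def evaluate(coeffs, v):
    """Horner evaluation, coefficients constant term first."""
    acc = 0
    for c in reversed(coeffs):
        acc = acc * v + c
    return acc

def sign_changes(coeffs):
    """Sign changes in the coefficient sequence (zeros skipped); by
    Descartes' rule, the number of positive real roots equals this
    count or falls short of it by an even number."""
    signs = [c > 0 for c in coeffs if c != 0]
    return sum(a != b for a, b in zip(signs, signs[1:]))

def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Newton's iteration x <- x - f(x)/f'(x), for degrees where no
    formula in radicals exists; may diverge from a bad start."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence")

def deflate(coeffs, r):
    """Synthetic division by (x - r); coefficients here are highest
    degree first.  Returns (quotient, remainder); remainder == f(r)."""
    q = [coeffs[0]]
    for c in coeffs[1:]:
        q.append(c + r * q[-1])
    return q[:-1], q[-1]
```

For instance, deflating $x^3+5x^2+2x-8$ at its root $-4$ returns the quotient $x^2+x-2$ with remainder $0$, matching the "divide by $x+4$" step in the last bullet.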
Best Answer
We will prove that all solutions are of the form $ax^n$, where $n\in \mathbb{N}_0$ and $a\in \mathbb{R}$.
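That these are indeed solutions follows from the factorization $x^3-1=(x-1)(x^2+x+1)$: for $f(x)=ax^n$ we get $f(x^3-1)=(x-1)^n\,f(x^2+x+1)$. A quick sympy sanity check (the choice $n=5$, $a=1$ is arbitrary):

```python
from sympy import symbols, div, expand, simplify

x = symbols('x')
n = 5                            # arbitrary exponent; any n works
fx3 = expand((x**3 - 1)**n)      # f(x^3 - 1) for f(t) = t^n
fq = expand((x**2 + x + 1)**n)   # f(x^2 + x + 1)
q, r = div(fx3, fq, x)           # polynomial division
print(r)                         # 0: the divisibility holds
```

The quotient `q` equals $(x-1)^n$, in line with the factorization above.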
Suppose there exists $a_1\ne 0$ such that $f(a_1)=0$. Then there exists $x_1\in \mathbb{C}$ such that $x_1^2+x_1+1=a_1$ and $|x_1-1|>1$: the two roots $x_1,x_2$ of $x^2+x+1-a_1$ sum to $-1$, so $|x_1-1|+|x_2-1|\ge |(x_1-1)+(x_2-1)|=3$, and hence at least one of them satisfies $|x-1|\ge \tfrac{3}{2}>1$.
But then we have $$f(x_1^3-1) = k(x_1)\,f(x_1^2+x_1+1)=k(x_1)\,f(a_1)=0,$$ so $$a_2= x_1^3-1$$ is another root of $f$, and since $x^3-1=(x-1)(x^2+x+1)$, $$|a_2| = |x_1-1|\,|x_1^2+x_1+1|=|x_1-1|\,|a_1|>|a_1|.$$ In particular $a_2\ne 0$, so we can proceed in the same way to get $a_3, a_4,\ldots$ with strictly increasing moduli, and no two members of this sequence are equal. So $f$ has infinitely many roots, which is a contradiction, since a non-zero polynomial has only finitely many roots.
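The escalation in the proof can be watched numerically. This sketch (the helper name `escalate` is mine) starts from a hypothetical non-zero root, always picks the preimage with the larger $|x-1|$, and shows the moduli growing without bound:

```python
import cmath

def escalate(a1, steps=5):
    """Follow the proof: given a non-zero root a of f, pick the root x
    of x^2 + x + 1 = a with the larger |x - 1| (the shifts x - 1 of the
    two roots sum to -3, so this one has |x - 1| >= 3/2 > 1), then pass
    to a' = x^3 - 1, which satisfies |a'| = |x - 1| * |a| > |a|."""
    seq = [complex(a1)]
    for _ in range(steps):
        a = seq[-1]
        d = cmath.sqrt(4 * a - 3)   # discriminant of x^2 + x + (1 - a)
        x = max((-1 + d) / 2, (-1 - d) / 2, key=lambda z: abs(z - 1))
        seq.append(x**3 - 1)
    return seq

mods = [abs(a) for a in escalate(2)]   # strictly increasing moduli
```

Each step multiplies the modulus by at least $3/2$, which is exactly why the sequence can never revisit a value.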