[Math] Let $K = \mathbb{Q}(\alpha)$ where $\alpha$ is a root of the polynomial $x^3 + 2x + 1$, and let $g(x) = x^3 + x + 1$. Does $g(x)$ have a root in $K$?

abstract-algebra, extension-field, galois-theory, polynomials, splitting-field

Problem: Let $K = \mathbb{Q}(\alpha)$ where $\alpha$ is a root of the polynomial $f(x) = x^3 + 2x + 1$, and let $g(x) = x^3 + x + 1$. Does $g(x)$ have a root in $K$?

My Attempt: I have proved that $f(x)$ and $g(x)$ are irreducible and found that their discriminants are not squares, so the Galois group of each is $S_3$. Since $|S_3| = 6$ and $[\mathbb{Q}(\alpha):\mathbb{Q}] = 3$, the splitting field of $f$ has degree $2$ over $\mathbb{Q}(\alpha)$, and likewise the splitting field of $g$ has degree $2$ over $\mathbb{Q}(\beta)$ for a root $\beta$ of $g$. So each splitting field is obtained by adjoining a square root of something, but I am stuck here. I don't know what to do. The square root of the discriminant does not lie in $\mathbb{Q}(\alpha)$, so we cannot work with it. And the formula for the roots of a cubic polynomial expresses $\alpha$ as a sum of two very complicated cube roots.
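The irreducibility and discriminant claims above are easy to sanity-check with sympy (assuming it is available; the names `f`, `g` below just mirror the problem statement):

```python
from sympy import symbols, Poly, discriminant

x = symbols('x')
f = x**3 + 2*x + 1
g = x**3 + x + 1

# A cubic with integer coefficients is reducible over Q iff it has a
# rational root; here the only candidates are +1 and -1, and neither works.
assert Poly(f, x).is_irreducible
assert Poly(g, x).is_irreducible

# Both discriminants are negative, hence not squares in Q,
# so both Galois groups are S_3.
print(discriminant(f, x), discriminant(g, x))  # -59 -31
```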

Best Answer

The discriminant of $f$ is $-59$ and the discriminant of $g$ is $-31$. These two numbers are square-free, so they are in fact the discriminants of the rings of integers of the corresponding number fields $K$ and $L=\mathbf Q[x]/(g(x))$. Since $\gcd(31,59)=1$, it follows that $g$ has no root in $K$: a root of $g$ in $K$ would give an embedding $L \hookrightarrow K$, hence $L \cong K$ because both fields have degree $3$ over $\mathbf Q$; but that is impossible, because $K$ ramifies only over the prime $59$, while $L$ ramifies only over the prime $31$.
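As a sanity check (independent of the ramification argument), a computer algebra system can factor $g$ directly over $K$. A sketch in sympy, where `rootof(f, 0)` is the unique real root of $f$ (the discriminant $-59$ is negative, so $f$ has one real and two complex roots):

```python
from sympy import symbols, Poly, rootof

x = symbols('x')
f = x**3 + 2*x + 1
g = x**3 + x + 1

a = rootof(f, 0)  # the real root of f; K = Q(a)

# f of course factors over K = Q(a), since a is one of its roots...
assert not Poly(f, x, extension=a).is_irreducible

# ...but g stays irreducible over K, so in particular g has no root in K.
assert Poly(g, x, extension=a).is_irreducible
```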

Suggestion for a more elementary approach: pick a generic element $y=u\alpha^2 + v\alpha + w \in K$, where $u,v,w \in \mathbf Q$. By calculating a linear dependence relation between $y^3, y^2, y$ and $1$, obtain a (not necessarily monic) polynomial $p(x) \in \mathbf Z[u,v,w][x]$ such that $p(y) = 0$. Then calculate the resultant of $p$ and $g$ by writing it as the determinant of the appropriate Sylvester matrix. This will give you a polynomial $R\in \mathbf Z[u,v,w]$. If you can prove that $R(a,b,c) \neq 0$ for all $(a,b,c) \in \mathbf Q^3$, it will follow that $g$ has no root in $K$.
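This computation can be carried out mechanically. Below is a sketch in sympy (an assumption on tooling, not part of the argument), where the cubic satisfied by $y$ is obtained as the characteristic polynomial of multiplication by $y$ on the basis $1, \alpha, \alpha^2$; by Cayley–Hamilton this is equivalent to finding the linear dependence relation, and it yields a monic $p$:

```python
from sympy import symbols, Poly, Matrix, rem, resultant

t, x, u, v, w = symbols('t x u v w')
f = t**3 + 2*t + 1        # minimal polynomial of alpha, written in t
y = u*t**2 + v*t + w      # generic element of K = Q[t]/(f)

# Matrix of multiplication by y on the Q-basis 1, t, t^2 of K:
# column i holds the coordinates of y * t^i reduced modulo f.
cols = []
for i in range(3):
    r = Poly(rem((y * t**i).expand(), f, t), t)
    cols.append([r.coeff_monomial(t**j) for j in range(3)])
M = Matrix(cols).T

# The characteristic polynomial p(x) of M annihilates y (Cayley-Hamilton).
p = M.charpoly(x).as_expr()

# Eliminate x: R(u,v,w) = 0 iff p and g have a common root,
# i.e. iff y = u*alpha^2 + v*alpha + w is a root of g.
g = x**3 + x + 1
R = resultant(p, g, x)
```

If $R$ had a rational zero $(a,b,c)$, the corresponding element $y$ of $K$ would be a common root of $p$ and $g$; so showing that $R$ never vanishes on $\mathbf Q^3$ shows that $g$ has no root in $K$.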
