[Math] An algebraic number is not a root of unity

galois-theory, nt.number-theory, polynomials, roots-of-unity

This problem is related to my study of the Burau representation of the braid group $B_3$: I was trying to show that certain "congruence subgroups" are of infinite index.

There is an approach that boils it down to the following question:

Let $\xi$ be a primitive root of unity of order $2n>12$ (just in case). Then the roots $\lambda$ of $\chi(\lambda):=\lambda^2+(\xi^2-\xi+1)\lambda+\xi^2$ are not roots of unity.

I was particularly interested in the cases $n=7$ and $n=9$. In both cases $\deg\Phi_{2n}=6$ (here $\Phi_k$ denotes the $k$-th cyclotomic polynomial) and, hence, $\mathbb{Q}[\lambda]$ has degree at most $12$ over $\mathbb{Q}$. There are finitely many (although quite a few) cyclotomic polynomials $\Phi_k$ with $\deg\Phi_k\le12$. For each such polynomial, one computes the resultant $R(\xi)$ of $\Phi_k$ and $\chi$ with respect to $\lambda$ and checks that $R(\xi)\not\equiv0\pmod{\Phi_{2n}(\xi)}$.
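For concreteness, here is a minimal SymPy sketch of this check (my own illustration, not from the original post), run for $n=7$. If $\lambda$ were a root of unity of order $k$, then $\varphi(k)\le12$, which forces $k\le42$, so only finitely many $\Phi_k$ need to be tested.

```python
# Resultant check for n = 7 (a sketch; assumes SymPy is available).
from sympy import symbols, resultant, cyclotomic_poly, totient, rem, Poly

lam, x = symbols('lambda x')

n = 7
Phi_2n = Poly(cyclotomic_poly(2 * n, x), x)   # minimal polynomial of xi
chi = lam**2 + (x**2 - x + 1) * lam + x**2    # chi(lambda), with xi modelled as x

# A root of unity of order k lying in a field of degree <= 12 over Q
# must satisfy phi(k) <= 12, hence k <= 42.
for k in range(1, 43):
    if totient(k) > 12:
        continue
    R = resultant(cyclotomic_poly(k, lam), chi, lam)  # a polynomial in x
    # R(xi) = 0 iff Phi_2n divides R, since Phi_2n is irreducible over Q.
    if rem(Poly(R, x), Phi_2n).is_zero:
        print(f"order {k} is not excluded by the resultant test")
```

If nothing is printed, no root of $\chi$ is a root of unity for this $n$.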

Clearly, this approach should work for any given $n$, but I have no idea how this can be done "in general". So, here is the question:

Is there a smarter way to prove that an algebraic number is not a root of unity?

Best Answer

Assume that one of the roots of your equation is a root of unity; since the product of the roots is $\xi^2$, the other root is then a root of unity as well, so the roots may be written as $\zeta$ and $\xi^2/\zeta$. By Vieta's formulas, $\zeta+\xi^2/\zeta=-(\xi^2-\xi+1)$; equivalently, the sum of five roots of unity $\zeta+\xi^2/\zeta+\xi^2-\xi+1$ vanishes (note that $-\xi$ is itself a root of unity). All such relations are classified in Theorem 6 of the paper by Conway and Jones ("Trigonometric diophantine equations", Acta Arith. 30 (1976)). This implies the result.
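As a quick numerical sanity check (my addition, not part of the answer), one can verify for $n=7$ that neither root of $\chi$ is a root of unity of any admissible order, using the same bound $k\le42$ as above:

```python
# Numerical sanity check for n = 7 (a sketch, not a proof).
import cmath

n = 7
xi = cmath.exp(1j * cmath.pi / n)        # a primitive 2n-th root of unity
b = xi**2 - xi + 1
disc = cmath.sqrt(b * b - 4 * xi**2)
for lam in [(-b + disc) / 2, (-b - disc) / 2]:
    # A root of unity in a field of degree <= 12 has order k <= 42.
    assert all(abs(lam**k - 1) > 1e-6 for k in range(1, 43))
print("no root of chi is a root of unity of order <= 42 (n = 7)")
```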
