Algebra Precalculus – Sum of Sixth Powers of Roots of x^3-x-1=0

algebra-precalculus, polynomials, proof-verification

Question:
Find the sum of the sixth powers of the roots of the equation $x^3-x-1=0$.

My First approach:

Let $S_i$ denote the sum of the $i^{\text{th}}$ powers of the roots of the given equation.

Now, multiplying the given equation by $x^3$, substituting each of the three roots, and adding the three resulting equations,

we get $S_6=S_4+S_3$.

Repeating the same procedure, I obtain

$S_4=S_2+S_1$ , $S_3=S_1+1$

hence, $S_6=S_2+2S_1+1=2+0+1=3$

My Second Approach:

Let $a,b,c$ be the roots of $f(x)=x^3-x-1$

Clearly, $f'(x)/f(x)=1/(x-a)+1/(x-b)+1/(x-c)$

=$\sum(1/x+a/x^2+a^2/x^3+a^3/x^4+\cdots)=3/x+S_1/x^2+S_2/x^3+\cdots$

Expanding $f'(x)/f(x)=(3x^2-1)/(x^3-x-1)$ in powers of $1/x$ and reading off the coefficient of $1/x^7$, we get $S_6=5$.
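As a sanity check on which value is correct, one can compute the three roots numerically and sum their sixth powers. Below is a rough Python sketch (not part of either derivation above; the bisection setup and iteration count are my own choices). It finds the real root of $x^3-x-1$ by bisection, then recovers the complex pair from the quadratic cofactor $x^2+rx+(r^2-1)$:

```python
import cmath

# f(x) = x^3 - x - 1; its real root lies between 1 and 2.
f = lambda x: x**3 - x - 1

lo, hi = 1.0, 2.0
for _ in range(100):          # bisection to machine precision
    mid = (lo + hi) / 2
    if f(mid) > 0:
        hi = mid
    else:
        lo = mid
r = (lo + hi) / 2             # real root, r ≈ 1.3247

# Dividing f by (x - r) leaves x^2 + r*x + (r^2 - 1);
# its (complex-conjugate) roots are the other two roots of f.
disc = cmath.sqrt(r**2 - 4 * (r**2 - 1))
roots = [r, (-r + disc) / 2, (-r - disc) / 2]

S6 = sum(z**6 for z in roots)
print(S6.real)                # ≈ 5
```

The real part comes out very close to $5$, agreeing with the second approach.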

$\rule{200px}{0.5px}$

I am getting the wrong answer through my first approach. Please point out my mistake or post a better solution.

Thank You

Best Answer

$\textbf{Warning}$: The following approach is not the most efficient way of going about this—indeed, I probably wouldn't do this myself if I actually needed the answer. However, it has pedagogical value and presents a rare opportunity for me to share a cool bit of math. Without further ado...


Let $\{\alpha_k\}_{k=1}^3$ be the roots of $f(x) = x^3 - x - 1$. Notice that $P(\alpha_1, \alpha_2, \alpha_3) = \alpha_1^6 + \alpha_2^6 + \alpha_3^6$ is a symmetric polynomial* in $3$ variables evaluated at the $\alpha_k$'s. It is a theorem** that any symmetric polynomial can be written as a polynomial in the elementary symmetric polynomials.

Let $\big\{e_k(x_1,x_2,x_3) \big\}_{k=1}^3$ be the collection of elementary symmetric polynomials in three variables. The above is saying that there exists a polynomial $Q \in \mathbb{Z}[x_1, x_2, x_3]$ such that $P(x_1, x_2, x_3) = Q\Big(e_1(x_1,x_2,x_3), \ e_2(x_1,x_2,x_3), \ e_3(x_1,x_2,x_3)\Big)$.

Now this looks like a mess, but the elementary symmetric polynomials—evaluated at the $\alpha_k$'s—appear$^\dagger$ as the (signed) coefficients of $f$! Thus, if we can find the polynomial $Q$ above, then we can easily find the desired value $P(\alpha_1, \alpha_2, \alpha_3)$ because the right-hand side of this evaluation will consist merely of the addition and multiplication of simple, already-known integers (see how robjohn got a clean natural number in his post?). To be explicit, we'll have: $$\begin{align} & e_1(\alpha_1, \alpha_2, \alpha_3) \ = \ 0 \qquad \ \ \text{ (the } x^2 \text{ coefficient)}\\ & e_2(\alpha_1, \alpha_2, \alpha_3) \ = \ -1 \\ & e_3(\alpha_1, \alpha_2, \alpha_3) \ = \ 1 \end{align}$$
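To make the sign bookkeeping concrete, here is a tiny Python sketch (the variable names are mine) of reading those values off the coefficients of $f(x) = x^3 - x - 1$:

```python
# Write the monic cubic as x^3 + c2*x^2 + c1*x + c0.
# For f(x) = x^3 - x - 1 the coefficients are:
c2, c1, c0 = 0, -1, -1

# Vieta's relations give the elementary symmetric values at the roots,
# with alternating signs:
e1 = -c2   # sum of the roots           = 0
e2 = c1    # sum of pairwise products   = -1
e3 = -c0   # product of the roots       = 1
print(e1, e2, e3)  # 0 -1 1
```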

Finding $Q$ may or may not be computationally difficult! It appears that Newton did find a recursion which writes symmetric polynomials of the form $\displaystyle \sum_{k=1}^n x_k^j$ for any $n, j \in \mathbb{N}$ in terms of elementary symmetric polynomials (of course, in this scenario, we have $n=3$ and $j=6$). See here for details.


$$\underline{\textbf{Footnotes}} \\[0.5em]$$

$\text{*}$The adjective "symmetric" here alludes to the fact that these polynomials do not change if you swap around ("permute") the variables. Indeed, $x_1^6\!+\! x_2^6 \!+\! x_3^6 = x_3^6 \!+ \!x_1^6 \!+ \!x_2^6$, etc. On the other hand, $f(x_1, x_2) = x_1x_2^2$ would not be a symmetric polynomial as $x_1x_2^2 \neq x_2x_1^2$.


$\text{**}$Called the "fundamental theorem of symmetric polynomials", this was originally discovered by Isaac Newton. See this thread for further discussion and links to several proofs.


$\mathbf{^\dagger}$For any $n \in \mathbb{N}$, a collection of $n\!+\!1$ elementary symmetric polynomials in $n$ variables is defined. To determine this collection, first start with a generic, completely-factored, $n^\text{th}$-degree monic polynomial $\displaystyle p(x) = \prod_{k=1}^n (x - \alpha_k)$. After expanding and grouping terms, we'll get $p(x) = e_0x^n - e_1x^{n-1} + e_2x^{n-2} - \cdots + (-1)^ne_n$. Each coefficient $e_k$ is the $k^\text{th}$ elementary symmetric polynomial in $n$ variables—a polynomial in $\mathbb{Z}[x_1, x_2, \cdots, x_n]$—which has been evaluated at the roots $x_k = \alpha_k$. For example, let's look at the $n=2$ case: $$p(x) \ = \ (x-\alpha_1)(x-\alpha_2) \ = \ x^2 - (\alpha_1 + \alpha_2)x + \alpha_1\alpha_2$$

This reveals that, in two variables, $e_0(x_1, x_2) = 1$ and $e_1(x_1, x_2) = x_1 + x_2$ and $e_2(x_1,x_2) = x_1x_2$.

In general, we'll have: $$\begin{align} & e_0(x_1, \cdots, x_n) \ = \ 1 \\[0.5em] & e_1(x_1, \cdots, x_n) \ = \ \sum_{k=1}^n x_k \\[0.5em] & e_2(x_1, \cdots, x_n) \ = \ \sum_{1 \leq k < j \leq n} x_kx_j \\[0.5em] & e_3(x_1, \cdots, x_n) \ = \ \sum_{1 \leq k < j < m \leq n} x_kx_jx_m \\[0.5em] & \ \ \vdots \qquad \text{ this pattern continues until: } \\[0.5em] & e_n(x_1, \cdots, x_n) \ = \ \prod_{k=1}^n x_k \end{align}$$
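These definitions translate directly into code; a short Python sketch (the helper name is my own) sums the products over all $k$-element subsets of the variables:

```python
from itertools import combinations
from math import prod

def elem_sym(xs, k):
    """e_k(xs): sum, over all k-element subsets of xs, of the subset product."""
    if k == 0:
        return 1  # e_0 is the empty product
    return sum(prod(c) for c in combinations(xs, k))
```

For example, with roots $1, 2, 3$ we get $e_0,\dots,e_3 = 1, 6, 11, 6$, which are (up to sign) the coefficients of $(x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6$.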
