Rational expressibility of polynomials that are symmetric under all permutations of the variables fixing one variable

abstract-algebra, galois-theory, group-theory, polynomials, symmetric-polynomials

I am trying to prove the following statement from early group theory / classical Galois theory:

Let $\sigma_1,\ldots,\sigma_n$ be the elementary symmetric polynomials
of $x_1,\ldots,x_n$.

If $f$ is a rational function of $x_1,\ldots,x_n$ that is symmetric
under all permutations of the $x_i$’s that fix $x_1$, then it is
expressible as a rational function of $\sigma_1,\ldots,\sigma_n$ and
$x_1$.
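For example (a small case of my own, just to illustrate the statement): for $n=3$ the polynomial $x_2x_3$ is symmetric under the only nontrivial permutation fixing $x_1$, namely swapping $x_2$ and $x_3$, and indeed

$$x_2x_3=\frac{\sigma_3}{x_1}=\sigma_2-\sigma_1x_1+x_1^2,$$

so it is expressible rationally (here even polynomially) in $x_1$ and the $\sigma_i$.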

The theorem is stated as Theorem 7 in the paper The fundamental theorem on symmetric polynomials: History's first whiff of Galois theory, but no proof is given there.

In the paper Galois for 21st-Century Readers, footnote 4, I found the hint that every symmetric polynomial in the roots $x_2,\ldots,x_n$ can be expressed rationally in terms of $x_1$ and the coefficients of the polynomial with roots $x_1,\ldots,x_n$ (i.e. the $\sigma_i$).

This is explained by showing that the elementary symmetric polynomials of $x_2,\ldots,x_n$ can be written as polynomials in $x_1$ and the $\sigma_i$. However, I do not completely understand why this is the case.
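In the simplest case $n=2$ I can see this directly: the only elementary symmetric polynomial of the single remaining root $x_2$ is $x_2$ itself, and

$$x_2=\sigma_1-x_1,$$

but I do not see how to argue it in general.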

I get that if you multiply out the left side of the equation
$$(x-x_2)\cdots(x-x_n)=\frac{f(x)}{x-x_1},$$
where $f(x)=(x-x_1)(x-x_2)\cdots(x-x_n)=x^n-\sigma_1x^{n-1}+\sigma_2x^{n-2}-\cdots+(-1)^n\sigma_n$, you get a polynomial whose coefficients are (up to sign) exactly the elementary symmetric polynomials of $x_2,\ldots,x_n$. But I do not understand why the coefficients of the polynomial on the right side can be expressed as polynomials in $x_1$ and the $\sigma_i$.
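To make the question concrete (a small case of my own, not taken from either paper): for $n=3$ the equation reads

$$(x-x_2)(x-x_3)=\frac{x^3-\sigma_1x^2+\sigma_2x-\sigma_3}{x-x_1},$$

and I am asking why the quotient on the right has coefficients that are polynomials in $x_1$, $\sigma_1$, $\sigma_2$, $\sigma_3$.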

What does the polynomial remainder theorem have to do with this? Yes, $(x-x_1)$ divides $f(x)$ without remainder, since $f(x_1)=0$. So what?

Best Answer

Let $\tau_1, \tau_2,\dots,\tau_{n-1}$ be the elementary symmetric polynomials in $x_2,x_3,\dots,x_n$. It is straightforward to show that $\sigma_k=\tau_k + x_1 \cdot \tau_{k-1}$ for $k=1,2,\dots,n-1$ (with $\tau_0=1\,$).
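Spelled out, the identity comes from comparing coefficients in

$$\prod_{i=1}^{n}(x-x_i)=(x-x_1)\prod_{i=2}^{n}(x-x_i).$$

The left side is $\sum_{k=0}^{n}(-1)^k\sigma_k\,x^{n-k}$ (with $\sigma_0=1$), while on the right the coefficient of $x^{n-k}$ is $(-1)^k\tau_k+(-x_1)(-1)^{k-1}\tau_{k-1}=(-1)^k\left(\tau_k+x_1\tau_{k-1}\right)$; equating the two gives $\sigma_k=\tau_k+x_1\tau_{k-1}$.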

It follows, by induction on $k$, that each $\tau_k$ can be written as a polynomial $t_k$ in $x_1$ and $\sigma_1, \sigma_2,\dots,\sigma_{k}$:

$$ \begin{align} \tau_1 &= \sigma_1 - x_1\,\tau_0 = \sigma_1 - x_1 = t_1(x_1, \sigma_1) \\ \tau_2 &= \sigma_2 - x_1\,\tau_1= \sigma_2 - x_1\,t_1(x_1,\sigma_1) = t_2(x_1, \sigma_1, \sigma_2) \\ \dots \\ \tau_{n-1} &= \sigma_{n-1} - x_1 \, \tau_{n-2} = \sigma_{n-1} - x_1 \, t_{n-2}(x_1, \sigma_1,\dots,\sigma_{n-2}) = t_{n-1}(x_1, \sigma_1,\dots,\sigma_{n-1}) \end{align} $$
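For example, for $n=3$ this gives

$$\tau_1=\sigma_1-x_1,\qquad \tau_2=\sigma_2-x_1\tau_1=\sigma_2-\sigma_1x_1+x_1^2,$$

and one can check directly that $\sigma_2-\sigma_1x_1+x_1^2=(x_1x_2+x_1x_3+x_2x_3)-(x_1+x_2+x_3)x_1+x_1^2=x_2x_3$, as it should be.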

Now let $f$ be a polynomial in $x_1,\ldots,x_n$ that is symmetric under all permutations of the $x_i$'s that fix $x_1$. Collect $f$ by powers of $x_1$: each coefficient is a polynomial in $x_2,\ldots,x_n$ that is invariant under every permutation of $x_2,\ldots,x_n$, so by the fundamental theorem on symmetric polynomials it is a polynomial $p_j$ in $\tau_1,\ldots,\tau_{n-1}$. Writing $d$ for the degree of $f$ in $x_1$, we can therefore write:

$$ \begin{align} f &= \sum_{j=0}^{d} \;p_j(\tau_1, \tau_2, \dots,\tau_{n-1})\,x_1^j \\ &= \sum_{j=0}^{d} \;p_j\left(t_1(x_1, \sigma_1), t_2(x_1, \sigma_1,\sigma_2), \dots,t_{n-1}(x_1, \sigma_1,\sigma_2,\dots,\sigma_{n-1})\right)\,x_1^j \\ &= p(x_1, \sigma_1,\sigma_2,\dots,\sigma_{n-1}) \end{align} $$
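For instance, for $n=3$ the polynomial $f=x_1^2(x_2+x_3)+x_2x_3$ is symmetric under swapping $x_2$ and $x_3$, and collecting by powers of $x_1$ gives

$$f=\tau_2+\tau_1\,x_1^2=(\sigma_2-\sigma_1x_1+x_1^2)+(\sigma_1-x_1)\,x_1^2,$$

a polynomial in $x_1$, $\sigma_1$, $\sigma_2$.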

This proves the proposition for polynomials; the extension to rational functions follows by symmetrizing the denominator, as spelled out below.
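Explicitly (with notation of my own): let $H$ be the group of permutations of $x_1,\ldots,x_n$ that fix $x_1$, and let $f=P/Q$ be a rational function that is invariant under $H$. Then

$$f=\frac{P}{Q}=\frac{P\cdot\prod_{\pi\in H,\;\pi\neq\mathrm{id}}\pi(Q)}{\prod_{\pi\in H}\pi(Q)},$$

where $\pi(Q)$ denotes $Q$ with its variables permuted by $\pi$. The denominator is $H$-invariant by construction, and the numerator is $H$-invariant too, since it equals $f$ times the denominator. Both are polynomials, so the polynomial case applies to each of them, and $f$ is a rational function of $x_1$ and $\sigma_1,\dots,\sigma_{n-1}$.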