[Math] Express symmetric polynomial $\prod_{i < j} (X_i+X_j)$ in terms of elementary symmetric functions

abstract-algebra, polynomials, symmetric-polynomials

Exercise: Define a polynomial $\Sigma(X_1,\ldots,X_n)$ as
\begin{align*}
\Sigma(X_1,\ldots,X_n) = \prod_{i < j} (X_i+X_j)
\end{align*}
This is clearly a symmetric polynomial. I want to express $\Sigma$ as a polynomial in the elementary symmetric functions for $n=2,3,4$. The case $n=2$ is trivial: $\Sigma(X_1,X_2)= \sigma_1 = X_1+X_2$. The case $n=3$ is easy. I write $\Sigma$ as follows:
\begin{align*}
\Sigma(X_1,X_2,X_3) = (\sigma_1 - X_1)(\sigma_1 - X_2)(\sigma_1 - X_3),
\end{align*}
then apply Vieta's formulas formally, obtaining
\begin{align*}
\Sigma(X_1,X_2,X_3) = \sigma_1^3 - \sigma_1 \cdot \sigma_1^2 + \sigma_2 \sigma_1 - \sigma_3 = \sigma_2 \sigma_1 - \sigma_3.
\end{align*}
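(As a sanity check, a computer algebra system confirms this; below is a minimal SymPy sketch, not the kind of "trick" I am asking about. `symmetrize` rewrites a symmetric polynomial in terms of the elementary symmetric polynomials, which it calls `s1, s2, s3`.)

```python
# Sanity check of the n = 3 result with SymPy (a sketch, not the trick I'm after).
from sympy import symbols, expand
from sympy.polys.polyfuncs import symmetrize

X1, X2, X3 = symbols('X1 X2 X3')
Sigma3 = expand((X1 + X2) * (X1 + X3) * (X2 + X3))

# With formal=True, symmetrize returns (symmetric part, remainder, definitions of the s_i);
# the remainder is 0 here because Sigma3 is symmetric.
sym_part, remainder, defs = symmetrize(Sigma3, X1, X2, X3, formal=True)
print(sym_part, remainder)   # expect: s1*s2 - s3  and  0
```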
The case $n=4$ seems harder to me, and I'm unable to work it out. I hope there is some trick to avoid unnecessary computations. My attempt is to write down $\Sigma$ completely:
\begin{align*}
\Sigma = (X_1+X_2)(X_3+X_4)(X_1+X_4)(X_2+X_3)(X_2+X_4)(X_1+X_3)
\end{align*}
and then multiply factors in groups of two, obtaining:
\begin{align*}
\Sigma = (\sigma_2 - X_1X_2 - X_3X_4)(\sigma_2 - X_1X_4 - X_2X_3)(\sigma_2 - X_2X_4 - X_1X_3),
\end{align*}
but it doesn't seem to help much. How can I solve the exercise?
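(For reference, a CAS can grind out the $n=4$ expression by brute force, e.g. with the SymPy sketch below, but I am hoping for a less computational argument.)

```python
# Brute-force sketch for n = 4: expand the product and let SymPy's symmetrize
# rewrite it in terms of the elementary symmetric polynomials s1, s2, s3, s4.
from itertools import combinations
from sympy import symbols, expand, Mul
from sympy.polys.polyfuncs import symmetrize

X = symbols('X1:5')  # (X1, X2, X3, X4)
Sigma4 = expand(Mul(*[Xi + Xj for Xi, Xj in combinations(X, 2)]))

sym_part, remainder, defs = symmetrize(Sigma4, *X, formal=True)
print(sym_part)      # a polynomial in s1, s2, s3, s4
print(remainder)     # expect 0, since Sigma4 is symmetric
```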

Best Answer

Here is an answer for any number $n$ of variables.

For any partition $\lambda$, we let $s_{\lambda}$ denote the Schur function corresponding to $\lambda$. Let $\delta$ be the partition $\left( n-1,n-2,\ldots,1\right) $ of $n\left( n-1\right) /2$. Then, \begin{equation} \prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) = s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) . \label{1} \tag{1} \end{equation}

This well-known formula (which, I think, is due to Jacobi) can easily be derived from the definition of Schur polynomials through alternants. Let me explain:

For every $n$-tuple $\alpha=\left( \alpha_{1},\alpha_{2},\ldots,\alpha _{n}\right) \in\mathbb{N}^{n}$ of nonnegative integers, we define the alternant $a_{\alpha}$ to be the polynomial \begin{equation} \sum_{\sigma\in S_{n}}\left( -1\right) ^{\sigma}x_{1}^{\alpha_{\sigma\left( 1\right) }}x_{2}^{\alpha_{\sigma\left( 2\right) }}\cdots x_{n}^{\alpha_{\sigma\left( n\right) }} = \det\left( \left( x_i^{\alpha_j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \end{equation} in $\mathbb{Z}\left[ x_{1},x_{2},\ldots,x_{n}\right] $. If $\alpha=\left( \alpha_{1},\alpha_{2},\ldots,\alpha_{n}\right) $ and $\beta=\left( \beta _{1},\beta_{2},\ldots,\beta_{n}\right) $ are two $n$-tuples in $\mathbb{N} ^{n}$, then the $n$-tuple $\alpha+\beta$ is defined to be $\left( \alpha _{1}+\beta_{1},\alpha_{2}+\beta_{2},\ldots,\alpha_{n}+\beta_{n}\right) $.
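(If you want to experiment with this definition on a computer, here is a minimal SymPy sketch; the name `alternant` is mine, not a library built-in.)

```python
# The alternant a_alpha as the determinant det((x_i ** alpha_j)), following the
# definition above.  `alternant` is an ad-hoc helper, not a SymPy built-in.
from sympy import symbols, Matrix

def alternant(alpha, xs):
    n = len(xs)
    assert len(alpha) == n
    return Matrix(n, n, lambda i, j: xs[i] ** alpha[j]).det()

x1, x2, x3 = symbols('x1 x2 x3')
print(alternant((2, 1, 0), (x1, x2, x3)))  # a_delta for n = 3 (a Vandermonde determinant)
```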

Any partition $\lambda=\left( \lambda_{1},\lambda_{2},\ldots,\lambda _{k}\right) $ of length $k\leq n$ will be identified with the $n$-tuple $\left( \lambda_{1},\lambda_{2},\ldots,\lambda_{k},\underbrace{0,0,\ldots ,0}_{n-k\text{ times}}\right) \in\mathbb{N}^{n}$. Notice that $\delta$ is thus identified with the $n$-tuple $\left( n-1,n-2,\ldots,1,0\right) $.

Then, the "alternant formula" for the Schur polynomial $s_{\lambda}\left( x_{1},x_{2},\ldots,x_{n}\right) $ says that \begin{equation} s_{\lambda}\left( x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\lambda+\delta }}{a_{\delta}} \label{2} \tag{2} \end{equation} whenever $\lambda$ is a partition of length $\leq n$. This is the historically first definition of Schur polynomials, long before the modern definitions via Young tableaux or the Jacobi-Trudi formulas were discovered; its main disadvantages are that it only defines $s_{\lambda}$ in finitely many variables and that it requires proof that $\dfrac{a_{\lambda+\delta} }{a_{\delta}}$ is indeed a polynomial. But this is fine for us. (For a proof of the fact that this definition of $s_\lambda$ is equivalent to the modern combinatorial definition, you can consult Corollary 2.6.6 in Darij Grinberg and Victor Reiner, Hopf algebras in combinatorics, arXiv:1409.8356v5. In the same chapter you will find an exercise proving the Jacobi-Trudi identity, which is yet another popular definition of the Schur polynomials.)
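For instance, take $n=3$ and $\lambda=\left( 1\right) $, identified with $\left( 1,0,0\right) \in\mathbb{N}^{3}$. Then $\lambda+\delta=\left( 3,1,0\right) $, and \eqref{2} says that \begin{equation} s_{\left( 1\right) }\left( x_{1},x_{2},x_{3}\right) =\dfrac{a_{\left( 3,1,0\right) }}{a_{\left( 2,1,0\right) }} , \end{equation} and this quotient does indeed simplify to $x_{1}+x_{2}+x_{3}$ (as it should, since $s_{\left( 1\right) }=e_{1}$).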

We can now prove \eqref{1}. Indeed, $\delta=\left( n-1,n-2,\ldots,0\right) \in\mathbb{N}^{n}$, so that the definition of $a_{\delta}$ yields \begin{equation} a_{\delta}=\det\left( \left( x_{i}^{n-j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\prod_{1\leq i<j\leq n}\left( x_{i}-x_{j}\right) \label{3} \tag{3} \end{equation} (by the Vandermonde determinant formula). But $\delta+\delta=\left( 2\left( n-1\right) ,2\left( n-2\right) ,\ldots,2\cdot0\right) $, so that \begin{align} a_{\delta+\delta} & =\det\left( \left( x_{i}^{2\left( n-j\right) }\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\det\left( \left( \left( x_{i}^{2}\right) ^{n-j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \nonumber\\ & =\prod_{1\leq i<j\leq n}\left( x_{i}^{2}-x_{j}^{2}\right) \label{4} \tag{4} \end{align} (again by the Vandermonde determinant formula). Dividing \eqref{4} by \eqref{3}, we obtain \begin{equation} \dfrac{a_{\delta+\delta}}{a_{\delta}}=\dfrac{\prod_{1\leq i<j\leq n}\left( x_{i}^{2}-x_{j}^{2}\right) }{\prod_{1\leq i<j\leq n}\left( x_{i}-x_{j}\right) }=\prod_{1\leq i<j\leq n}\underbrace{\dfrac{x_{i}^{2}-x_{j}^{2}}{x_{i}-x_{j}}}_{=x_{i}+x_{j}}=\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) . \end{equation} But applying \eqref{2} to $\lambda=\delta$, we obtain $s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\delta+\delta}}{a_{\delta}}=\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $. Thus, \eqref{1} is proven.
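(Here is a short SymPy sketch checking this chain of equalities for $n=4$; `alt` is my ad-hoc name for the alternant, redefined here so the snippet runs on its own.)

```python
# Check (for n = 4) that a_{delta+delta} / a_delta equals prod_{i<j} (x_i + x_j),
# i.e. the chain (3), (4) above.  `alt` is an ad-hoc alternant helper.
from itertools import combinations
from sympy import symbols, Matrix, cancel, expand, Mul

n = 4
xs = symbols(f'x1:{n + 1}')
delta = tuple(range(n - 1, -1, -1))          # (n-1, n-2, ..., 1, 0)
two_delta = tuple(2 * d for d in delta)      # delta + delta

def alt(alpha):
    return Matrix(n, n, lambda i, j: xs[i] ** alpha[j]).det()

lhs = cancel(alt(two_delta) / alt(delta))    # a_{delta+delta} / a_delta
rhs = Mul(*[xi + xj for xi, xj in combinations(xs, 2)])
print(expand(lhs - rhs))                     # expect: 0
```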

Finally, we can transform \eqref{1} into an explicit (if you allow determinants) formula for $\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $ in terms of the elementary symmetric polynomials. To that aim, we let $e_{k}\left( \overrightarrow{x}\right) $ denote the $k$-th elementary symmetric polynomial in $x_{1},x_{2},\ldots,x_{n}$. Notice that $e_{k}\left( \overrightarrow{x}\right) =0$ whenever $k<0$ and also whenever $k>n$.
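(In code, this convention might look as follows; a minimal SymPy sketch, where `e` is an ad-hoc helper and not a built-in.)

```python
# k-th elementary symmetric polynomial in the variables xs, with the convention
# e_k = 0 for k < 0 and for k > n.  (`e` is an ad-hoc helper, not a SymPy built-in.)
from itertools import combinations
from sympy import symbols, Add, Mul, Integer

def e(k, xs):
    if k < 0 or k > len(xs):
        return Integer(0)
    return Add(*[Mul(*c) for c in combinations(xs, k)])  # e_0 comes out as 1 (empty product)

x = symbols('x1:5')
print(e(2, x))   # x1*x2 + x1*x3 + x1*x4 + x2*x3 + x2*x4 + x3*x4
print(e(6, x))   # 0
```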

Recall that one of the two Jacobi-Trudi formulas says that if $\lambda$ is a partition of length $\leq n$, then \begin{equation} s_{\lambda^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left( \left( e_{\lambda_{i}-i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) , \end{equation} where $\lambda^{t}$ denotes the conjugate partition of $\lambda$ (this is the partition whose $i$-th entry is the number of entries of $\lambda$ that are $\geq i$, for each $i\in\left\{ 1,2,3,\ldots\right\} $). Applying this to $\lambda=\delta$, we obtain \begin{align*} s_{\delta^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) & =\det\left( \left( e_{\left( n-i\right) -i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \\ & =\det\left( \left( e_{n-2i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \\ & =\det\left( \begin{array} [c]{cccc} e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x} \right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\ e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x} \right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\ \vdots & \vdots & \ddots & \vdots\\ e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left( \overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right) \end{array} \right) . \end{align*} Since $\delta^{t}=\delta$ (this is easy to check), this simplifies to \begin{equation} s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left( \begin{array} [c]{cccc} e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x} \right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\ e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x} \right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\ \vdots & \vdots & \ddots & \vdots\\ e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left( \overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right) \end{array} \right) . \end{equation} Thus, \eqref{1} rewrites as \begin{equation} \prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) =\det\left( \begin{array} [c]{cccc} e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x} \right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\ e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x} \right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\ \vdots & \vdots & \ddots & \vdots\\ e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left( \overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right) \end{array} \right) . \end{equation} Note that the matrix on the right hand side is not generally upper-triangular, but it has a lot of zeroes (more or less the whole lower half of its southwestern triangle consists of zeroes), so its determinant is a bit easier to compute than a general $n\times n$ determinant. But I don't think there is a more explicit formula.
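For the original question's case $n=4$ (writing $e_{k}$ for $e_{k}\left( \overrightarrow{x}\right) $ and using $e_{k}=0$ for $k<0$ or $k>n$), the matrix becomes quite sparse, and expanding its determinant along the last row (double-checked with the SymPy sketch below) gives \begin{equation} \prod_{1\leq i<j\leq4}\left( x_{i}+x_{j}\right) =\det\left( \begin{array} [c]{cccc} e_{3} & e_{4} & 0 & 0\\ e_{1} & e_{2} & e_{3} & e_{4}\\ 0 & 1 & e_{1} & e_{2}\\ 0 & 0 & 0 & 1 \end{array} \right) =e_{1}e_{2}e_{3}-e_{1}^{2}e_{4}-e_{3}^{2}. \end{equation} For $n=3$, the same recipe yields $e_{1}e_{2}-e_{3}$, matching the computation in the question.

Here is a short SymPy sketch verifying the $n=4$ identity (again, `e` is an ad-hoc helper):

```python
# Double-check prod_{i<j}(x_i + x_j) = e1*e2*e3 - e1**2*e4 - e3**2 for n = 4.
from itertools import combinations
from sympy import symbols, Add, Mul, expand

xs = symbols('x1:5')

def e(k):  # k-th elementary symmetric polynomial in x1..x4 (ad-hoc helper)
    return Add(*[Mul(*c) for c in combinations(xs, k)])

lhs = Mul(*[xi + xj for xi, xj in combinations(xs, 2)])
rhs = e(1) * e(2) * e(3) - e(1)**2 * e(4) - e(3)**2
print(expand(lhs - rhs))   # expect: 0
```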