[Math] A basis of the symmetric power consisting of powers

linear-algebra, multilinear-algebra, polynomials

I have asked this question on math.se but did not get an answer. I was quite surprised, because I thought that lots of people must have thought about this before:

Let $V$ be a complex vector space with basis $x_1,\ldots,x_n\in V$. Denote by $v_1\odot\cdots\odot v_k$ the image of $v_1\otimes\cdots\otimes v_k$ in the symmetric power $\newcommand{\Sym}{\mathrm{Sym}}\Sym^k(V)$. It is well known that the elements $v^{\odot k}$ for $v\in V$ generate this space (see, for instance, this answer on math.se), so they must contain a basis.

In other words, setting $N=\binom{n+k-1}{k}$, there must be $v_1,\ldots,v_N\in V$ with
$$\mathrm{Sym}^k V = \mathbb C v_1^{\odot k} \oplus \cdots \oplus \mathbb C v_N^{\odot k}.$$
I am looking for an explicit description of such a basis. Is such a description known? Is there maybe even a "nice" or somewhat "natural" choice for the $v_i$ as linear combinations of the $x_i$?
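For small $n$ and $k$ one can at least test a candidate family by machine. The sketch below (my own check, not part of the original post) takes $v_t = t_1x_1+\cdots+t_nx_n$ for the nonnegative integer points $t$ with $t_1+\cdots+t_n=k$, writes each $v_t^{\odot k}$ in the monomial basis of $\mathrm{Sym}^k V$, and verifies that the resulting $N\times N$ coordinate matrix has full rank:

```python
# Numerical sketch (my own, not from the original post): check for small n, k
# that the powers v_t^{(odot k)}, with v_t = t_1 x_1 + ... + t_n x_n and t
# ranging over nonnegative integer points with t_1 + ... + t_n = k, form a
# basis of Sym^k(V).  In the monomial basis, the coordinate of v_t^{(odot k)}
# at x^alpha is multinomial(k; alpha) * t^alpha, so it suffices that the
# N x N matrix of these numbers has full rank, N = C(n+k-1, k).
from itertools import combinations_with_replacement
from math import comb, factorial, prod
import numpy as np

def compositions(n, k):
    """All tuples of n nonnegative integers summing to k."""
    result = []
    for bins in combinations_with_replacement(range(n), k):
        alpha = [0] * n
        for b in bins:
            alpha[b] += 1
        result.append(tuple(alpha))
    return result

def power_matrix(n, k):
    alphas = compositions(n, k)  # exponents alpha with |alpha| = k
    return np.array([[factorial(k) // prod(factorial(a) for a in alpha)
                      * prod(t ** a for t, a in zip(point, alpha))
                      for alpha in alphas]
                     for point in alphas], dtype=float)

n, k = 3, 3
N = comb(n + k - 1, k)
print(N, np.linalg.matrix_rank(power_matrix(n, k)))  # full rank <=> basis
```

The same index set (compositions of $k$ into $n$ parts) serves both as the set of sample points and as the set of monomial exponents, which is exactly the candidate basis discussed in the answer below.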

Best Answer

I would look up a book on the calculus of finite differences in a multivariate setting. The claim to show here is that for any multi-index $\alpha=(\alpha_1,\ldots,\alpha_n)$ of length $k$ one can express the multiple derivative at zero $$ \left(\frac{\partial}{\partial t}\right)^{\alpha} \ (t_1x_1+\cdots+ t_n x_n)^k $$ as a linear combination of analogous finite-difference expressions which only involve evaluations of $(t_1x_1+\cdots+ t_n x_n)^k$ at integer points $(t_1,\ldots,t_n)$ with nonnegative coordinates adding up to $k$. This is the same as the above candidate basis considered by you and Peter. I don't know whether there is a multivariate analogue of the Newton series; if so, this would immediately imply the wanted statement.
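As a toy univariate instance of this principle (my own illustration): for a polynomial $f$ of degree $k$, the $k$-th forward difference at $0$ uses only the values $f(0),\ldots,f(k)$ at integer points, yet recovers the derivative $f^{(k)}(0)$ exactly:

```python
# Univariate toy case of "derivative at zero from values at integer points":
# for a polynomial f of degree k,
#   Delta^k f(0) = sum_j (-1)^(k-j) C(k,j) f(j) = f^(k)(0),
# since Delta^k kills all terms of degree < k and sends t^k to k!.
from math import comb

def forward_difference(f, k):
    """k-th forward difference of f at 0, using only f(0), ..., f(k)."""
    return sum((-1) ** (k - j) * comb(k, j) * f(j) for j in range(k + 1))

k = 4
f = lambda t: (2 * t + 1) ** k        # degree-k polynomial, f^(k)(0) = k! * 2^k
print(forward_difference(f, k))       # prints 384 = 4! * 2^4
```

The multivariate question is whether an analogous exact formula exists for the mixed derivatives $\partial^\alpha$, using only evaluations on the integer points of the simplex.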


Edit: Apparently there is such a formula, due to Lascoux and Schützenberger; see Theorem 9.6.1, page 148, in the book "Symmetric functions and combinatorial operators on polynomials" by Alain Lascoux. Another source on the web is here. It also has the required property, namely that the number of finite differences taken equals the degree of the multiplying Schubert polynomial.


Edit: @Jesko, you're right, it is a bit more complicated than what I said. Also, the Lascoux–Schützenberger formula might not be the simplest to use here.

First note that the expressions $$ \prod_{i=1}^{n-1} (x_i-x_n)^{\beta_i}\ \times\ (kx_n)^{k-|\beta|}\ , $$ where $\beta$ ranges over multi-indices with $n-1$ components and length $|\beta|\le k$, form a basis: up to scalars, they are the degree-$k$ monomials in the new basis $x_1-x_n,\ldots,x_{n-1}-x_n,x_n$ of $V$. Now you get these, up to the nonzero constant $k(k-1)\cdots(k-|\beta|+1)$, as the derivatives $$ \left(\frac{\partial}{\partial t}\right)^{\beta} \ \left(t_1x_1+\cdots+ t_{n-1} x_{n-1}+\left(k-\sum_{i=1}^{n-1}t_i\right)x_n\right)^k $$ at $t=0$.
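A quick symbolic sanity check of this derivative identity for $n=3$, $k=2$ (my own sketch, using sympy):

```python
# Symbolic sanity check (my own, with sympy) for n = 3, k = 2:
# the beta-derivative of f at t = 0 equals
#   k(k-1)...(k-|beta|+1) * (x1-x3)^b1 (x2-x3)^b2 * (k*x3)^(k-|beta|),
# since each d/dt_i brings down one factor (x_i - x_n), and at t = 0 the
# remaining linear form evaluates to k*x_n.
import sympy as sp

x1, x2, x3, t1, t2 = sp.symbols('x1 x2 x3 t1 t2')
k = 2
f = (t1 * x1 + t2 * x2 + (k - t1 - t2) * x3) ** k

def beta_derivative(b1, b2):
    g = sp.diff(f, t1, b1) if b1 else f
    g = sp.diff(g, t2, b2) if b2 else g
    return sp.expand(g.subs({t1: 0, t2: 0}))

# beta = (1, 0): constant factor is k = 2, remaining power is (k*x3)^1
lhs = beta_derivative(1, 0)
rhs = sp.expand(2 * (x1 - x3) * (k * x3))
print(sp.expand(lhs - rhs) == 0)  # prints True
```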

Call $f(t_1,\ldots,t_{n-1})$ the polynomial function to be hit with derivatives. One has a multivariate Newton expansion for it, $$ f(t)=\sum_{m} (t-a)^m\, \partial^m f(a_{11},a_{21},\ldots,a_{n-1,1})\ , $$ as follows. Here $a$ stands for a matrix of indeterminates $(a_{ij})$ with $1\le i\le n-1$ and $1\le j\le d$, with $d$ high enough. Let $\partial_{ij}$ denote the divided difference operator acting on functions of these indeterminates as $$ \partial_{ij} g=\frac{1}{a_{i,j+1}-a_{ij}}\left( g({\rm argument\ with\ }a_{i,j+1}\ {\rm and}\ a_{ij}\ {\rm exchanged})- g \right)\ . $$

The notation $m=(m_1,\ldots,m_{n-1})$ is for a multi-index with nonnegative entries. We also write the corresponding operator $$ \partial^m = \prod_{i=1}^{n-1} \left(\partial_{i, m_i} \cdots\partial_{i,2}\partial_{i,1}\right)\ , $$ noting that finite difference operators concerning different groups of variables commute. Finally, $$ (t-a)^m=\prod_{i=1}^{n-1} \left((t_i-a_{i,m_i})\cdots(t_i-a_{i,2})(t_i-a_{i,1})\right)\ . $$

The formula basically amounts to applying Newton's univariate formula in each coordinate direction separately. Now use this with the choice $a_{ij}=j-1$, then take the $\beta$ derivative in the $t$'s, and that should be it.
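In the univariate case, the choice $a_j=j-1$ turns the divided differences into ordinary forward differences, and the expansion reads $f(t)=\sum_m \frac{\Delta^m f(0)}{m!}\, t(t-1)\cdots(t-m+1)$. A small sketch of my own, checking that this reconstructs a polynomial exactly from its values at the integers $0,\ldots,d$:

```python
# Univariate Newton series with nodes a_j = j - 1, i.e. 0, 1, 2, ...:
# for a polynomial f of degree d,
#   f(t) = sum_{m=0}^{d} (Delta^m f(0) / m!) * t(t-1)...(t-m+1),
# so f is recovered from its values at 0, ..., d alone.
from math import comb, factorial
from fractions import Fraction

def newton_series(values):
    """Given f(0), ..., f(d), return a function evaluating the Newton series."""
    d = len(values) - 1
    coeffs = [Fraction(sum((-1) ** (m - j) * comb(m, j) * values[j]
                           for j in range(m + 1)), factorial(m))
              for m in range(d + 1)]
    def f(t):
        total, falling = Fraction(0), Fraction(1)
        for m, c in enumerate(coeffs):
            total += c * falling       # c = Delta^m f(0) / m!
            falling *= (t - m)         # next falling factorial t(t-1)...(t-m)
        return total
    return f

g = lambda t: t ** 3 - 2 * t + 5
f = newton_series([g(j) for j in range(4)])
print(all(f(t) == g(t) for t in range(-3, 8)))  # prints True
```

The multivariate claim above is this identity applied in each coordinate direction $t_1,\ldots,t_{n-1}$ separately.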
