Does this system of infinitely many equations have an (almost) unique solution?

complex-analysis, polynomials, sequences-and-series, symmetric-polynomials, systems-of-equations

Let $a_1,\dots ,a_n \in \Bbb C$ and consider the following system of equations:
$$\begin{cases} x_1+ \cdots+ x_n=a_1 \\ {x_1}^2+\cdots+{x_n}^2=a_2 \\ \qquad \qquad \vdots \\ {x_1}^n+\cdots+{x_n}^n=a_n \end{cases}$$
It's "easy" to prove that this system has a unique solution up to permutations. The reason is that the Newton identities allow us to create an equivalent system
$$\begin{cases} e_1(x_1,\dots,x_n)=b_1 \\ e_2(x_1,\dots,x_n)=b_2 \\ \qquad \qquad \vdots \\ e_n(x_1,\dots,x_n)=b_n \end{cases}$$
where $e_1, \dots ,e_n$ are the elementary symmetric polynomials and $b_1, \dots ,b_n \in \Bbb C$ are numbers which can be calculated in terms of $a_1, \dots ,a_n$. If we consider the polynomial
$$P(X)=X^n-b_1 \cdot X^{n-1}+b_2 \cdot X^{n-2} - \cdots+(-1)^n \cdot b_n$$
then, by Vieta's formulas, the solutions of our system are precisely the
roots of $P$ (counted with multiplicity), which are unique up to permutation. My question is the following: let $(a_n)_{n\in \Bbb N}$ be complex numbers and consider the following infinite system of equations in $l^1(\Bbb C)$
$$\begin{cases} \sum_{n \in \Bbb N} x_n = a_1
\\ \sum_{n \in \Bbb N} {x_n}^2 = a_2 \\ \qquad \quad \vdots \\ \sum_{n \in \Bbb N} {x_n}^k = a_k \\ \qquad \quad \vdots \end{cases}$$
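Before moving on, the finite procedure above (power sums → Newton's identities → elementary symmetric polynomials → roots of $P$) can be sanity-checked numerically. A minimal sketch, assuming the hidden solution is the multiset $\{1,2,3\}$:

```python
import numpy as np

roots = np.array([1.0, 2.0, 3.0])      # the hidden solution (assumed example)
n = len(roots)

# the right-hand sides a_k = x_1^k + ... + x_n^k
a = [np.sum(roots**k) for k in range(1, n + 1)]

# Newton's identities: k * e_k = sum_{i=1}^{k} (-1)^{i-1} * e_{k-i} * a_i
e = [1.0]
for k in range(1, n + 1):
    e.append(sum((-1)**(i - 1) * e[k - i] * a[i - 1]
                 for i in range(1, k + 1)) / k)

# P(X) = X^n - e_1 X^{n-1} + e_2 X^{n-2} - ... + (-1)^n e_n
coeffs = [(-1)**k * e[k] for k in range(n + 1)]
recovered = np.sort(np.roots(coeffs).real)
print(recovered)   # the original multiset {1, 2, 3}, up to rounding
```

Starting from the power sums alone, the roots of $P$ reproduce the original multiset, as Vieta's formulas predict.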

Are there any necessary/sufficient conditions on $(a_n)_{n\in \Bbb N}$ for a solution to exist? If $(x_n)_{n\in \Bbb N}$ is a solution to our system and $\sigma : \Bbb N \rightarrow \Bbb N$ is a bijection, then $(x_{\sigma (n)})_{n\in \Bbb N}$ is also a solution, so any permutation of a solution is again a solution. Also, if we take a solution $(x_n)_{n\in \Bbb N}$ and "add" zeros to the sequence, we get another solution; for example, the sequence $(y_n)_{n\in \Bbb N}$ defined by
$$y_{2n}=0 \qquad ; \qquad y_{2n-1}=x_n \qquad \forall n \in \Bbb N$$
is another solution to our system of equations. My second question is the following: if our system of infinitely many equations has two solutions, $(x_n)_{n\in \Bbb N}$ and $(y_n)_{n\in \Bbb N}$, is it true that I can obtain $(y_n)_{n\in \Bbb N}$ from $(x_n)_{n\in \Bbb N}$ by adding zeros and permuting?
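The zero-padding invariance is easy to verify numerically on a finite truncation; a minimal sketch (the particular values are just an assumed example):

```python
import numpy as np

x = np.array([0.5, -0.25, 0.125, 2.0])   # a finite truncation (assumed example)
y = np.zeros(2 * len(x))
y[::2] = x                               # y_{2n-1} = x_n, y_{2n} = 0 (0-indexed)

# every power sum is unchanged, since the padded zeros contribute nothing
for k in range(1, 8):
    assert np.isclose(np.sum(x**k), np.sum(y**k))
print("all power sums agree")
```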

Inspired by the finite case, I tried the following approach. It's easy to prove (again using the Newton identities) that if $(x_n)_{n\in \Bbb N} \in l^1(\Bbb C)$ then the following limit exists for every $k \in \Bbb N_0$:
$$e_k(x):=\lim_{n \to \infty} e_k(x_1,\dots ,x_n) < \infty$$
and it can be calculated in terms of $\sum_{n \in \Bbb N} x_n , \dots , \sum_{n \in \Bbb N} {x_n}^k$, so we get the following (equivalent) system of equations
$$\begin{cases} e_1(x) = b_1
\\ e_2(x) = b_2 \\ \qquad \vdots \\ e_k(x) = b_k \\ \qquad \vdots \end{cases}$$

where $(b_n)_{n\in \Bbb N}$ are complex numbers which can be calculated in terms of $(a_n)_{n\in \Bbb N}$. To reproduce the next step of the finite case, we make some modifications to the approach. By Vieta's formulas, it's easy to check the polynomial identity
$$\prod_{i=1}^n (1-x_i \cdot X) = \sum_{k=0}^n (-1)^k \cdot e_k(x_1,\dots, x_n) \cdot X^k$$
with $x_1,\dots ,x_n \in \Bbb C$. Note that the roots of this polynomial are ${x_i}^{-1}$ for every $x_i \not = 0$. I would like to say (and have no idea how to prove) that if $(x_n)_{n \in \Bbb N} \in l^1(\Bbb C)$ then
$$\prod_{i=1}^\infty (1-x_i \cdot z) = \sum_{k=0}^\infty (-1)^k \cdot e_k(x) \cdot z^k \qquad \forall \; z \in \Bbb C$$
If this is true, going back to our infinite system of equations, we could consider the series
$$f(z)=1+ \sum_{k=1}^\infty (-1)^k \cdot b_k \cdot z^k$$
which is a function we can "calculate", since we know $(b_k)_{k \in \Bbb N}$. We would need some conditions on $(b_k)_{k \in \Bbb N}$ for this series to be well defined on an open set around zero (or on all of $\Bbb C$). Let $(r_n)_{n \in \Bbb N}$ be the roots of $f$ counted with multiplicity (does this even make sense?), and let $x_n = {r_n}^{-1}$ for all $n \in \Bbb N$ (note that $r_n \not = 0$ for all $n$, because $f(0)=1$); if $f$ has only finitely many roots, we fill the rest of the sequence with zeros. I claim that $(x_n)_{n \in \Bbb N}$ is a solution to our system. There are a LOT of gaps in my reasoning and I'm not sure how to continue. My complex analysis knowledge is really weak.
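A numerical experiment supports both steps: for a truncation of an $l^1$ sequence the $e_k$ stabilize as more terms are included, and the partial products $\prod_i (1-x_i z)$ match the series $\sum_k (-1)^k e_k z^k$. A sketch with the assumed example $x_n = 3^{-n}$:

```python
import numpy as np

def elem_sym(xs, kmax):
    # coefficients of prod (1 + x_i * t) = sum_k e_k * t^k, via the standard recurrence
    e = np.zeros(kmax + 1)
    e[0] = 1.0
    for x in xs:
        for k in range(kmax, 0, -1):
            e[k] += x * e[k - 1]
    return e

x = np.array([3.0**(-n) for n in range(1, 40)])   # truncation of x_n = 3^{-n}

# e_k(x_1,...,x_n) stabilizes as n grows, suggesting the limit e_k(x) exists
e20, e39 = elem_sym(x[:20], 5), elem_sym(x, 5)
print(np.max(np.abs(e39 - e20)))                  # tiny difference

# product vs. series at a test point z
z = 0.7 + 0.2j
product = np.prod(1 - x * z)
series = sum((-1)**k * e39[k] * z**k for k in range(6))
print(abs(product - series))                      # small truncation error
```

Of course this only probes finite truncations; the actual question is whether the identity persists in the limit, which the experiment cannot settle.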

Best Answer

  • That $$x_n \in \Bbb{C}^*, \qquad \sum_{n=1}^\infty \frac1{|x_n|} < \infty$$ means that $$f_x(z) = \sum_{n=1}^\infty \frac{1}{z-x_n}$$ converges absolutely and locally uniformly away from the $x_n$; thus it is meromorphic on $\Bbb{C}$ with a simple pole at each $x_n$ of residue $\# \{m : x_m=x_n\}$.

  • For $|z| < \inf_n |x_n|$, from the absolute convergence of $\sum_{n=1}^\infty \frac{1}{|x_n|-|z|}=\sum_{n=1}^\infty \sum_{k=1}^\infty \frac{|z|^{k-1}}{|x_n|^k}$ we obtain $$f_x(z) = -\sum_{k=1}^\infty z^{k-1} \sum_{n=1}^\infty \frac1{x_n^k}$$
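This expansion can be checked numerically on a truncation; a sketch with the assumed example $x_n = n^2$ (so that $\sum_n 1/|x_n| < \infty$ and $\inf_n |x_n| = 1$):

```python
import numpy as np

x = np.array([float(n * n) for n in range(1, 5000)])  # x_n = n^2 (assumed example)
z = 0.3                                               # |z| < inf_n |x_n| = 1

# left-hand side: f_x(z) = sum_n 1/(z - x_n)
lhs = np.sum(1.0 / (z - x))

# right-hand side: - sum_k z^{k-1} * sum_n 1/x_n^k (geometric expansion of each term)
rhs = -sum(z**(k - 1) * np.sum(1.0 / x**k) for k in range(1, 40))
print(lhs, rhs)   # agree up to the truncation of the series in k
```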

  • If $\sum_{n=1}^\infty \frac1{|y_n|} < \infty$ and $$\forall k, \quad \sum_{n=1}^\infty \frac1{x_n^k}= \sum_{n=1}^\infty \frac1{y_n^k}$$ then $f_x(z)=f_y(z)$ for $|z| < \inf_n \min (|x_n|,|y_n|)$; thus by the identity theorem $f_x=f_y$, and hence, comparing poles and residues, $(y_n)$ is a permutation of $(x_n)$.
