Polynomials – The Sum of Squared Logarithms Conjecture

Tags: inequalities, matrix-theory, polynomials, symmetric-polynomials

I am searching for the first proof of (or counterexample to) the following conjecture.

(The sum of squared logarithms conjecture)
For all natural numbers $n$ and positive numbers $x_1,x_2,\ldots,x_n,\;y_1,y_2,\ldots,y_n>0$ such that for all $k\in\{1,\ldots,n-1\}$ the inequality

$\sum_{i_1<\ldots<i_k} x_{i_1}\, x_{i_2}\cdots x_{i_k}\le \sum_{i_1<\ldots<i_k} y_{i_1}\, y_{i_2}\cdots y_{i_k}$

holds, and moreover $x_1\, x_2\cdots x_n=y_1\, y_2\cdots y_n$,

it follows that

$\sum_{i=1}^n (\log x_i)^2\le \sum_{i=1}^n (\log y_i)^2.$

If the equality assumption $x_1\, x_2\cdots x_n=y_1\, y_2\cdots y_n$ is weakened to
$x_1\, x_2\cdots x_n\le y_1\, y_2\cdots y_n$, counterexamples are easy to find.
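The hypotheses and conclusion can be checked numerically for concrete instances. Here is a minimal sketch for $n=3$; the particular vectors `x` and `y` are my own arbitrary choice, not from the source, and of course a single instance proves nothing.

```python
import numpy as np
from itertools import combinations

def esym(v, k):
    """k-th elementary symmetric polynomial of the entries of v."""
    return sum(np.prod(c) for c in combinations(v, k))

# Arbitrary test instance with n = 3.
x = np.array([1.0, 1.0, 1.0])
y = np.array([4.0, 1.0, 0.25])  # prod(y) = 1 = prod(x)

# Hypotheses: e_k(x) <= e_k(y) for k = 1, ..., n-1, and equal products.
assert all(esym(x, k) <= esym(y, k) for k in range(1, 3))
assert np.isclose(np.prod(x), np.prod(y))

# Conclusion: sum of squared logarithms.
lhs = np.sum(np.log(x) ** 2)
rhs = np.sum(np.log(y) ** 2)
print(lhs <= rhs)  # True for this instance
```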

Proofs are known for $n\in \{1,2,3,4\}$. More information can be found at

https://www.uni-due.de/mathematik/ag_neff/log_conjecture

Immediately after Lev Borisov posted the sketch of a proof idea below, Lev Borisov, Suvrit Sra, Christian Thiel and I agreed to work out the details and to write together a complete and self-contained paper on the sum of squared logarithms conjecture and its relations to other topics, which can be found at: http://arxiv.org/abs/1508.04039

As announced in my first post, the prize winner is Lev Borisov.

Best Answer

Is there anything wrong with the following argument?

First of all, by scaling all $x_i$ and all $y_i$ by a positive constant, we may safely assume that $\prod_i x_i = \prod_i y_i =1$.

The result would now follow from the following more general conjecture.

$\bf Conjecture:$ For ${\bf a}=(a_1,\ldots,a_{n-1})\in (\mathbb R_{>0})^{n-1}$ consider $h_{\bf a}(z) = z^n+a_{n-1}z^{n-1}+\cdots+a_1z+1$. Define the function $f:(\mathbb R_{>0})^{n-1}\to {\mathbb R}$ by $$ f(a_1,\ldots,a_{n-1}) = \sum_{z\vert h_{\bf a}(z)=0} (\log(-z))^2 $$ where the branch of $\log$ is picked as usual on the complement of ${\mathbb R}_{\leq 0}$ in $\mathbb C$. Then all partial derivatives $\frac{\partial f}{\partial a_k}$ are positive.

First, a few comments. The function $f$ is well-defined because none of the roots of $h_{\bf a}(z)$ are positive reals. In addition, $f$ is real-valued, because the roots of $h_{\bf a}(z)$ come in complex conjugate pairs, for which the values of $\log(-z)$ are complex conjugates. The consequence of this conjecture is that if ${\bf a}\leq {\bf b}$ coordinate-wise, then $f({\bf a})\leq f({\bf b})$. Applied to the case when the roots of $h_{\bf a}(z)$ and $h_{\bf b}(z)$ are real numbers $-x_i$ and $-y_i$, this yields the original conjecture.
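The function $f$ and the monotonicity claim are easy to probe numerically: compute the roots of $h_{\bf a}$, sum $(\log(-z))^2$ over them with the principal branch, and compare $f({\bf a})$ with $f({\bf b})$ for a coordinate-wise larger ${\bf b}$. A sketch (the coefficient vectors are arbitrary picks of mine, and this is a spot check, not a proof):

```python
import numpy as np

def f(a):
    """f(a) = sum over roots z of h_a of (log(-z))^2, principal branch.

    h_a(z) = z^n + a_{n-1} z^{n-1} + ... + a_1 z + 1, with a = (a_1, ..., a_{n-1}).
    """
    coeffs = [1.0] + list(a[::-1]) + [1.0]  # np.roots wants highest degree first
    roots = np.roots(coeffs)
    # Imaginary parts cancel over conjugate pairs; .real drops numerical residue.
    return np.sum(np.log(-roots) ** 2).real

a = np.array([0.5, 2.0, 1.0])        # n = 4, arbitrary positive coefficients
b = a + np.array([0.1, 0.0, 0.3])    # b >= a coordinate-wise
print(f(a) <= f(b))  # True, consistent with the monotonicity claim
```

Note that since all coefficients of $h_{\bf a}$ are positive, $h_{\bf a}(t)>0$ for $t\geq 0$, so no root is a nonnegative real and `np.log(-roots)` never hits the branch cut.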

Now, let me describe what I think is the proof of this new conjecture.

First of all, as is standard, I can write the function $f({\bf a})$ as a contour integral. In a neighborhood of a fixed ${\bf a}$, for any large enough $R$ and small enough $\epsilon>0$, there holds $$ f({\bf a}) = \frac 1{2\pi i}\int_{C} (\log(-z))^2 \frac {h_{\bf a}'(z)}{h_{\bf a}(z)}\,dz $$ where the contour $C$ lies in $\mathbb C$ cut along $\mathbb R_{>0}$, the two edges of the cut being the "north" and "south" shores (so that $\log(-t)=\log t -\pi i$ on the north shore and $\log(-t)=\log t +\pi i$ on the south shore). The contour $C$ is the union of the following four pieces $C_\epsilon$, $C_R$, $C_+$, $C_-$.

$C_\epsilon$ is a circle of radius $\epsilon$ around $z=0$ traveled from $\epsilon$-south to $\epsilon$-north clockwise.

$C_R$ is a circle of radius $R$ around $z=0$ traveled from $R$-north to $R$-south counterclockwise.

$C_+$ is the line segment $[\epsilon,R]$-north.

$C_-$ is the line segment $[R,\epsilon]$-south.

Differentiating under the integral sign (legitimate here, since the roots of $h_{\bf a}$ stay away from $C$), we get: $$ \frac{\partial f({\bf a})}{\partial a_k}= \frac 1{2\pi i}\int_{C} (\log(-z))^2 \frac \partial{\partial a_k}\frac {h_{\bf a}'(z)}{h_{\bf a}(z)}\,dz $$ $$ = \frac 1{2\pi i}\int_{C} (\log(-z))^2 \Big(\frac {z^k} {h_{\bf a}(z)} \Big)'\,dz = -\frac 1{2\pi i}\int_{C} \Big((\log(-z))^2 \Big)'\frac {z^k} {h_{\bf a}(z)} \,dz $$ $$ =-\frac 1{\pi i}\int_{C} \log(-z)\frac {z^{k-1}} {h_{\bf a}(z)}\,dz = -\frac 1\pi {\rm Im}\Big(\int_{C} \log(-z)\frac {z^{k-1}} {h_{\bf a}(z)}\,dz\Big) $$ We can take a limit as $R\to +\infty$ and $\epsilon\to 0$. Since $k\leq {n-1}$, the integral over $C_R$ goes to zero (the length is $2\pi R$ and the size of the integrand is $O(R^{-2}\log R)$). The integral over $C_\epsilon$ also goes to zero: because $k\geq 1$, the integrand is $O(\log \epsilon)$ while the length is $2\pi\epsilon$.

So we get $$ \frac{\partial f({\bf a})}{\partial a_k} = -\lim_{\epsilon\to 0^+}\frac 1\pi {\rm Im}\Big( \int_{[\epsilon,+\infty]-{\rm north}\cup [+\infty,\epsilon]-{\rm south}} \log(-z)\frac {z^{k-1}} {h_{\bf a}(z)}\,dz\Big) $$ $$ =-\lim_{\epsilon\to 0^+}\frac 1\pi \int_{\epsilon}^{+\infty}{\rm Im}\Big( (\log(t)-\pi i)\frac {t^{k-1}} {h_{\bf a}(t)} -(\log(t)+\pi i)\frac {t^{k-1}} {h_{\bf a}(t)}\Big)\,dt $$ $$ =2\lim_{\epsilon \to 0^+} \int_{\epsilon}^{+\infty} \frac {t^{k-1}}{h_{\bf a}(t)}\,dt >0.$$ This finishes the proof of the new conjecture, and consequently of the old conjecture.

Remark: I am guessing that there is a simpler argument for $$ \frac{\partial f({\bf a})}{\partial a_k} = 2\int_{0}^{+\infty} \frac {t^{k-1}}{h_{\bf a}(t)}\,dt $$ but I am just writing the first thing that came to my mind.
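The closed form $\frac{\partial f({\bf a})}{\partial a_k} = 2\int_0^{+\infty} \frac{t^{k-1}}{h_{\bf a}(t)}\,dt$ can also be spot-checked numerically, comparing a central finite difference of $f$ against a midpoint-rule quadrature after the substitution $t=u/(1-u)$. The test point ${\bf a}$, the index $k$, and all tolerances below are my own arbitrary choices:

```python
import numpy as np

def f(a):
    """f(a) = sum over roots z of h_a of (log(-z))^2, principal branch."""
    coeffs = [1.0] + list(a[::-1]) + [1.0]
    return np.sum(np.log(-np.roots(coeffs)) ** 2).real

def h(a, t):
    """h_a(t) = t^n + a_{n-1} t^{n-1} + ... + a_1 t + 1, evaluated on an array t."""
    n = len(a) + 1
    return t ** n + sum(a[j - 1] * t ** j for j in range(1, n)) + 1.0

def rhs_integral(a, k, m=200_000):
    """2 * integral over (0, inf) of t^(k-1) / h_a(t) dt, via t = u/(1-u)."""
    u = (np.arange(m) + 0.5) / m           # midpoints of a uniform grid on (0, 1)
    t = u / (1.0 - u)                       # dt = du / (1-u)^2
    return 2.0 * np.sum(t ** (k - 1) / h(a, t) / (1.0 - u) ** 2) / m

a = np.array([0.5, 2.0, 1.0])  # arbitrary test point, n = 4
k, eps = 2, 1e-5
da = np.zeros_like(a)
da[k - 1] = eps
lhs = (f(a + da) - f(a - da)) / (2 * eps)  # finite-difference derivative
print(abs(lhs - rhs_integral(a, k)) < 1e-3)
```

The agreement of the two sides at such test points is of course only consistent with the identity, not a substitute for the contour argument above.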
