A quadratic inequality related to the $\log$ function

Tags: inequality, machine-learning, positive-semidefinite, quadratic-forms

Suppose that $c_1,c_2,\cdots,c_n \in \mathbb{R}$ are such that their sum is $0$. Show that for all positive real numbers $x_1,x_2,\cdots,x_n$ the following inequality holds:
$$ \sum\limits_{i,j=1}^{n} c_ic_j \log(x_i+x_j) \leq 0$$
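
For what it's worth, a quick numerical experiment (a sketch only, not a proof; the random data and seed below are arbitrary choices) is consistent with the claim:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
x = rng.uniform(0.1, 10.0, size=n)   # positive reals x_1,...,x_n
c = rng.standard_normal(n)
c -= c.mean()                         # enforce c_1 + ... + c_n = 0

# Quadratic form: sum_{i,j} c_i c_j log(x_i + x_j)
Q = c @ np.log(x[:, None] + x[None, :]) @ c
print(Q)                              # expect <= 0 up to rounding
```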

Where this problem comes from: the original problem is related to the concept of a kernel in machine learning. The problem asks me to show that the kernel $K(x,y) := \log(x+y)$, $x,y>0$, is an NDS (negative definite symmetric) kernel, which is equivalent to the inequality stated at the beginning.

A result in machine learning states that $K$ is an NDS kernel iff $\exp(-tK)$ is a PDS (positive definite symmetric) kernel for every $t>0$, so the problem reduces to showing that
$$[(x_i+x_j)^{-t}]_{1\leq i,j \leq n}$$
is a positive semi-definite matrix for every $t>0$. Furthermore, it suffices to prove this for $0<t<1$: the matrix $$[(x_i+x_j)^{-1}]_{1\leq i,j \leq n}$$ is positive semi-definite (a special case of the Cauchy matrix), and by the Schur product theorem, if $(a_{ij})$ and $(b_{ij})$ are positive semi-definite then so is the entrywise product $(a_{ij}b_{ij})$; writing $t = m + s$ with $m$ a non-negative integer and $0 \leq s < 1$ then covers all $t>0$.
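
As a quick numerical sanity check of this reduction (a sketch only; the sample points and exponents below are arbitrary choices), one can compute the smallest eigenvalue of the generalized Cauchy matrix for random positive $x_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
x = rng.uniform(0.1, 10.0, size=n)            # random positive x_1,...,x_n

for t in (0.3, 1.0, 2.5):                     # arbitrary sample exponents
    K = (x[:, None] + x[None, :]) ** (-t)     # [(x_i + x_j)^(-t)]
    print(t, np.linalg.eigvalsh(K).min())     # expect >= 0 up to rounding
```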

QUESTION

How can one prove the above inequality, either directly or by showing instead that the generalized Cauchy matrix $[(x_i+x_j)^{-t}]_{1\leq i,j \leq n}$ is positive semi-definite?

Best Answer

A quick way of seeing this is to note the Frullani-type integral $$\int_0^\infty \frac{e^{-t} - e^{-at}}{t}\,dt = \log a \quad \text{for all } a > 0.$$
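
This identity is easy to check numerically (a sketch, assuming SciPy is available; the test value $a$ is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import quad

a = 3.7  # arbitrary positive test value
val, _ = quad(lambda t: (np.exp(-t) - np.exp(-a * t)) / t, 0, np.inf)
print(val, np.log(a))  # the two values should agree closely
```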

So, applying the identity with $a = x_i + x_j$, the quadratic form above can be written as \begin{align*} \sum\limits_{i,j=1}^{n} c_ic_j \log(x_i + x_j) &= \sum\limits_{i,j=1}^{n} c_ic_j \int_0^\infty \frac{e^{-t} - e^{-(x_i+x_j)t}}{t}\,dt \\&= \int_0^\infty \frac{1}{t}\left(e^{-t}\left(\sum_{j=1}^n c_j\right)^2 - \left(\sum_{j=1}^n c_je^{-x_jt}\right)^2\right)\,dt \\&= -\int_0^{\infty} \frac{1}{t} \left(\sum_{j=1}^n c_je^{-x_jt}\right)^2 \,dt \le 0\end{align*} where we used the fact that $\displaystyle \sum_{j=1}^n c_j = 0$. (Note that this same condition makes $\sum_j c_je^{-x_jt} = O(t)$ as $t \to 0^+$, so the integrand in the last line is $O(t)$ near $0$ and the integral converges.)
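
As a final sanity check (a sketch only; the random data and seed are arbitrary), one can compare the quadratic form against the negative integral in the last line:

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(2)
n = 5
x = rng.uniform(0.5, 5.0, size=n)              # random positive x_j
c = rng.standard_normal(n)
c -= c.mean()                                   # enforce sum(c) = 0

lhs = c @ np.log(x[:, None] + x[None, :]) @ c   # sum_{i,j} c_i c_j log(x_i + x_j)
rhs = -quad(lambda t: (c @ np.exp(-x * t)) ** 2 / t, 0, np.inf)[0]
print(lhs, rhs)                                 # should agree; both <= 0
```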