Is a sinc-distance matrix positive semidefinite?

linear-algebra, matrices, positive-semidefinite, quadratic-programming

I've been trying to crack this problem for days but I can't find a way around it. Given a set of $N$ distinct points $X = \{x_1,\dots,x_N\}$, $x_i \in \mathbb{R}^3$, the associated sinc-distance matrix $S \in \mathbb{R}^{N\times N}$ is defined by $S(i,j) = \sin(|x_i-x_j|)/|x_i-x_j|$ for $i \neq j$, and $S(i,i) = 1$. My question is whether this matrix is positive semidefinite.

I've run numerical tests, and the matrix always appears to be PSD, but I haven't been able to prove it formally.
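For concreteness, here is a minimal sketch of the kind of numerical test meant above (the helper `sinc_distance_matrix` is my own naming, not from the question):

```python
import numpy as np

def sinc_distance_matrix(X):
    """S(i, j) = sin(|x_i - x_j|) / |x_i - x_j|, with S(i, i) = 1."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # np.sinc uses the normalized convention sin(pi x)/(pi x), hence the /pi.
    return np.sinc(D / np.pi)

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))      # 8 random points in R^3
S = sinc_distance_matrix(X)
eigs = np.linalg.eigvalsh(S)         # S is symmetric, so eigvalsh applies
print(eigs.min())                    # nonnegative up to round-off, in every trial
```

`np.sinc` maps `0` to `1`, so the diagonal comes out right without a special case.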

Things I've tried:

  • Proving it via principal minors: no luck, sincs are difficult to work with when trying to simplify the determinants.
  • Proving directly that $y^TSy\geq 0$ for all $y$, which amounts to showing $\sum_{i=1}^N \sum_{j=1}^N y_i y_j \sin(|x_i-x_j|)/|x_i-x_j| \geq 0$. No luck either. My intuition is that the triangle inequality for distances has to play a role somewhere, but I can't find it.
  • A constructive approach. For the $N=2$ case the matrix is PD: the diagonal entries are $1$ (sinc of $0$) and the off-diagonal entries are $<1$ in absolute value, so the matrix is strictly diagonally dominant (also, the determinant and trace are both positive), hence $S \succ 0$. Then one would prove that adding a point to the set cannot make the resulting matrix indefinite. I've tried using Schur complements for this step, but no luck yet.
  • Also, for any set $X$ we can scale all points by a scalar $\alpha > 0$. As $\alpha \to 0$, all distances go to $0$ and $S \to \vec{1} \vec{1}^T$, which is PSD of rank 1. As $\alpha \to \infty$, the distances blow up and $S \to I$. So varying $\alpha$ traces a 1D curve in the space $\mathcal{S}$ of symmetric matrices that starts on the boundary of the PSD cone and ends in its interior, so it seems natural that it never leaves the cone, but I can't prove it.
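The scaling experiment in the last bullet can be sketched numerically as follows (the helper `sinc_matrix` is my own, not from the question; the endpoint behavior in the comments is the claim being illustrated, observed numerically rather than proved):

```python
import numpy as np

def sinc_matrix(X):
    """S(i, j) = sin(|x_i - x_j|)/|x_i - x_j|, with ones on the diagonal."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return np.sinc(D / np.pi)   # np.sinc is sin(pi x)/(pi x), hence the /pi

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))          # 5 random points in R^3
for alpha in (1e-3, 0.1, 1.0, 10.0, 1e3):
    smallest = np.linalg.eigvalsh(sinc_matrix(alpha * X)).min()
    print(f"alpha = {alpha:7g}  smallest eigenvalue = {smallest:.6f}")
# Near alpha = 0 the smallest eigenvalue approaches 0 (S -> 11^T, rank one);
# for large alpha it approaches 1 (S -> I); along the way it stays >= 0.
```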

Any help would be greatly appreciated! Looking forward to the discussion!

Thanks!

Best Answer

In the arXiv paper "Schoenberg matrices of radial positive definite functions and Riesz sequences in $L^2(\mathbb{R}^n)$" by L. Golinskii, M. Malamud and L. Oridoroga, Definition 1.1 states

Let $n \in \mathbb{N}$. A real-valued and continuous function $f$ on $\mathbb{R}_+ = [0,\infty)$ is called a radial positive definite function, if for an arbitrary finite set $\{x_1,\ldots,x_m\}$, $x_k \in \mathbb{R}^n$, and $(\xi_1,\ldots,\xi_m) \in \mathbb{C}^m$ $$\sum_{k,j = 1}^{m}f(|x_k-x_j|)\xi_k\overline{\xi}_j \ge 0.$$

Note that this definition is equivalent to saying that the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = f(\|x_k-x_j\|)$ is positive semidefinite for any choice of points $x_1,\ldots,x_m \in \mathbb{R}^n$.

Also, Theorem 1.2 states the following:

A function $f \in \Phi_n$, $f(0) = 1$ if and only if there exists a probability measure $\nu$ on $\mathbb{R}_+$ such that $$f(r) = \int_{0}^{\infty}\Omega_n(rt)\nu(dt), ~~~~ r \in \mathbb{R}_+$$ where $$\Omega_n(s) := \Gamma(q+1)\left(\dfrac{2}{s}\right)^qJ_q(s) = \sum_{j = 0}^{\infty}\dfrac{\Gamma(q+1)}{j!\Gamma(j+q+1)}\left(-\dfrac{s^2}{4}\right)^j, ~~~~ q = \dfrac{n}{2}-1,$$ $J_q$ is the Bessel function of the first kind and order $q$.

(In their paper $\Phi_n$ denotes the set of radial positive definite functions on $\mathbb{R}^n$.)

Theorem 1 in "Metric Spaces and Completely Monotone Functions" by I. J. Schoenberg gives the proof of this result.

By applying this theorem for $n = 3$ and $\nu = \delta_1$, you get that $f(r) = \Omega_3(r) = \dfrac{\sin r}{r}$ is a radial positive definite function in $\mathbb{R}^3$. In other words, for any choice of $m$ points $x_1,\ldots,x_m \in \mathbb{R}^3$, the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = \dfrac{\sin(\|x_k-x_j\|)}{\|x_k-x_j\|}$ is positive semidefinite.
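Since the thread doesn't spell out why $\Omega_3$ collapses to $\sin r / r$, here is a quick numerical check of that identity using the series form of $\Omega_n$ quoted above (the helper `omega_n` is my own naming, not from the paper):

```python
import math

def omega_n(n, s, terms=60):
    """Series form of Omega_n(s) with q = n/2 - 1, truncated at `terms` terms."""
    q = n / 2 - 1
    return sum(
        math.gamma(q + 1) / (math.factorial(j) * math.gamma(j + q + 1))
        * (-s * s / 4) ** j
        for j in range(terms)
    )

for s in (0.5, 1.0, 3.0, 7.0):
    print(s, omega_n(3, s), math.sin(s) / s)   # the two columns agree
```

With $n = 3$ we get $q = 1/2$, and $J_{1/2}(s) = \sqrt{2/(\pi s)}\,\sin s$ together with $\Gamma(3/2) = \sqrt{\pi}/2$ reduces $\Omega_3(s)$ to $\sin(s)/s$ exactly, which is what the printout reflects.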


It should be clear that if you choose $m$ points $x_1,\ldots,x_m$ in $\mathbb{R}$ or $\mathbb{R}^2$, then $S$ is still guaranteed to be positive semidefinite. However, that is not the case if the points are chosen in $\mathbb{R}^n$ where $n \ge 4$. joriki's answer gives a nice counterexample for $n \ge 5$. Here is an ugly counterexample (found by randomly generating points in MATLAB) with $6$ points in $\mathbb{R}^4$:

$x_1 = (-0.7,0.8,-0.3,-0.4)$

$x_2 = (0.8,-0.8,-0.9,0.7)$

$x_3 = (-0.1,0.0,0.1,-0.1)$

$x_4 = (-0.8,-0.7,-0.1,0.2)$

$x_5 = (0.2,-0.2,0.5,-0.9)$

$x_6 = (0.4,0.8,0.8,0.9)$

You can check that the $6 \times 6$ sinc-distance matrix $S$ for these points has an eigenvalue of $\approx -0.0024103$, and thus $S$ is not positive semidefinite.
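The check is a few lines in NumPy, assuming the six points quoted above:

```python
import numpy as np

X = np.array([
    [-0.7,  0.8, -0.3, -0.4],
    [ 0.8, -0.8, -0.9,  0.7],
    [-0.1,  0.0,  0.1, -0.1],
    [-0.8, -0.7, -0.1,  0.2],
    [ 0.2, -0.2,  0.5, -0.9],
    [ 0.4,  0.8,  0.8,  0.9],
])
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
S = np.sinc(D / np.pi)               # sin(d)/d, with S(i, i) = 1
print(np.linalg.eigvalsh(S).min())   # negative, so S is not PSD
```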
