In the arXiv paper "Schoenberg matrices of radial positive definite functions
and Riesz sequences in $L^2(\mathbb{R}^n)$" by L. Golinskii, M. Malamud and L. Oridoroga, Definition 1.1 states
Let $n \in \mathbb{N}$. A real-valued continuous function $f$ on $\mathbb{R}_+ = [0,\infty)$ is called a radial positive definite function if, for an arbitrary finite set $\{x_1,\ldots,x_m\}$, $x_k \in \mathbb{R}^n$, and $(\xi_1,\ldots,\xi_m) \in \mathbb{C}^m$, $$\sum_{k,j = 1}^{m}f(|x_k-x_j|)\xi_k\overline{\xi}_j \ge 0.$$
Note that this definition is equivalent to saying that the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = f(\|x_k-x_j\|)$ is positive semidefinite for any choice of points $x_1,\ldots,x_m \in \mathbb{R}^n$.
Also, Theorem 1.2 states the following:
A function $f$ with $f(0) = 1$ belongs to $\Phi_n$ if and only if there exists a probability measure $\nu$ on $\mathbb{R}_+$ such that $$f(r) = \int_{0}^{\infty}\Omega_n(rt)\nu(dt), ~~~~ r \in \mathbb{R}_+$$ where $$\Omega_n(s) := \Gamma(q+1)\left(\dfrac{2}{s}\right)^qJ_q(s) = \sum_{j = 0}^{\infty}\dfrac{\Gamma(q+1)}{j!\Gamma(j+q+1)}\left(-\dfrac{s^2}{4}\right)^j, ~~~~ q = \dfrac{n}{2}-1,$$ and $J_q$ is the Bessel function of the first kind of order $q$.
(In their paper $\Phi_n$ denotes the set of radial positive definite functions on $\mathbb{R}^n$.)
Theorem 1 in "Metric Spaces and Completely Monotone Functions" by I. J. Schoenberg gives the proof of this result.
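As a quick numerical sanity check (my own sketch, not from the paper), the series defining $\Omega_n$ can be summed directly; for $n = 3$, i.e. $q = 1/2$, it agrees with $\sin(s)/s$:

```python
import math

def omega(n, s, terms=40):
    """Sum the series for Omega_n(s) from Theorem 1.2, with q = n/2 - 1."""
    q = n / 2 - 1
    total = 0.0
    for j in range(terms):
        total += (math.gamma(q + 1)
                  / (math.factorial(j) * math.gamma(j + q + 1))) * (-s * s / 4) ** j
    return total

# For n = 3 the series collapses to sin(s)/s:
for s in [0.5, 1.0, 2.0, 5.0]:
    print(s, omega(3, s), math.sin(s) / s)
```

The series converges very quickly (the $j$-th term decays like $(s^2/4)^j/(j!)^2$), so 40 terms are more than enough for moderate $s$.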
By applying this theorem for $n = 3$ and $\nu = \delta_1$, you get that $f(r) = \Omega_3(r) = \dfrac{\sin r}{r}$ is a radial positive definite function in $\mathbb{R}^3$. In other words, for any choice of $m$ points $x_1,\ldots,x_m \in \mathbb{R}^3$, the matrix $S \in \mathbb{R}^{m \times m}$ defined by $S(k,j) = \dfrac{\sin(\|x_k-x_j\|)}{\|x_k-x_j\|}$ is positive semidefinite.
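This is easy to verify numerically. The following Python sketch (mine, assuming NumPy; note that `np.sinc(x)` computes $\sin(\pi x)/(\pi x)$, so `np.sinc(d / np.pi)` is $\sin(d)/d$ with the zero-distance diagonal correctly equal to $1$) builds the Schoenberg matrix for random points in $\mathbb{R}^3$ and checks that its smallest eigenvalue is nonnegative up to round-off:

```python
import numpy as np

def sinc_matrix(points):
    """Schoenberg matrix S(k, j) = sin(||x_k - x_j||) / ||x_k - x_j||, with S(k, k) = 1."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return np.sinc(d / np.pi)  # np.sinc(x) = sin(pi*x)/(pi*x), so this is sin(d)/d

# Random points in R^3: by the theorem, S must be positive semidefinite.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(8, 3))
print(np.linalg.eigvalsh(sinc_matrix(X)).min())  # nonnegative up to round-off
```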
It should be clear that if you choose $m$ points $x_1,\ldots,x_m$ in $\mathbb{R}$ or $\mathbb{R}^2$, then $S$ is still guaranteed to be positive semidefinite, since $\mathbb{R}$ and $\mathbb{R}^2$ embed isometrically in $\mathbb{R}^3$ (equivalently, $\Phi_3 \subset \Phi_2 \subset \Phi_1$). However, this is not the case if the points are chosen in $\mathbb{R}^n$ with $n \ge 4$. joriki's answer gives a nice counterexample for $n \ge 5$. Here is an ugly counterexample (found by randomly generating points in MATLAB) with $6$ points in $\mathbb{R}^4$:
$x_1 = (-0.7,0.8,-0.3,-0.4)$
$x_2 = (0.8,-0.8,-0.9,0.7)$
$x_3 = (-0.1,0.0,0.1,-0.1)$
$x_4 = (-0.8,-0.7,-0.1,0.2)$
$x_5 = (0.2,-0.2,0.5,-0.9)$
$x_6 = (0.4,0.8,0.8,0.9)$
You can check that the $6 \times 6$ sinc distance matrix $S$ for these points has an eigenvalue $\approx -0.0024103$, and thus $S$ is not positive semidefinite.
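For reproducibility, here is a Python version of that MATLAB check (a sketch assuming NumPy, using the same `np.sinc` trick to handle the zero-distance diagonal):

```python
import numpy as np

# The six points in R^4 listed above.
X = np.array([
    [-0.7,  0.8, -0.3, -0.4],
    [ 0.8, -0.8, -0.9,  0.7],
    [-0.1,  0.0,  0.1, -0.1],
    [-0.8, -0.7, -0.1,  0.2],
    [ 0.2, -0.2,  0.5, -0.9],
    [ 0.4,  0.8,  0.8,  0.9],
])
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
S = np.sinc(d / np.pi)  # sin(d)/d entrywise, with the diagonal equal to 1
print(np.linalg.eigvalsh(S).min())
```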
Best Answer
Actually, this statement is certainly not true. (I only realized this after posting.) A simple proof follows from the basic fact that this statement is equivalent to asking whether the convex hull of the (positive semidefinite) diagonally dominant matrices equals the set of positive semidefinite matrices.
But the set of diagonally dominant matrices is already a convex set! This is because it can be written as an intersection of convex sets, since $$ \mathcal{D} = \left\{A \in \mathbf{R}^{n\times n} \mid A_{ii} \ge \sum_{j\ne i} |A_{ij}|, ~ i=1, \dots, n\right\}, $$ and each absolute-value constraint $A_{ii} \ge \sum_{j\ne i} |A_{ij}|$ is itself an intersection of finitely many affine half-spaces (one for each choice of signs of the off-diagonal entries). So, its convex hull can be no larger than the set itself, and, because we know that $\mathcal{D} \subsetneq \mathbf{S}_+^n$, the result follows.
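As a concrete witness for $\mathcal{D} \subsetneq \mathbf{S}_+^n$ (my own small example, sketched in Python with NumPy), here is a positive semidefinite matrix that is not diagonally dominant:

```python
import numpy as np

# Rank-one PSD matrix (1, 2)^T (1, 2): eigenvalues 0 and 5.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.eigvalsh(A))   # eigenvalues 0 and 5 (up to round-off): PSD
print(A[0, 0] >= abs(A[0, 1])) # False: the first row violates diagonal dominance
```

Since $A$ is positive semidefinite but lies outside $\mathcal{D}$, the convex hull of $\mathcal{D}$ (which is $\mathcal{D}$ itself) cannot be all of $\mathbf{S}_+^n$.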