[Math] How to prove positivity of determinant for these matrices

determinants, linear algebra

Let $g(x) = e^x + e^{-x}$. For $x_1 < x_2 < \dots < x_n$ and $b_1 < b_2 < \dots < b_n$, I'd like to show that the determinant of the following matrix is positive, regardless of $n$:

$\det \begin{bmatrix}
\frac{1}{g(x_1-b_1)} & \frac{1}{g(x_1-b_2)} & \cdots & \frac{1}{g(x_1-b_n)}\\
\frac{1}{g(x_2-b_1)} & \frac{1}{g(x_2-b_2)} & \cdots & \frac{1}{g(x_2-b_n)}\\
\vdots & \vdots & \ddots & \vdots \\
\frac{1}{g(x_n-b_1)} & \frac{1}{g(x_n-b_2)} & \cdots & \frac{1}{g(x_n-b_n)}
\end{bmatrix} > 0$.

The case $n = 2$ follows from the identity $g(x)g(y) = g(x+y)+g(x-y)$:

$$g(x_2-b_1)g(x_1-b_2) = g(x_1+x_2-b_1-b_2)+g(x_2-x_1+b_2-b_1) > g(x_1+x_2-b_1-b_2)+g(x_2-x_1-b_2+b_1) = g(x_1-b_1)g(x_2-b_2),$$

where the inequality holds because $g$ is even and strictly increasing on $(0,\infty)$, and $x_2-x_1+b_2-b_1 > |x_2-x_1-(b_2-b_1)|$. Taking reciprocals gives $\frac{1}{g(x_1-b_1)g(x_2-b_2)} > \frac{1}{g(x_1-b_2)g(x_2-b_1)}$, i.e. the $2\times 2$ determinant is positive.
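As a sanity check, the product-to-sum identity and the resulting $2\times 2$ determinant can be verified with SymPy; the sample values $x_1 = -1,\ x_2 = \tfrac12,\ b_1 = 0,\ b_2 = 2$ are my own choice for illustration:

```python
import sympy as sp

x1, x2, b1, b2, x, y = sp.symbols('x1 x2 b1 b2 x y', real=True)
g = lambda u: sp.exp(u) + sp.exp(-u)   # g(x) = e^x + e^{-x}

# Product-to-sum identity behind the n = 2 argument.
assert sp.simplify(g(x)*g(y) - (g(x + y) + g(x - y))) == 0

# Spot-check the 2x2 determinant at sample points x1 < x2, b1 < b2 (assumed values).
vals = {x1: -1, x2: sp.Rational(1, 2), b1: 0, b2: 2}
det = (1/(g(x1 - b1)*g(x2 - b2)) - 1/(g(x1 - b2)*g(x2 - b1))).subs(vals)
assert float(det) > 0
```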

However, things get difficult for $n \geq 3$. Any ideas or tips?

Thanks!

Best Answer

First, we prove that the determinant is non-zero, in other words, that the matrix is non-singular. Assume the contrary. Then, by linear dependence of the columns, there exist real numbers $\lambda_1,\dots,\lambda_n$, not all equal to 0, such that $F(x_i):=\sum_j \frac{\lambda_j}{g(x_i-b_j)}=0$ for all $i=1,2,\dots,n$. But after clearing denominators (multiply through by $e^{(n-1)x}\prod_j g(x-b_j)$), the equation $F(x)=0$ becomes a polynomial equation in $e^{2x}$ of degree less than $n$, so it cannot have $n$ distinct roots $x_1 < \dots < x_n$, a contradiction.
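The clearing-of-denominators step can be checked symbolically. A minimal sketch with SymPy for $n = 3$, where the sample shifts $b_j = 0, 1, 2$ are assumed values chosen for illustration and the $\lambda_j$ are kept generic:

```python
import sympy as sp

n = 3
s = sp.symbols('s', positive=True)        # s plays the role of e^x, so t = s^2 = e^{2x}
b = [sp.Integer(k) for k in range(n)]     # sample shifts b_1 < b_2 < b_3 (assumed values)
lam = sp.symbols('l1:%d' % (n + 1))       # generic coefficients lambda_j

# g(x - b_j) = e^{x-b_j} + e^{-(x-b_j)} = s*e^{-b_j} + e^{b_j}/s
gvals = [s*sp.exp(-bj) + sp.exp(bj)/s for bj in b]

# F(x) * prod_j g(x - b_j) = sum_j lambda_j * prod_{k != j} g(x - b_k);
# the extra factor s^{n-1} clears the remaining negative powers of s.
cleared = sum(l*sp.prod([gvals[k] for k in range(n) if k != j])
              for j, l in enumerate(lam))
P = sp.expand(cleared * s**(n - 1))

# Only even powers of s appear, with exponent at most 2*(n-1):
exponents = {m[0] for m in sp.Poly(P, s).monoms()}
print(sorted(exponents))  # → [0, 2, 4], i.e. degree n-1 < n in t = e^{2x}
```

So $F(x)=0$ is equivalent to a degree-$(n-1)$ polynomial equation in $t=e^{2x}$, which indeed has fewer than $n$ roots.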

Now we note that when $x_i=b_i$ and the $b_i$ are far apart from one another, the matrix is close to the identity, so the determinant is close to $1$ there. The parameter space $\{(x_1,\dots,x_n,b_1,\dots,b_n):x_1<\dots<x_n,\ b_1<b_2<\dots <b_n\}$ is connected, and the determinant is a continuous, nowhere-vanishing function on it, so its sign is constant: always plus.
