Sign of the Sum of the Elements in a Row of a Matrix Inverse

matrices

I have an $m \times m$ matrix $\bf A$ whose elements are given by

$$a_{i,j}=\frac{(-1)^{j+1}}{j!}z_i^j$$

where $0<z_1<1$ and $z_i > z_{i-1}$ for $i=2,\dots,m$.
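For reference, here is a small numerical sketch of this matrix (Python with NumPy; the particular $z_i$ values are sample choices of mine, not from any specific application):

```python
import math

import numpy as np

def build_A(z):
    """m x m matrix with entries (-1)^(j+1) / j! * z_i^j for i, j = 1..m."""
    m = len(z)
    return np.array([[(-1) ** (j + 1) / math.factorial(j) * zi ** j
                      for j in range(1, m + 1)]
                     for zi in z])

z = [0.3, 0.7, 1.2, 1.5]   # 0 < z_1 < 1 and strictly increasing, as required
A = build_A(z)
print(A)                   # columns alternate in sign: +, -, +, -
```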

For reasons having to do with the stability of a computational scheme for solving a transient differential equation, I would like to be able to prove that the sum of the elements in the *first* row of the *inverse* of the matrix $\bf A$ is positive.

A related question was posed a while ago in *Inverse Matrix: Sum of the elements in each row*, where it was proved that if the sums of all rows of $\bf A$ are equal to some value $k$, then the sum of each row of the inverse equals $1/k$. The underlying condition (equality of the row sums) does not hold in my case. If we denote the inverse by $\bf B$, however, matrix multiplication yields

$$
\sum\limits_{s=1}^m b_{1s} a_{sr} =\delta_{1,r}
$$

and the summation over all columns results in

$$
\sum\limits_{r=1}^m \sum\limits_{s=1}^m b_{1s} a_{sr}
=\sum\limits_{s=1}^m \sum\limits_{r=1}^m a_{sr} b_{1s}
= \sum\limits_{s=1}^m k_s b_{1s} =1
$$

where $k_s$ is the sum of the elements of row $s$ of $\bf A$.

I would now need to prove that

$$
\sum\limits_{s=1}^m b_{1s} >0
$$

subject to the constraint

$$
\sum\limits_{s=1}^m k_s b_{1s} =1
$$

and the additional conditions $k_1>0$ and $k_s>k_{s-1}$, $s=2,\dots,m$, which imply that

$$
\sum\limits_{s=1}^m k_s >0
$$

all of which follow from the definition of the elements of $\bf A$.
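These properties, and the constraint derived above, are easy to confirm numerically; a sketch in Python with NumPy, using sample $z_i$ values of my own choosing:

```python
import math

import numpy as np

z = [0.3, 0.7, 1.2, 1.5]   # 0 < z_1 < 1, strictly increasing
m = len(z)
A = np.array([[(-1) ** (j + 1) / math.factorial(j) * zi ** j
               for j in range(1, m + 1)] for zi in z])

B = np.linalg.inv(A)       # B = A^{-1}
k = A.sum(axis=1)          # row sums k_s

print(k)                   # positive and increasing for these z
print(k @ B[0])            # the constraint: sum_s k_s b_{1s} = 1
print(B[0].sum())          # the quantity we would like to show is positive
```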

Any help on how to proceed would be deeply appreciated. Thanks in advance.

The problem I'm facing in my attempt at a proof is that I cannot simply replace the vector $\bf b_1$ by some arbitrary vector $\bf p$ that merely happens to have an inner product of unity with the vector $\bf k$: I can find several elementary counterexamples (see the snapshot below for one) that satisfy all the constraints yet disprove my contention.

I don't see how I can proceed without somehow relating $\bf b_1$ to $\bf k$, and this is where I'm stuck.
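To make the obstruction concrete, here is a minimal sketch (Python with NumPy; the vectors $\bf k$ and $\bf p$ are toy choices of my own, not derived from any actual matrix $\bf A$) showing that the constraints on $\bf k$ and the inner-product condition alone do not force a positive sum:

```python
import numpy as np

# A positive, strictly increasing k with positive sum, as required
k = np.array([1.0, 2.0])

# An arbitrary p with k . p = 1 -- every stated constraint holds ...
p = np.array([-3.0, 2.0])

print(k @ p)    # 1.0: the constraint sum_s k_s p_s = 1 is satisfied
print(p.sum())  # -1.0: ... yet the element sum is negative
```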

*(snapshot of a numerical counterexample)*

Best Answer

We want to compute the sum of the first row of $A^{-1}$; in other words, we want the first coordinate $x_1$ of the vector satisfying $$A\mathbf{x} = \mathbf{1}$$ which by Cramer's rule is $$x_1 = \det A_1 / \det A,$$ where $A_1$ is the matrix $A$ with the first column replaced by all ones.
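As a sanity check on this reduction (Python with NumPy; the sample $z_i$ values are my own choices), the first-row sum of $A^{-1}$, the first coordinate of the solution of $A\mathbf{x}=\mathbf{1}$, and the Cramer ratio $\det A_1/\det A$ should all agree:

```python
import math

import numpy as np

z = [0.3, 0.7, 1.2, 1.5]
m = len(z)
A = np.array([[(-1) ** (j + 1) / math.factorial(j) * zi ** j
               for j in range(1, m + 1)] for zi in z])

row_sum = np.linalg.inv(A)[0].sum()      # sum of the first row of A^{-1}
x1 = np.linalg.solve(A, np.ones(m))[0]   # first coordinate of A x = 1

A1 = A.copy()
A1[:, 0] = 1.0                           # A with its first column replaced by ones
cramer = np.linalg.det(A1) / np.linalg.det(A)

print(row_sum, x1, cramer)               # all three coincide
```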

Now we can invoke multilinearity of the determinant to factor out constants that multiply rows or columns of both $A_1$ and $A$: in particular, in this way we can eliminate (1) all of the negative signs and (2) all of the factorials, leaving $$x_1 = \det \tilde A_1 / \det \tilde A$$ where $$\tilde A = \begin{bmatrix}z_1 & z_1^2 & z_1^3 & \cdots\\z_2 & z_2^2 & z_2^3 &\\\vdots & & & \ddots\end{bmatrix}$$ and $$\tilde A_1 = \begin{bmatrix}1 & z_1^2 & z_1^3 & \cdots\\1 & z_2^2 & z_2^3 &\\\vdots & & & \ddots\end{bmatrix}.$$

We can factor out $z_i$ from every row of $\tilde A$ to yield the Vandermonde matrix, and so: $$\det \tilde A = \prod_i z_i \prod_{j>i} (z_j-z_i).$$ Finally, we can prove that $$\det \tilde A_1 = \left(\sum_i \frac{1}{z_i}\right)\prod_i z_i \prod_{j>i} (z_j-z_i)$$ by induction on the size of the matrix. Clearly the formula holds for a $1\times 1$ matrix. For the inductive step, we can perform elementary column operations on $\tilde A_1$ without affecting its determinant; in particular, we can subtract $z_1^2$ times the first column from the second, and $z_1$ times each remaining column from the column to its right. This yields $$\det \tilde A_1 = \det \begin{bmatrix}1 & 0 & 0 & 0 & \cdots\\1 & z_2^2 - z_1^2 & z_2^2(z_2-z_1) & z_2^3(z_2-z_1) & \\1 & z_3^2 - z_1^2 & z_3^2(z_3-z_1) & z_3^3(z_3-z_1) & \\\vdots & & & & \ddots\end{bmatrix}.$$ Expanding by minors along the first row, and then pulling out a factor of $z_i-z_1$ from each row, yields \begin{align*} \det \tilde A_1 &= \left(\prod_{j>1} (z_j-z_1)\right) \det \begin{bmatrix}z_1 + z_2 & z_2^2 & z_2^3 & \cdots \\z_1 + z_3 & z_3^2 & z_3^3 & \\\vdots & & & \ddots\end{bmatrix}\\ &= \left(\prod_{j>1} (z_j-z_1)\right) \left[z_1 \det \begin{bmatrix}1 & z_2^2 & z_2^3 & \cdots \\1 & z_3^2 & z_3^3 & \\\vdots & & & \ddots\end{bmatrix} + \det \begin{bmatrix}z_2 & z_2^2 & z_2^3 & \cdots \\z_3 & z_3^2 & z_3^3 & \\\vdots & & & \ddots\end{bmatrix}\right]\\ &= \left(\prod_{j>1} (z_j-z_1)\right)\left[ \prod_i z_i \prod_{j>i>1} (z_j-z_i)\sum_{i\neq 1} \frac{1}{z_i} + \prod_{i\neq 1} z_i \prod_{j>i>1} (z_j-z_i)\right]\\ &= \left(\sum_i \frac{1}{z_i}\right)\prod_i z_i \prod_{j>i} (z_j-z_i), \end{align*} where the third equality applies the induction hypothesis to the first determinant and the Vandermonde formula to the second. Finally, we obtain the following expression for $x_1$, the sum of the first row of $A^{-1}$: $$x_1 = \sum_i \frac{1}{z_i},$$ which is positive since the $z_i$ are positive.
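Both determinant formulas and the final result are easy to verify numerically; a sketch in Python with NumPy, using sample $z_i$ values of my own choosing:

```python
import math
from itertools import combinations

import numpy as np

z = [0.3, 0.7, 1.2, 1.5]   # sample values: 0 < z_1 < 1, strictly increasing
m = len(z)

# Original matrix a_{i,j} = (-1)^{j+1} z_i^j / j!
A = np.array([[(-1) ** (j + 1) / math.factorial(j) * zi ** j
               for j in range(1, m + 1)] for zi in z])

# tilde A (entries z_i^j) and tilde A_1 (first column replaced by ones)
At = np.array([[zi ** j for j in range(1, m + 1)] for zi in z])
At1 = At.copy()
At1[:, 0] = 1.0

vdm = math.prod(zj - zi for zi, zj in combinations(z, 2))  # Vandermonde product
pred_det = math.prod(z) * vdm         # claimed det(tilde A)
pred_x1 = sum(1.0 / zi for zi in z)   # claimed sum of the first row of A^{-1}

print(np.linalg.det(At), pred_det)
print(np.linalg.det(At1), pred_x1 * pred_det)
print(np.linalg.inv(A)[0].sum(), pred_x1)
```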