I'm looking for a class of matrices such that if it contains a matrix with only positive entries, then the inverse of that matrix also has only positive entries. I imagine one example of such a class would be the class of orthogonal matrices, where the inverse is the transpose, but I'm looking for a more general class if possible.
Matrix with only positive entries whose inverse has only positive entries
matrices
Related Solutions
First of all, the original poster’s example with upper triangular matrices is not exactly correct. We certainly know that the $(N^2 - N)/2$ lower-left entries are zero, and (as for any invertible triangular matrix) that all $N$ diagonal entries are nonzero, but we are generally ignorant about the $(N^2 - N)/2$ upper-right entries. We have no algorithm that decides whether $(A^{-1})_{jk} = 0,\ j < k$, significantly more easily than an algorithm that simply computes $(A^{-1})_{jk}$.
As for square matrices in general, three comments under the question have already noted that the answer might be obtainable in special cases. Is it possible in the general case? Henceforth I exclude question “2.” and will consider only the problem of whether $(A^{-1})_{jk} = 0$ for concrete $j$ and $k$.
Let $F$ be the ground field, such as the rational, real, or complex numbers. Then $A: F^N \to F^N$, and the same for $A^{-1}$ if it is well-defined. Let $({\mathbf e}_1, {\mathbf e}_2,\ldots, {\mathbf e}_N)$ be the standard basis in $F^N$ and let $D^N_j \subset F^N$ be the linear hyperplane spanned by all elements of the standard basis except ${\mathbf e}_j$. In other words, $D^N_j$ consists of all $N$-vectors whose $j$th coordinate is zero. Obviously, $$ (A^{-1})_{jk} = 0\quad \Leftrightarrow \quad A^{-1}\,{\mathbf e}_k \in D^N_j\quad \Leftrightarrow \quad {\mathbf e}_k \in A\,D^N_j$$ (from this point on I assume that $A$ is invertible).

What is the image space $A\,D^N_j$? It is the linear span of the vectors $\{\ A\,{\mathbf e}_l\ |\ l=1\ldots N,\ l\ne j\ \}$. How can we check whether ${\mathbf e}_k$ belongs to $A\,D^N_j$? For a general matrix (invertible, but otherwise of no special form), only by finding an appropriate linear combination of said vectors; there is no simpler way, since we have the linear span of arbitrary $N-1$ linearly independent vectors (recall that $A$ is invertible). This combination must be unique, and its coefficients are exactly the entries $\{\ (A^{-1})_{lk}\ |\ l=1\ldots N,\ l\ne j\ \}$, which follows from the rule of matrix-by-vector multiplication: $$ A^{-1}\,{\mathbf e}_k = \sum\limits_{l=1}^N (A^{-1})_{lk}{\mathbf e}_l.$$

Although not a formal proof, this reasoning shows that we cannot learn anything about an entry of the inverse matrix in a way much simpler than computing an entire column of $A^{-1}$. Of course, we can consider right multiplication of row vectors instead of left multiplication of column vectors, but this doesn’t change the conclusion: deciding whether an isolated entry of $A^{-1}$ is zero or not costs about $1/N$ of the total work of computing $A^{-1}$.
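The column computation above can be sketched in Python with exact rational arithmetic; the upper-triangular $3\times 3$ matrix below is my own illustration, not one from the question. Checking whether $(A^{-1})_{jk} = 0$ amounts to solving $A\,x = {\mathbf e}_k$ (i.e. computing the $k$th column of $A^{-1}$) and inspecting $x_j$:

```python
from fractions import Fraction

def solve_column(A, k):
    """Solve A x = e_k exactly by Gauss-Jordan elimination.

    The solution x is the k-th column of A^{-1} (0-based indices)."""
    n = len(A)
    # Augmented matrix [A | e_k] over the rationals.
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(1 if i == k else 0)]
         for i in range(n)]
    for col in range(n):
        # Find a nonzero pivot (exists because A is invertible) and swap it up.
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]        # normalize pivot row
        for r in range(n):
            if r != col and M[r][col] != 0:               # eliminate the column
                M[r] = [a - M[r][col] * b for a, b in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

# An upper-triangular example: even here, deciding whether a single
# upper-right entry of A^{-1} vanishes costs a full column solve.
A = [[1, 2, 3],
     [0, 1, 4],
     [0, 0, 1]]
col2 = solve_column(A, 2)        # third column of A^{-1}
print(col2[0] == 0)              # is (A^{-1})_{02} zero?
```

Here `col2` comes out as $[5, -4, 1]$, so $(A^{-1})_{02} \ne 0$, but the only way the sketch discovers this is by producing the whole column, matching the argument above.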
What you are suggesting is that the (necessary and sufficient) conditions
- $A$ is a Z-matrix (matrix with non-positive off-diagonal entries),
- all the principal minors of $A$ are positive,
for an $n\times n$ matrix $A$ to be an M-matrix are equivalent to (or implied by) the conditions
- $A$ is a Z-matrix (strict negativity of the off-diagonal entries is unnecessary due to the continuity of the determinant),
- all $1\times 1$ principal minors and $\det(A)$ are positive.
Note that the positivity of $1\times 1$ minors means that $A$ has positive diagonal.
This is true trivially if $n=1$ or $n=2$. It is also true if $n=3$ (for a Z-matrix with positive diagonal, a negative $2\times 2$ principal minor would force $\det(A)<0$, so positivity of the determinant gives positivity of the $2\times 2$ minors). However, this fails to be the case when $n\geq 4$. For example, $$ A=\left(\begin{array}{rrrr} 1 & -2 & -1 & -3\\ -1 & 2 & -5 & -2\\ -1 & -5 & 1 & -1\\ -5 & -1 & -1 & 3 \end{array}\right). $$ Note that the determinants of both the leading and trailing $3\times 3$ principal submatrices are negative, but $\det(A)=30>0$.
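The counterexample can be checked directly; here is a small Python sketch using cofactor expansion, which is perfectly adequate for a $4\times 4$ matrix:

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

A = [[ 1, -2, -1, -3],
     [-1,  2, -5, -2],
     [-1, -5,  1, -1],
     [-5, -1, -1,  3]]

leading  = [row[:3] for row in A[:3]]   # principal submatrix on rows/cols 1..3
trailing = [row[1:] for row in A[1:]]   # principal submatrix on rows/cols 2..4

print(det(leading), det(trailing), det(A))  # -42 -88 30
```

So $A$ satisfies the weakened conditions (Z-matrix, positive diagonal, $\det(A)>0$) yet has negative $3\times 3$ principal minors, hence it is not an M-matrix.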
Best Answer
If I understood your question correctly, you can find what you're looking for in "When a Matrix and Its Inverse Are Nonnegative" by J. Ding and N. H. Rhee, where Theorem 5.1 states:
"A matrix and its inverse are nonnegative matrices if and only if it is the product of a diagonal matrix with all positive diagonal entries and a permutation matrix."