Matrices with a certain property

functional-analysis, linear-algebra, matrices

Suppose $A$ is an $n\times n$ matrix with the following properties:

  • $A$ is invertible

  • all the entries in $A$ are either $-1$, $0$, or $1$

  • if a column has more than one non-zero entry, then the non-zero entries are strictly below the diagonal

Suppose $A^{-1}$ has the same properties. Does it follow that every row and column in $A$ must contain exactly one entry which is $1$ or $-1$?
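For example, the matrix $\pmatrix{0&-1\\1&0}$ and its inverse $\pmatrix{0&1\\-1&0}$ both satisfy all three conditions, and each has exactly one entry equal to $1$ or $-1$ in every row and column; the question is whether this pattern is forced.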

Best Answer

We shall adopt the following notation. Let $B=A^{-1}$. Denote the standard basis by $\{e_1,\ldots,e_n\}$ and the identity matrix by $\mathbf I$. When not in boldface, the letter $I$ denotes an index set. For any matrix $M$, denote its $i$-th row by $M_{i\ast}$ and its $j$-th column by $M_{\ast j}$. When $I$ and $J$ are index sets, $M_{IJ}$, $M_{iJ}$ and $M_{Ij}$ denote the corresponding submatrix, sub-row vector and sub-column vector of $M$. We call a nonzero entry $m_{ij}$ a right entry of $M$ if $j\ge i$, and a left entry if $i>j$.
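To illustrate the notation: in $$ M=\pmatrix{0&1&0\\1&0&0\\1&0&1}, $$ the nonzero entries $m_{12}$ and $m_{33}$ are right entries while $m_{21}$ and $m_{31}$ are left entries; with $I=\{2,3\}$ and $J=\{1,3\}$ we have $M_{2\ast}=(1,0,0)$, $M_{\ast1}=(0,1,1)^T$ and $M_{IJ}=\pmatrix{1&0\\1&1}$.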

By the given conditions, $A$ satisfies the following properties:

Property 1. $A$ cannot contain both right and left entries in the same column. If $a_{ij}$ is a right entry, then $A_{\ast j}$ is a scalar multiple of $e_i$.

Proof. If $a_{ij}$ is a right entry, it does not lie strictly below the diagonal, so by the third given condition column $j$ cannot contain more than one nonzero entry; hence $A_{\ast j}=a_{ij}e_i$. In particular, no column can contain both a right and a left entry. $\square$

Property 2. Each row of $A$ contains at most one right entry.

Proof. Suppose $a_{ij}$ and $a_{ik}$ are two right entries of $A$ in the same row $i$. By property 1, both $A_{\ast j}$ and $A_{\ast k}$ are scalar multiples of $e_i$. If $j\ne k$, these two columns would be linearly dependent, contradicting the invertibility of $A$; hence $j=k$. $\square$
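For instance, if a $3\times3$ matrix $A$ satisfying the given conditions had nonzero entries $a_{12}$ and $a_{13}$ (two right entries in row $1$), property 1 would give $A_{\ast2}=a_{12}e_1$ and $A_{\ast3}=a_{13}e_1$, so columns $2$ and $3$ would be proportional and $A$ could not be invertible.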

Property 3. Every column of $A$ contains at most one left entry.

Proof. Suppose, on the contrary, that $A$ has more than one left entry in some column $j$. Let $K=\{k: a_{kj}\ne0\}$. Then $|K|>1$ and, by property 1, $a_{kj}$ is a left entry of $A$ and $b_{jk}$ is either zero or a right entry of $B$ for each $k\in K$. Since $BA=\mathbf I$, we have $$ B_{\ast K}A_{Kj} = BA_{\ast j} = (BA)_{\ast j} = e_j.\tag{1} $$ In particular, $B_{jK}A_{Kj}=1$. Since $A_{Kj}$ is entrywise nonzero, we must have $b_{jk_0}\ne0$ for some $k_0\in K$, so $b_{jk_0}$ is a right entry of $B$. Therefore $b_{jk_0}$ is the only nonzero entry of $B_{\ast k_0}$ by property 1 (applied to $B$), and also the only nonzero entry of $B_{jK}$ by property 2. In other words, $B_{\ast K}$ has the form $$ B_{\ast K}=\pmatrix{\ast&0&\ast\\ 0&b_{jk_0}&0\\ \ast&0&\ast},\tag{2} $$ where the middle row is row $j$ and the middle column is column $k_0$. Let $I=\{1,2,\ldots,n\}\setminus\{j\}$ and $J=K\setminus\{k_0\}$; since $|K|>1$, $J$ is non-empty. Restricting $(1)$ to the rows in $I$ and using $(2)$ (the entries of $B_{Ik_0}$ vanish), we obtain $B_{IJ}A_{Jj}=0$. As $A_{Jj}$ is entrywise nonzero, $B_{IJ}$, and in turn $B_{\ast J}$, has linearly dependent columns, and we arrive at a contradiction because $B$ is supposed to be invertible. $\square$
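To see the mechanism concretely, consider the pair $$ A=\pmatrix{0&1&0\\1&0&0\\1&0&1},\qquad B=A^{-1}=\pmatrix{0&1&0\\1&0&0\\0&-1&1}. $$ Column $j=1$ of $A$ has two left entries, indexed by $K=\{2,3\}$, and $(1)$ reads $B_{\ast K}A_{K1}=e_1$ with $b_{jk_0}=b_{12}=1$. If $B$ satisfied the given conditions, the shape $(2)$ would force $B_{\ast2}$ to be a multiple of $e_1$; here it is not, and indeed $B$ violates the third condition in its second column, exactly as property 3 predicts.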

Now properties 1 and 3 imply that $A$ has at most one nonzero entry in each column. It follows from the invertibility of $A$ that $A$ has exactly one nonzero entry in each row and each column, i.e. $A=DP$ for some permutation matrix $P$ and some invertible diagonal matrix $D$. Since the entries of $A$ lie in $\{-1,0,1\}$, each of these nonzero entries is $1$ or $-1$, so the answer to the question is yes.
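As an independent sanity check of this conclusion, here is a small brute-force sketch (my own addition, assuming Python 3 with NumPy): for $n=3$ it enumerates every matrix with entries in $\{-1,0,1\}$ that is invertible and satisfies the column condition, and verifies that whenever the inverse also satisfies it, both matrices have exactly one entry $\pm1$ in each row and column.

```python
# Brute-force check of the claim for n = 3, assuming Python 3 with NumPy.
import itertools
import numpy as np

def satisfies_conditions(M, tol=1e-9):
    """Entries lie in {-1, 0, 1}, and any column with more than one nonzero
    entry has all of its nonzero entries strictly below the diagonal."""
    R = np.round(M)
    if np.max(np.abs(M - R)) > tol or np.max(np.abs(R)) > 1:
        return False
    for j in range(R.shape[0]):
        rows = np.flatnonzero(R[:, j])
        if len(rows) > 1 and not all(i > j for i in rows):
            return False
    return True

def is_signed_permutation(M):
    """Exactly one entry equal to +1 or -1 in every row and every column."""
    R = np.abs(np.round(M))
    return (R.sum(axis=0) == 1).all() and (R.sum(axis=1) == 1).all()

n = 3
for entries in itertools.product((-1, 0, 1), repeat=n * n):
    A = np.array(entries, dtype=float).reshape(n, n)
    # Skip singular matrices and matrices violating the column condition.
    if abs(np.linalg.det(A)) < 0.5 or not satisfies_conditions(A):
        continue
    B = np.linalg.inv(A)
    if satisfies_conditions(B):
        # The claim: A (and hence B) must be a signed permutation matrix.
        assert is_signed_permutation(A) and is_signed_permutation(B)
print("claim verified for n =", n)
```

The loop ranges over $3^9=19683$ candidate matrices, so it finishes almost instantly.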
