Show the eigenvalues of the matrix $A^{-1} X B^{-1} X^\intercal$ are between zero and one.

eigenvalues-eigenvectors, linear-algebra, matrices, statistics, symmetric-matrices

Let $A$ and $B$ be two positive definite matrices of orders $n$ and $d,$ respectively. Let $X$ be an $n \times d$ matrix ($n$ rows and $d$ columns).

Is it true that $A^{-1} X B^{-1} X^\intercal$ has eigenvalues between zero and one?

I am unsure if the result holds with this generality, so I am giving more context below.

This problem arises in the study of contingency tables in statistics, where $X$ is a design matrix of zeros and ones: each row represents an observation and each column represents the category or cell into which that observation falls. Thus $x_{ij} = 1$ means that the $i$th object appeared in the $j$th cell, and $x_{ij} = 0$ means that it did not. It is assumed that at least one entry of every row and every column is one (different objects falling in different categories is fine). The matrices $A$ and $B$ are the diagonal matrices of the row and column sums of $X.$ That is, define
$$
x_{i \cdot} =\sum_{j = 1}^d x_{ij}, \quad x_{\cdot j} = \sum_{i = 1}^n x_{ij}
$$

and set $A = \mathrm{diag}(x_{i \cdot}),$ $B = \mathrm{diag}(x_{\cdot j}).$
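For what it's worth, here is a minimal numerical sanity check of the claim in this setting, written as a numpy sketch (the dimensions and the random seed are arbitrary choices of mine, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 0/1 design matrix; resample until every row and every
# column contains at least one 1, as assumed above.
n, d = 6, 4
while True:
    X = rng.integers(0, 2, size=(n, d))
    if X.sum(axis=1).all() and X.sum(axis=0).all():
        break

A = np.diag(X.sum(axis=1))  # diag(x_{i.}), the row sums
B = np.diag(X.sum(axis=0))  # diag(x_{.j}), the column sums

M = np.linalg.inv(A) @ X @ np.linalg.inv(B) @ X.T
print(np.sort(np.linalg.eigvals(M).real))  # empirically all in [0, 1]
```

Every run I tried keeps the spectrum inside $[0,1],$ which is at least consistent with the claim.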

I have made more than a few attempts without success, but none of them has really used the structure of the problem. For example, if $\lambda$ is an eigenvalue of the matrix $A^{-1} X B^{-1} X^\intercal$ then any corresponding eigenvector $v$ satisfies $X B^{-1} X^\intercal v = \lambda A v.$ Multiplying on the left by $v^\intercal$ gives $v^\intercal X B^{-1} X^\intercal v = \lambda\, v^\intercal A v,$ and both sides are now weighted norms of $v$: the left-hand side is nonnegative because $B^{-1}$ is positive definite, and $v^\intercal A v > 0,$ so $\lambda \geq 0.$ But the algebra becomes messy and I could not see a path forward to show $\lambda \leq 1.$ On top of this, I am not really using the structure of $X,$ $A$ and $B,$ as I already mentioned.

A perhaps useful fact (which I have not been able to exploit) is that the all-ones vector $\mathbf{1}$ is an eigenvector with eigenvalue $1$: indeed $B^{-1} X^\intercal \mathbf{1}$ is the all-ones vector of length $d,$ and applying $A^{-1} X$ to it gives back $\mathbf{1}.$ Moreover, since $A^{-1} X B^{-1} X^\intercal$ is self-adjoint with respect to the inner product $\langle u, w \rangle_A = u^\intercal A w,$ any eigenvector for a different eigenvalue is $A$-orthogonal to this one!
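Continuing the numpy sketch above (this reuses `M` and `n` from it), the eigenvalue-$1$ claim is easy to confirm:

```python
ones = np.ones(n)
print(np.allclose(M @ ones, ones))  # True: 1 is an eigenvector, eigenvalue 1
```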

If you have ideas, hints, a reference, or know how to do it, I'd appreciate the help.

Best regards, W.

Best Answer

In the particular case you have, it can be shown as follows. For any square matrix $\Lambda$ with eigenvalues $(\lambda_1,\dotsc,\lambda_n),$ one can bound the spectral radius $$ \rho(\Lambda):=\max\{|\lambda_1|,\dotsc,|\lambda_n|\} $$ by any operator norm of $\Lambda$, that is, $\rho(\Lambda)\le\|\Lambda\|$. Operator norms are also submultiplicative, even for rectangular matrices of compatible sizes: $$\|\Lambda\Lambda'\|\le\|\Lambda\|\|\Lambda'\|.$$
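Both facts are standard; here is a quick numerical illustration with numpy (the matrices and their sizes are arbitrary choices for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(1)
L, Lp = rng.normal(size=(5, 5)), rng.normal(size=(5, 5))

rho = np.abs(np.linalg.eigvals(L)).max()      # spectral radius of L
norm = lambda m: np.abs(m).sum(axis=1).max()  # infinity operator norm

print(rho <= norm(L))                      # rho(L) <= ||L||_inf
print(norm(L @ Lp) <= norm(L) * norm(Lp))  # submultiplicativity
```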

I'll choose the norm $$ \|\Lambda\|_\infty:=\max_i\sum_j|\Lambda_{ij}|, $$ and let $\Lambda:=A^{-1}X$, $\Lambda':=B^{-1}X^\top$.

Note that we simply have $$\Lambda_{ij}=\frac{X_{ij}}{\sum_{\ell=1}^d X_{i\ell}},\quad\Lambda'_{ij}=\frac{X_{ji}}{\sum_{\ell=1}^n X_{\ell i}}.$$ Hence, since all the entries are nonnegative and every row and column sum of $X$ is positive, we have $$\|\Lambda\|_\infty=\max_{i=1,\dotsc,n}\sum_{j=1}^d\frac{X_{ij}}{\sum_{\ell=1}^d X_{i\ell}}=1,$$ and in the same way $$\|\Lambda'\|_\infty=\max_{i=1,\dotsc,d}\sum_{j=1}^n\frac{X_{ji}}{\sum_{\ell=1}^n X_{\ell i}}=1.$$
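In code (reusing `X`, `A`, `B` and the numpy import from the sketch in the question), the two norm computations come out as expected:

```python
Lam = np.linalg.inv(A) @ X     # Lambda  = A^{-1} X,   shape (n, d)
LamP = np.linalg.inv(B) @ X.T  # Lambda' = B^{-1} X^T, shape (d, n)

print(np.abs(Lam).sum(axis=1).max())   # ||Lambda||_inf  -> 1.0
print(np.abs(LamP).sum(axis=1).max())  # ||Lambda'||_inf -> 1.0
```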

Therefore $\rho(\Lambda\Lambda')\le\|\Lambda\|_\infty\|\Lambda'\|_\infty=1,$ so all eigenvalues of $A^{-1}XB^{-1}X^\intercal$ lie in the interval $[-1,1]$. Since you already showed that they must be non-negative, they actually all lie in $[0,1]$.

The general case is false, as pointed out by saulspatz in one of the comments above.