Eigenvector-Eigenvalue Formula – Neutrino Studies and Consequences

linear-algebra, matrix-analysis, matrix-theory

This article describes a striking relationship between the eigenvectors and eigenvalues of Hermitian matrices, discovered by three physicists (Stephen Parke of Fermi National Accelerator Laboratory, Xining Zhang of the University of Chicago, and Peter Denton of Brookhaven National Laboratory) whilst studying neutrinos.

Their result has been written up in collaboration with Terence Tao here. From what I understand, although very similar results had been observed before, the link to eigenvector computation had not been made explicit until now. For completeness, here is the main result, "Lemma 2" from their paper:

Let $A$ be an $n \times n$ Hermitian matrix with eigenvalues $\lambda_i(A)$ and normed eigenvectors $v_i$.
The elements of each eigenvector are denoted $v_{i,j}$. Let $M_j$ be the
$(n-1) \times (n-1)$ submatrix of $A$ that results from deleting the $j^{\text{th}}$ column and the $j^{\text{th}}$ row, with eigenvalues $\lambda_k(M_j)$.

Lemma 2. The norms squared of the elements of the eigenvectors are related to the
eigenvalues of $A$ and the eigenvalues of the submatrices,

$$|v_{i,j}|^2\prod_{k=1;k\neq i}^n(\lambda_i(A)-\lambda_k(A))=\prod_{k=1}^{n-1}(\lambda_i(A)-\lambda_k(M_j))$$
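As a quick sanity check (not part of the paper, and with an arbitrary random matrix and seed), the identity can be verified numerically with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Hermitian matrix (distinct eigenvalues with probability 1).
n = 5
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (X + X.conj().T) / 2

# Eigenvalues lam[i] and normed eigenvectors V[:, i] of A,
# so v_{i,j} corresponds to V[j, i].
lam, V = np.linalg.eigh(A)

for i in range(n):
    for j in range(n):
        # M_j: delete the j-th row and the j-th column of A.
        Mj = np.delete(np.delete(A, j, axis=0), j, axis=1)
        mu = np.linalg.eigvalsh(Mj)

        lhs = abs(V[j, i]) ** 2 * np.prod([lam[i] - lam[k] for k in range(n) if k != i])
        rhs = np.prod(lam[i] - mu)
        assert np.isclose(lhs, rhs)
```

Note that the identity determines only $|v_{i,j}|^2$, not the phases of the eigenvector components.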

I was wondering what are the mathematical consequences of this beautiful result?

For example are there any infinite dimensional generalisations? Does it affect matrix algorithms or proofs therein? What about singular values?

Best Answer

The OP asks about generalisations and applications of the formula in arXiv:1908.03795.

$\bullet$ Concerning generalisations: I have found an older paper, from 1993, in which essentially the same result as in the 2019 paper is derived for normal matrices (with possibly complex eigenvalues), rather than just for Hermitian matrices: On the eigenvalues of principal submatrices of normal, hermitian and symmetric matrices, by Peter Nylen, Tin-Yau Tam & Frank Uhlig (1993), Theorem 2.2 (with the identification $b_{ij}=|u_{ij}|^2$ made at the very end of the proof).

A further generalisation to a signed inner product has been given in On the eigenvalues of principal submatrices of J-normal matrices (2011). In that case $b_{ij}=\epsilon_i\epsilon_j|u_{ij}|^2$, with $\epsilon_i=\pm 1$ the signature of the inner product: $(x,y)=\sum_i \epsilon_i x_i^\ast y_i$.

$\bullet$ Concerning applications: in the 1993 paper the theorem is used to solve the following problem: given sets of distinct (complex) numbers $\lambda_i(A)$ and $\lambda_k(M_j)$, when does a normal $n\times n$ matrix $A$ exist with these as the eigenvalues of $A$ and of its principal submatrices $M_j$? The answer is that the matrix $B$ with elements $$b_{ij}=\frac{\prod_{k=1}^{n-1}(\lambda_i(A)-\lambda_k(M_j))}{\prod_{k=1;k\neq i}^n(\lambda_i(A)-\lambda_k(A))}$$ should be unistochastic, meaning that $b_{ij}=|u_{ij}|^2$, where the matrix $U$ with elements $u_{ij}$ is the eigenvector matrix of $A$.
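To illustrate, here is a small NumPy sketch (using a Hermitian matrix for simplicity, since it is in particular normal; the seed and size are arbitrary choices) that builds $B$ purely from the spectra of $A$ and its principal submatrices, then checks that it is unistochastic:

```python
import numpy as np

rng = np.random.default_rng(1)

# A Hermitian (hence normal) matrix with distinct eigenvalues.
n = 4
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (X + X.conj().T) / 2

lam, U = np.linalg.eigh(A)  # eigenvalues lam[i], eigenvector matrix U

# Build B from eigenvalues alone: b_{ij} is a ratio of products
# over the spectra of M_j and of A.
B = np.empty((n, n))
for j in range(n):
    mu = np.linalg.eigvalsh(np.delete(np.delete(A, j, 0), j, 1))
    for i in range(n):
        num = np.prod(lam[i] - mu)
        den = np.prod([lam[i] - lam[k] for k in range(n) if k != i])
        B[i, j] = (num / den).real

# Unistochastic: B coincides with the squared moduli of the eigenvector
# components (U[:, i] is the i-th eigenvector, so b_{ij} = |U[j, i]|^2),
# and is in particular doubly stochastic.
assert np.allclose(B, (np.abs(U) ** 2).T)
assert np.allclose(B.sum(axis=0), 1) and np.allclose(B.sum(axis=1), 1)
```

Double stochasticity of $B$ is a necessary condition that can be read off immediately; unistochasticity is the stronger requirement characterising existence.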


Since the 1993 paper is behind a paywall, I reproduce the relevant page:

A brief Mathematica file to test the formula is here.


Addendum: following up on the trail pointed out by Alan Edelman on Tao's blog: this 1966 paper by R.C. Thompson, "Principal submatrices of normal and Hermitian matrices", contains the desired formula for general normal matrices as its Equation (15).

In Thompson's notation, $\theta_{ij}=|u_{ij}|^2$ when all eigenvalues $\mu_\alpha$ of $A$ are distinct (the $\xi_{ij}$'s are the eigenvalues of $M_i$). The older papers mentioned in the comments below do not seem to have an explicit formula for $|u_{ij}|^2$.

Since this appears to be the earliest appearance of the eigenvector-eigenvalue identity, it might be appropriate to refer to it as "Thompson's identity",$^*$ as a tribute to Professor Robert Thompson (1931-1995). It would fit in nicely with this quote from his obituary:

Among Thompson's many services to research was his help in dispelling the misinformed view that linear algebra is simple and uninteresting. He often worked on difficult problems, and as much as anyone, he showed that core matrix theory is laden with deeply challenging and intellectually compelling problems that are fundamentally connected to many parts of mathematics.

$^*$ "Thompson's identity", to distinguish it from Thompson's formula.
