[Math] known about the distribution of eigenvectors of random matrices

Tags: linear-algebra, matrices, pr.probability, random-matrices, reference-request

Let $A$ be a real asymmetric $n \times n$ matrix with i.i.d. random, zero-mean elements. What results, if any, are there for the eigenvectors of $A$? In particular:

  • How are individual eigenvectors distributed (presumably zero-mean multivariate normal, but with what covariance)?
  • If $u_i$ and $u_j$ are eigenvectors of $A$, what is the distribution of $|u_i^* u_j|$, or, even better, the $n^2$-dimensional joint distribution $P(u_1, \dots, u_n)$?
  • What is the joint distribution of eigenvalues and their corresponding eigenvectors (or, perhaps more in line with my application described below, the conditional distribution of an eigenvector given an eigenvalue)?
  • Numerically, I've found that every eigenvector corresponding to a complex eigenvalue has exactly one real component. (Naturally, real eigenvalues have real eigenvectors.) Has this been proven? And what is the distribution of the number of real eigenvalues of $A$?
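On that last bullet: the "exactly one real component" observation is likely an artifact of the solver's phase convention rather than a theorem, since an eigenvector is only defined up to a complex scalar; LAPACK's `dgeev` (which NumPy calls for real input) documents that it returns eigenvectors with unit Euclidean norm and largest component real. A minimal NumPy sketch illustrating both points (the tolerance `1e-9` and the phase `0.7` are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))          # i.i.d. zero-mean entries (real Ginibre)
lam, V = np.linalg.eig(A)

# Real eigenvalues of a real matrix admit real eigenvectors, and LAPACK
# returns them as such.
real_mask = np.abs(lam.imag) < 1e-9
n_real = int(real_mask.sum())            # the count of real eigenvalues is random
assert np.allclose(V[:, real_mask].imag, 0.0)

# An eigenvector is only defined up to a complex phase, so "exactly one real
# component" is a normalization convention: rephasing generically destroys the
# real component yet leaves a perfectly valid eigenvector.
j = int(np.argmax(~real_mask))           # index of some complex eigenvalue
u = V[:, j] * np.exp(0.7j)               # still satisfies A u = lam[j] u
assert np.allclose(A @ u, lam[j] * u)
print("real eigenvalues:", n_real)
```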

Note: I'm not a mathematician but a physicist working in dynamical systems, and I skipped nuclear physics, so my knowledge of GUE/GOE results is limited to basically the circular laws. What I'm really interested in is constructing random real matrices $A = VDV^{-1}$, where $D$ is a diagonal matrix of eigenvalues drawn from a distribution that I control (one that differs from the various circular laws), and $V$ is a matrix of eigenvectors drawn from the conditional distribution of eigenvectors of random matrices given their corresponding eigenvalues. So this question can be summarized: how do I draw $V$? I don't imagine there are complete answers to my questions yet, but any insights along those lines that help me draw $V$ "realistically" would be appreciated. Heck, I just realized bullets two and three may have somewhat incompatible assumptions: bullet three (or rather my proposed application) assumes the conditional independence $P(u_1, \dots, u_n \mid \lambda_1, \dots, \lambda_n) = \prod_i P(u_i \mid \lambda_i)$, where $i$ ranges over a single member of each complex-conjugate pair of eigenvalues, whereas bullet two makes no such assumption and just asks for $\int P(u_1, \dots, u_n, \lambda_1, \dots, \lambda_n)\, d\vec{\lambda}$, where $\vec{\lambda}$ is circular-law distributed.
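To make the target construction concrete, here is a sketch of the "draw $V$ from a random matrix, then substitute my own $D$" recipe. It takes $V$ from the eigendecomposition of a fresh real Gaussian matrix (a heuristic choice; whether this $V$ has the right conditional law is exactly the open question) and keeps complex-conjugate eigenvalue pairs aligned with conjugate eigenvector columns so the reconstructed $A$ stays real. The uniform draws for the new eigenvalues are a placeholder for whatever distribution one controls:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Heuristic: take V from the eigendecomposition of a random real matrix.
B = rng.standard_normal((n, n))
lam, V = np.linalg.eig(B)

# Replace the eigenvalues with controlled draws, keeping complex-conjugate
# pairs aligned with conjugate eigenvector columns so A = V D V^{-1} is real.
new_lam = np.empty(n, dtype=complex)
used = np.zeros(n, dtype=bool)
for i in range(n):
    if used[i]:
        continue
    if abs(lam[i].imag) < 1e-9:              # real eigenvalue: draw a real value
        new_lam[i] = rng.uniform(-1.0, 1.0)
        used[i] = True
    else:                                    # complex pair: draw one, conjugate the partner
        j = int(np.argmin(np.abs(lam - lam[i].conjugate())))
        z = complex(rng.uniform(-1.0, 1.0), rng.uniform(0.0, 1.0))
        new_lam[i], new_lam[j] = z, z.conjugate()
        used[i] = used[j] = True

A_new = V @ np.diag(new_lam) @ np.linalg.inv(V)
assert np.abs(A_new.imag).max() < 1e-8       # conjugate pairing keeps A real
A_new = A_new.real
```

By construction `A_new` is a real matrix with exactly the prescribed spectrum; the unresolved part is only whether its eigenvector statistics are "realistic".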

If that seems like a weird application, my motivation is to study the influence of only the eigenvalues of the adjacency matrix on a dynamical process that takes place on a random network. A simple first attempt at this (drawing $A$, eigendecomposing it to get $V$, and mucking with $D$ only) gives either interesting or disastrous results, depending on how you look at it. A still-simple second attempt (which to my mind should work if the conditional-independence assumption holds), namely drawing several random $A$'s and choosing eigenvalues and corresponding eigenvectors from them according to my desired distribution, is even more disastrous (but no more interesting, I think).

Best Answer

If you choose the matrix elements of $A$ independently from a Gaussian distribution, you have the so-called Ginibre ensemble of random-matrix theory. The statistics of the eigenvalues are known; see, for example, Eigenvalue statistics of the real Ginibre ensemble. The statistics of the eigenvectors, and the eigenvector-eigenvalue correlations, have been studied much less; I know of just a few papers:

  1. Eigenvector statistics in non-Hermitian random matrix ensembles

  2. Statistical properties of eigenvectors in non-Hermitian Gaussian random matrix ensembles

  3. Correlations of eigenvectors for non-Hermitian random-matrix models

While in the Hermitian ensembles (GOE, GUE) the eigenvectors corresponding to different eigenvalues are independent, in the non-Hermitian ensembles the eigenvectors are highly correlated when the two eigenvalues lie close together in the complex plane.
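That last point is easy to see numerically. A quick sketch of my own (not from the papers above): sample a real Ginibre matrix, scaled so the spectrum fills roughly the unit disk, and compare the overlaps $|u_i^* u_j|$ of unit-norm right eigenvectors for nearby versus well-separated eigenvalue pairs; the separation thresholds 0.05 and 1.0 are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
A = rng.standard_normal((n, n)) / np.sqrt(n)   # real Ginibre, spectrum ~ unit disk
lam, V = np.linalg.eig(A)                      # columns come back unit-norm

G = np.abs(V.conj().T @ V)                     # overlaps |u_i^* u_j|
D = np.abs(lam[:, None] - lam[None, :])        # eigenvalue separations
off = ~np.eye(n, dtype=bool)                   # ignore the diagonal (self-overlap 1)
close_mean = G[off & (D < 0.05)].mean()        # nearby eigenvalue pairs
far_mean = G[off & (D > 1.0)].mean()           # well-separated pairs
print(close_mean, far_mean)                    # nearby pairs overlap much more
```

Near an eigenvalue near-collision the matrix is close to defective, so the two eigenvectors nearly coalesce and their overlap approaches 1, while distant pairs show only the generic non-orthogonality of a non-normal matrix.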
