[Math] Spectral norm of random matrix

matrices, probability theory, random matrices, spectral-norm, spectral-radius

Suppose $A$ is an $n \times n$ random matrix with centered, i.i.d. real Gaussian entries of variance $\frac{\sigma^2}{n}$.

What do we know about the spectral norm $s(A)$ of $A$, that is, $\sqrt{\rho(A^t A)}$? Here $\rho(\cdot)$ denotes the spectral radius of a matrix, i.e. its largest eigenvalue in absolute value.
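For concreteness, here is a minimal numpy sketch (the values $n = 200$ and $\sigma = 1.5$ are arbitrary illustrative choices, not from the question) checking that $\sqrt{\rho(A^t A)}$ agrees with the operator $2$-norm, i.e. the largest singular value:

```python
import numpy as np

# Sketch: check numerically that sqrt(rho(A^t A)) equals the operator 2-norm.
# n and sigma below are arbitrary illustrative choices, not from the question.
rng = np.random.default_rng(0)
n, sigma = 200, 1.5
A = rng.normal(scale=sigma / np.sqrt(n), size=(n, n))  # i.i.d. N(0, sigma^2/n) entries

gram_eigs = np.linalg.eigvalsh(A.T @ A)  # eigenvalues of A^t A (all nonnegative)
print(np.sqrt(gram_eigs.max()))          # sqrt of the largest eigenvalue of A^t A
print(np.linalg.norm(A, 2))              # largest singular value; agrees up to rounding
```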

In particular, if $A$ is symmetric, we know that $s(A)$ is precisely the spectral radius $\rho(A)$, and the circular law would then imply that $s(A)$ converges to $\sigma$ as $n \to \infty$.

Is this last statement true for non-symmetric $A$?

Best Answer

Actually, neither statement is true. The circular law does not control the spectral radius: it only predicts that the majority of eigenvalues lie in the disc, while the spectral radius is concerned with the most extreme eigenvalues. There could still be as many as $o(n)$ eigenvalues lying outside of the disc, and so it is not necessarily true that $\rho(A)$ is a.s. equal to $\sigma$ in the limit.
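To see the distinction concretely, one can sample a matrix and count how many eigenvalues escape the disc. A rough numpy illustration (the size, seed, and $\sigma$ are my arbitrary choices):

```python
import numpy as np

# Sketch: the circular law concerns the bulk of the spectrum, not the extremes.
# Count how many eigenvalues of a sample A have modulus larger than sigma.
rng = np.random.default_rng(1)
n, sigma = 1000, 1.0
A = rng.normal(scale=sigma / np.sqrt(n), size=(n, n))

eigs = np.linalg.eigvals(A)
outside = int(np.sum(np.abs(eigs) > sigma))
print(f"{outside} of {n} eigenvalues have modulus > sigma")  # a vanishing fraction
```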

I believe that almost surely $s(A) \to 2\sigma$ in both the symmetric and non-symmetric cases. For the non-symmetric case in particular, see Theorem 2.1 in the survey article "Non-asymptotic theory of random matrices: extreme singular values" by Rudelson and Vershynin (their ICM 2010 invited talk).
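As a numerical sanity check of the $2\sigma$ limit (again only a sketch: the sizes are arbitrary, and the symmetrized matrix below has diagonal entries of variance $2\sigma^2/n$, which does not affect the limit):

```python
import numpy as np

# Sketch: compare s(A) with 2*sigma as n grows, in both the non-symmetric
# (i.i.d. Gaussian) and the symmetrized case.
rng = np.random.default_rng(2)
sigma = 1.0
for n in (100, 400, 1600):
    G = rng.normal(scale=sigma / np.sqrt(n), size=(n, n))
    S = (G + G.T) / np.sqrt(2)  # symmetric; off-diagonal variance is still sigma^2/n
    print(n, np.linalg.norm(G, 2), np.linalg.norm(S, 2))  # both drift toward 2*sigma
```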
