[Math] Eigenvalues, singular values, and the angles between eigenvectors

linear algebra

Suppose the $n \times n$ matrix $A$ has eigenvalues $\lambda_1, \ldots, \lambda_n$ and singular values $\sigma_1, \ldots, \sigma_n$. It seems plausible that by comparing the singular values and eigenvalues we get some sort of information about the eigenvectors. Consider:

a. The singular values are equal to the absolute values of the eigenvalues if and only if the matrix is normal, i.e., it has an orthonormal basis of eigenvectors (see http://en.wikipedia.org/wiki/Normal_matrix, item 11 of the "Equivalent definitions" section).
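A quick numerical check of (a), using NumPy on a small skew-symmetric (hence normal) matrix chosen purely for illustration:

```python
import numpy as np

# A normal (here: skew-symmetric) matrix has an orthonormal basis of
# eigenvectors, and its singular values equal the absolute values of
# its eigenvalues.
A = np.array([[0.0, -2.0],
              [2.0,  0.0]])

sigma = np.linalg.svd(A, compute_uv=False)  # singular values
lam = np.linalg.eigvals(A)                  # eigenvalues (here +/- 2i)

print(np.sort(sigma))          # [2. 2.]
print(np.sort(np.abs(lam)))    # [2. 2.]
```

For a non-normal matrix (e.g., an upper-triangular matrix with a large off-diagonal entry) the two lists would differ.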

b. Suppose we have two distinct eigenvalues $\lambda_1, \lambda_2$ with eigenvectors $v_1, v_2$. Suppose, hypothetically, we let $v_1$ approach $v_2$ while keeping all the other eigenvalues and eigenvectors the same. Then the largest singular value approaches infinity. This follows since $\sigma_{\rm max} = \|A\|_2$ and $A$ maps the vector $v_1 - v_2$, which approaches $0$, to $\lambda_1 v_1 - \lambda_2 v_2$, which approaches $(\lambda_1 - \lambda_2) v_2 \neq 0$.
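This blow-up is easy to observe numerically. The NumPy sketch below (the particular eigenvalues and vectors are arbitrary choices) builds $A = E \Lambda E^{-1}$ with eigenvector columns $v_1$ and $v_1 + \epsilon w$ and watches $\sigma_{\rm max}$ grow as $\epsilon \to 0$:

```python
import numpy as np

# As the second eigenvector v1 + eps*w approaches v1 (eps -> 0), the
# largest singular value of A = E diag(lam) E^{-1} blows up, even though
# the eigenvalues stay fixed at lam.
lam = np.array([1.0, 2.0])
v1 = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

for eps in [1e-1, 1e-3, 1e-6]:
    E = np.column_stack([v1, v1 + eps * w])
    A = E @ np.diag(lam) @ np.linalg.inv(E)
    sigma_max = np.linalg.norm(A, 2)   # largest singular value
    print(eps, sigma_max)              # grows roughly like 1/eps
```

In this 2x2 example one can check by hand that $A = \begin{pmatrix} 1 & 1/\epsilon \\ 0 & 2 \end{pmatrix}$, so $\sigma_{\rm max} \approx 1/\epsilon$.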

It seems reasonable to guess that the "more equal" $|\lambda_1|, \ldots, |\lambda_n|$ and $\sigma_1, \ldots, \sigma_n$ are, the more the eigenvectors look like an orthogonal collection. So naturally my question is whether there is a formal statement to this effect.

Best Answer

We can gain some insight into the question if we think in terms of change of basis and the operator norm $\|A\| = \sigma_1$. I assume that $A$ is diagonalizable and that we've arranged the eigenvalues so that $|\lambda_1| \geq \ldots \geq |\lambda_n| > 0$. So $A = E\Lambda E^{-1}$, with $\Lambda$ having the eigenvalues in order on the diagonal, and $E$ being the matrix whose columns are the corresponding eigenvectors. We can consider $E^{-1}$ to be the matrix that represents a coordinate transform, bringing the eigenvectors to the standard basis.

Suppose the singular values of $E$ are $s_1, \ldots, s_n$. It follows that the singular values of $E^{-1}$ are $s_n^{-1}, \ldots, s_1^{-1}$. Since $E^{-1}$ is not in general an isometry, the matrix norm of $A$ need not equal $\|\Lambda\| = |\lambda_1|$. In fact we have $$ \sigma_1 = \|A\| \leq \|E\| \|\Lambda\| \|E^{-1}\| = s_1 |\lambda_1| s_n^{-1}. $$ The quantity $\kappa(E) = \kappa(E^{-1}) = s_1 s_n^{-1}$ is the condition number of the change of basis matrix $E^{-1}$, and it coincides with the condition number of $E$ itself. The condition number gives a measure of how close $E$ is to being degenerate. For an orthogonal matrix the condition number is 1, but as you let two columns of $E$ "approach" each other, the condition number of $E$ increases. If you think of the columns of $E$ as forming the sides of a box (a parallelotope), then the condition number is a measure of how squashed the box is (this assertion could be made precise).
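To see the "squashed box" effect concretely, here is a small NumPy sketch (the matrices are arbitrary examples) comparing the condition number of orthonormal columns with that of nearly parallel columns:

```python
import numpy as np

# kappa(E) = s_1 / s_n measures how squashed the parallelotope spanned
# by the columns of E is.  Orthonormal columns give kappa = 1; nearly
# parallel columns give a huge kappa.
Q = np.eye(2)                      # orthonormal columns
print(np.linalg.cond(Q))           # 1.0

for eps in [1e-1, 1e-4]:
    # columns [1, 0] and [1, eps] are almost parallel for small eps
    E = np.column_stack([[1.0, 0.0], [1.0, eps]])
    print(eps, np.linalg.cond(E))  # grows roughly like 2/eps
```

`np.linalg.cond` with its default argument computes exactly the ratio $s_1/s_n$ of extreme singular values.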

Anyway, you have that $$ \kappa(E)^{-1}|\lambda_1| \leq \sigma_1 \leq \kappa(E) |\lambda_1|, $$ where the lower bound follows in the same way from $|\lambda_1| = \|\Lambda\| = \|E^{-1} A E\| \leq \kappa(E) \|A\|$. This is at least a partial answer to your question.
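As a sanity check, the sandwich bound can be verified numerically on a random diagonalizable matrix (a NumPy sketch; the eigenvalues and the seed are arbitrary):

```python
import numpy as np

# Check kappa(E)^{-1} |lam_1| <= sigma_1 <= kappa(E) |lam_1|
# for A = E Lambda E^{-1} with a random eigenvector matrix E.
rng = np.random.default_rng(0)
E = rng.standard_normal((4, 4))
lam = np.array([3.0, -2.0, 1.5, 0.5])   # |lam_1| = 3 is the largest
A = E @ np.diag(lam) @ np.linalg.inv(E)

sigma1 = np.linalg.norm(A, 2)           # largest singular value of A
kappa = np.linalg.cond(E)               # s_1 / s_n for E

print(abs(lam[0]) / kappa, sigma1, kappa * abs(lam[0]))
```

The printed middle value always lies between the two outer ones; how loose the bracket is depends entirely on how ill-conditioned $E$ happens to be.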