[Math] Power iteration

eigenvalues-eigenvectors, numerical-linear-algebra, numerical-methods

If $A$ is a matrix, power iteration lets you compute its largest eigenvalue $\lambda_1$. What are the exact conditions under which the power iteration converges?

In particular, I often see the demand that the matrix be symmetric. Is this necessary?
What seems indispensable is that there is a largest eigenvalue (largest in absolute value). But what about the structure of the eigenspace of this eigenvalue?

Apparently, it is often not considered that the eigenspace of this largest eigenvalue need not be one-dimensional. So what happens if the eigenspace is two-dimensional? Can we still use this algorithm?

Best Answer

A sufficient condition for convergence is that (1) there is exactly one eigenvalue of maximal absolute value among all eigenvalues, and (2) the starting vector has a non-zero component in the associated eigenspace.

If $\lambda_1, \dots, \lambda_m$ are the distinct eigenvalues, condition (1) means $$ |\lambda_1| > |\lambda_j| $$ for all $j=2,\dots,m$. It is not necessary to assume that the eigenspace is one-dimensional. The Wikipedia page already has the convergence proof.
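
To illustrate that a higher-dimensional dominant eigenspace is harmless, here is a minimal NumPy sketch (hypothetical, not part of the original answer): power iteration applied to a matrix whose dominant eigenvalue $3$ has a two-dimensional eigenspace. The iterate converges to *some* unit vector in that eigenspace, and the Rayleigh quotient converges to $3$.

```python
import numpy as np

def power_iteration(A, x0, iters=200):
    """Basic power iteration: repeatedly apply A and normalize.
    Returns the Rayleigh quotient estimate and the final iterate."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)
    return x @ (A @ x), x

# Dominant eigenvalue 3 with a two-dimensional eigenspace, plus a smaller eigenvalue 1.
A = np.diag([3.0, 3.0, 1.0])
rng = np.random.default_rng(0)
lam, v = power_iteration(A, rng.standard_normal(3))
print(lam)              # ~3.0
print(A @ v - lam * v)  # residual ~0: v lies (numerically) in the 2D eigenspace
```

Which vector of the eigenspace the iteration converges to depends on the starting vector; the eigenvalue estimate is unaffected.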

Surb's second matrix $A$ is not a counterexample, as $A$ has eigenvalues $+1$ and $-1$: both have the same absolute value, so condition (1) is violated.

The assumptions are also not fulfilled for a real matrix without real eigenvalues: its eigenvalues come in complex-conjugate pairs of equal absolute value, so there is no single dominant eigenvalue. Moreover, if the starting vector is purely real, then all iterates are purely real and cannot produce a non-real eigenvalue.
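
A concrete sketch of this failure mode (a hypothetical example, assuming the standard $90°$ rotation matrix with eigenvalues $\pm i$): the real iterates simply cycle and never settle on an eigenvector.

```python
import numpy as np

# 90-degree rotation: eigenvalues are +i and -i, no real eigenvalues.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

x = np.array([1.0, 0.0])
for k in range(8):
    x = R @ x
    x /= np.linalg.norm(x)
    print(k, x)  # cycles through (0,1), (-1,0), (0,-1), (1,0), ... without converging
```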