[Math] Dominant Eigenvalues

eigenvalues-eigenvectors, matrices

I'm trying to understand dominant eigenvalues, and I found this website that has an explanation of them (Definition 9.2). In the example, the power method is used to find the dominant eigenvector, which corresponds to the eigenvalue 1.

When I calculate the eigenvalues and eigenvectors of the matrix in the example, I get this result:

[Image: the computed eigenvalues and eigenvectors]

The first row shows the eigenvalues and the second row the corresponding eigenvectors.
As you can see, the eigenvector corresponding to the eigenvalue 1 is

{-0.577, -0.577, -0.577}

If I calculate the powers of the matrix, I find that after $M^9$ it converges, as shown on the website.

I don't understand the difference between the eigenvector I found corresponding to the eigenvalue 1 and the eigenvector obtained by raising the matrix to a high power, which the website also describes as the eigenvector for eigenvalue 1.
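
In case it helps, here is roughly what I'm doing, as a NumPy sketch. The matrix `M` below is only a hypothetical stand-in (not the actual matrix from the website), chosen so that it also has eigenvalue 1 and so that its powers converge in the same way:

```python
import numpy as np

# Hypothetical 3x3 matrix, used only as a stand-in for the one on the
# website (its rows and columns each sum to 1, so it has eigenvalue 1).
M = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Eigen-decomposition: eigenvalues in w, eigenvectors as the columns of V.
w, V = np.linalg.eig(M)
print(w)   # one eigenvalue is 1 (up to rounding)
print(V)   # its column is returned with unit Euclidean norm, entries ~ +-0.577

# Repeated powers of M: every column converges to the eigenvector for
# eigenvalue 1 rescaled so its entries sum to 1, i.e. (1/3, 1/3, 1/3).
print(np.linalg.matrix_power(M, 9))
```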

Best Answer

What do you mean by "the eigenvector"? For each eigenvalue there are infinitely many eigenvectors, and what you typically get is a basis presented as "the" eigenvectors.

From your post, it looks like the general eigenvector for $\lambda=1$ has the form $(t,t,t)$, and you obtained the one with $t= -0.57735$. Other values of $t$ give other eigenvectors for the same eigenvalue, and if I remember right, the power method produces a probability eigenvector (entries summing to 1), which in this case means $t=\frac{1}{3}$.
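
To make the scaling point concrete, here is a small NumPy sketch (assuming the eigenvector really has the form $(t,t,t)$): dividing by the Euclidean norm gives entries $1/\sqrt{3}\approx 0.577$, dividing by the sum of the entries gives $1/3$, and both rescalings are eigenvectors for the same eigenvalue.

```python
import numpy as np

v = np.array([1.0, 1.0, 1.0])      # any eigenvector of the form (t, t, t)

unit_norm = v / np.linalg.norm(v)  # Euclidean normalization: entries 1/sqrt(3) ~ 0.577
prob_vec  = v / v.sum()            # probability normalization: entries 1/3

print(unit_norm)  # [0.57735027 0.57735027 0.57735027]
print(prob_vec)   # [0.33333333 0.33333333 0.33333333]
```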

P.S. Also note, if you are referring to the first 3x3 example on that page (it would have been very helpful if you had included the matrix you calculated), that they find the eigenvector for $A^T$. In that example, it is trivial to see that $(t,t,t)$ is an eigenvector for $A$, but they don't calculate the eigenvectors of $A$!
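
As a quick check of the $A$ versus $A^T$ point, here is a sketch with a hypothetical row-stochastic matrix (each row sums to 1, so $(1,1,1)$ is trivially an eigenvector of $A$ for eigenvalue 1): $A$ and $A^T$ share the same eigenvalues, but their eigenvectors for eigenvalue 1 generally point in different directions.

```python
import numpy as np

# Hypothetical row-stochastic matrix: each row sums to 1, so (1, 1, 1)
# is an eigenvector of A for eigenvalue 1.
A = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

wA,  VA  = np.linalg.eig(A)
wAT, VAT = np.linalg.eig(A.T)

print(wA)   # A and A^T always have the same eigenvalues
print(wAT)  # (possibly listed in a different order)

# ... but their eigenvectors for eigenvalue 1 differ:
iA  = np.argmin(np.abs(wA  - 1))
iAT = np.argmin(np.abs(wAT - 1))
print(VA[:,  iA])   # proportional to (1, 1, 1)
print(VAT[:, iAT])  # proportional to the stationary distribution, not (1, 1, 1)
```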