- For a complex square matrix $M$, a maximal set of linearly independent eigenvectors for an eigenvalue $\lambda$ is determined by solving $$(M - \lambda I) x = 0$$ for a basis of the solution subspace, directly as a homogeneous linear system.
- For a complex square matrix $M$, a generalized eigenvector for an eigenvalue $\lambda$ with algebraic multiplicity $c$ is defined as a vector $u$ such that $$(M - \lambda I)^c u = 0.$$

I wonder whether a generalized eigenbasis for the Jordan decomposition can likewise be determined by finding a basis of the solution subspace of $(M - \lambda I)^c u = 0$ directly, in the same way as for an eigenbasis. Or is it more difficult to solve directly as a homogeneous linear system, so that some tricks are helpful?
Thanks!
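For concreteness, the direct approach described in the first bullet can be sketched as follows, using `scipy.linalg.null_space` to solve the homogeneous system; the matrix and eigenvalue here are illustrative assumptions, not from the post.

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative matrix (an assumption): eigenvalue 2 with geometric
# multiplicity 2, eigenvalue 3 with multiplicity 1.
M = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
lam = 2.0

# The columns of B form an orthonormal basis of the null space of
# M - lam*I, i.e. a maximal set of linearly independent eigenvectors
# for the eigenvalue lam.
B = null_space(M - lam * np.eye(3))
print(B.shape[1])  # geometric multiplicity of lam
```

Each column of `B` satisfies $(M - \lambda I)x = 0$, so the null-space computation directly yields the eigenbasis for that eigenvalue.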
Best Answer
Look at the matrix $$M=\pmatrix{1&1\cr0&1\cr}.$$ Take $\lambda=1$ and $c=2$. Then $(M-\lambda I)^c$ is the zero matrix, so any two linearly independent vectors will do as a basis for the solution space of $(M-\lambda I)^cu=0$. But that's not what you want: first find as many linearly independent eigenvectors as you can, then go hunting for generalized eigenvectors.
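A numerical sketch of this example may help: $(M-I)^2 = 0$, so its null space is all of $\mathbb{C}^2$ and tells you nothing about which vectors are genuine eigenvectors. Instead, one can pick $u$ outside $\ker(M-I)$ and take $v=(M-I)u$ as the eigenvector heading the chain (the particular choice of $u$ below is an assumption for illustration).

```python
import numpy as np

M = np.array([[1.0, 1.0],
              [0.0, 1.0]])
N = M - np.eye(2)  # nilpotent part for lambda = 1

assert np.allclose(N @ N, 0)           # (M - I)^2 is the zero matrix
assert np.linalg.matrix_rank(N) == 1   # only one independent eigenvector

u = np.array([0.0, 1.0])   # any vector with N @ u != 0 works
v = N @ u                  # eigenvector: (M - I) v = 0
P = np.column_stack([v, u])
J = np.linalg.inv(P) @ M @ P
print(J)                   # the Jordan block [[1, 1], [0, 1]]
```

The change of basis $P=(v\;u)$ built from the chain recovers the Jordan form, whereas an arbitrary basis of $\ker(M-I)^2$ generally would not.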