[Math] Is it true that every eigenvalue has at least one eigenvector

eigenvalues-eigenvectors, linear algebra, matrices

As the title says: Is it true that every eigenvalue has at least one eigenvector?

Or is it possible that, while trying to find a basis of a specific eigenspace, I will get only the zero vector (meaning there are no eigenvectors corresponding to this eigenvalue)?

Thank you

Best Answer

I assume you are talking of eigenvalues and eigenvectors of an $n\times n$ square matrix $A$.

Define an eigenvalue of $A$ to be a root of the characteristic polynomial $|\lambda I-A|$. Then for such a $\lambda$, the matrix $\lambda I-A$ has determinant $0$. So when you row-reduce it, you end up with a row of zeros: there are at most $n-1$ pivots, so at least one column lacks a pivot and its variable is free. Hence there are non-zero solutions to $(\lambda I-A)v=0$, that is, non-zero vectors $v$ with $Av=\lambda v$.
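
To see this concretely, here is a small worked example (the matrix is just one chosen for illustration). Take
$$A=\begin{pmatrix}2&1\\0&2\end{pmatrix},\qquad |\lambda I-A|=(\lambda-2)^2,$$
so $\lambda=2$ is the only eigenvalue. Then
$$2I-A=\begin{pmatrix}0&-1\\0&0\end{pmatrix},$$
which is already row-reduced: one row of zeros, a single pivot in the second column, and the first variable free. Setting the free variable $v_1=1$ forces $v_2=0$, giving the non-zero eigenvector $v=(1,0)^T$ with $Av=2v$. The eigenspace of course contains the zero vector as well, but a basis for it always consists of non-zero vectors.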