How many eigenvectors do I have?

eigenvalues-eigenvectors, linear-algebra

I have this matrix:

$$
A=\begin{pmatrix}
1 & 0\\
0 & 2
\end{pmatrix}
$$

with these two eigenvalues:

  • $\lambda_1=1$
  • $\lambda_2=2$

Solving for them, I found these two (???) eigenvectors:

$$x_1=\begin{pmatrix}
t\\
0
\end{pmatrix}\:\:\:\:\:\:\:\: x_2=\begin{pmatrix}
0\\
t
\end{pmatrix}$$

Can I now say that I have two eigenvectors, or infinitely many eigenvectors? The question comes from a video that teaches how to calculate eigenvectors, but the teacher confused me. After finding the solution, he says:

"ANY vector that points along the horizontal axis could be an eigenvector of this system"

But afterwards he says:

"so now we have two eigenvalues, and their TWO corresponding eigenvectors"

Best Answer

A nonzero vector $x$ is an eigenvector of the matrix $A$ with eigenvalue $\lambda$ if and only if $Ax = \lambda x$. If one finds such an $x$ for a given $\lambda$, then multiplying the equation $Ax = \lambda x$ by any constant $\alpha \neq 0$ gives $\alpha Ax = \alpha \lambda x$, so $A (\alpha x) = \lambda (\alpha x)$ since $A$ is linear. Hence $\alpha x$ is also an eigenvector, showing that every nonzero vector in the linear span of $x$ is an eigenvector of $A$ with the same eigenvalue $\lambda$. In that sense, whenever there is one eigenvector associated to a given eigenvalue, there are always infinitely many eigenvectors associated with that eigenvalue.
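Applied concretely to the matrix from your question (this just restates the scaling argument above, nothing new):

$$
A\begin{pmatrix}\alpha\\0\end{pmatrix}
=\begin{pmatrix}1 & 0\\0 & 2\end{pmatrix}
\begin{pmatrix}\alpha\\0\end{pmatrix}
=\begin{pmatrix}\alpha\\0\end{pmatrix}
=1\cdot\begin{pmatrix}\alpha\\0\end{pmatrix},
$$

so every $\alpha \neq 0$ gives an eigenvector for $\lambda_1 = 1$; these are exactly the "vectors along the horizontal axis" from the video.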

However, when speaking of eigenvectors, we generally treat linearly dependent eigenvectors as "the same eigenvector". To get a better understanding of this, it may be more satisfying to think of eigenspaces rather than of eigenvectors directly. The eigenspace corresponding to a given eigenvalue $\lambda$ is the subspace consisting of all eigenvectors associated to $\lambda$, together with the null vector.
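Equivalently (a standard characterization, not spelled out in the original answer), the eigenspace for $\lambda$ is the null space of $A - \lambda I$:

$$
E_\lambda(A)=\{x : Ax=\lambda x\}\cup\{0\}=\ker(A-\lambda I).
$$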

In your example, the matrix $$ A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} $$ has exactly two eigenspaces, namely the spans of the vectors $$ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \qquad\text{and}\qquad \begin{pmatrix} 0 \\ 1 \end{pmatrix}. $$ Both eigenspaces have dimension 1, so each of them contains only one linearly independent eigenvector.
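As a quick numerical sanity check (a sketch using NumPy, which neither post mentions), an eigenvalue solver reports one unit-length representative eigenvector per eigenvalue, even though every nonzero scalar multiple of it also satisfies $Ax = \lambda x$:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and one unit-norm eigenvector
# per eigenvalue (the columns of `vecs`).
vals, vecs = np.linalg.eig(A)
print(vals)   # [1. 2.]
print(vecs)   # columns (1, 0) and (0, 1)

# Every nonzero scalar multiple of a representative is still an eigenvector:
for t in (1.0, -3.0, 0.5):
    x = t * vecs[:, 0]                      # points along the horizontal axis
    print(np.allclose(A @ x, vals[0] * x))  # True for each t
```

So the solver's "two eigenvectors" are just one chosen representative per one-dimensional eigenspace, in line with the answer above.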
