The definition of the subspace of $V$ generated by all eigenvectors of $A$ having $\lambda$ as eigenvalue? (Serge Lang, “Linear Algebra”, 3rd ed.)

eigenvalues-eigenvectors, linear-algebra

I am reading "Linear Algebra 3rd Edition" by Serge Lang.

Theorem 1.1. Let $V$ be a vector space and let $A:V\to V$ be a linear map. Let $\lambda\in K$. Let $V_\lambda$ be the subspace of $V$ generated by all eigenvectors of $A$ having $\lambda$ as eigenvalue. Then every non-zero element of $V_\lambda$ is an eigenvector of $A$ having $\lambda$ as eigenvalue.

In this book, the author defined the subspace generated by $v_1,\dots,v_n$ as follows:
Let $V$ be an arbitrary vector space, and let $v_1,\dots,v_n$ be elements of $V$.
Let $W$ be the set of all linear combinations of $v_1,\dots,v_n$.
Then $W$ is called the subspace generated by $v_1,\dots,v_n$.

But the set of all eigenvectors of $A$ having $\lambda$ as eigenvalue is in general an infinite set.

What is the definition of the subspace of $V$ generated by all eigenvectors of $A$ having $\lambda$ as eigenvalue?

In general, what is the definition of the subspace of $V$ generated by an infinite set?

Best Answer

The subspace generated by any subset $S$ of $V$ is the set of all finite linear combinations of elements of $S$:

$$\operatorname{span}(S)=\left\{\sum_{i=1}^{n}c_{i}s_{i}\;:\;n\in\mathbb{N},\ s_{i}\in S,\ c_{i}\in\mathbb{F}\right\}.$$

An alternative way to think of this is to follow in the footsteps of someone trying to construct such a set. First pick a natural number $n$. Then pick $n$ elements $s_{1},\dots,s_{n}$ from the possibly infinite set $S$ (they need not be distinct; you can pick the same element again and again), and similarly pick $n$ field elements $c_{1},\dots,c_{n}$. Then consider the sum $\sum_{i=1}^{n}c_{i}s_{i}$. All the finite sums you can come up with in this way lie inside the span of $S$, and together they are precisely the span of $S$.
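In particular, with this definition Theorem 1.1 becomes a one-line computation by linearity of $A$: if $v_{1},\dots,v_{n}$ are eigenvectors of $A$ with eigenvalue $\lambda$, then

$$A\Big(\sum_{i=1}^{n}c_{i}v_{i}\Big)=\sum_{i=1}^{n}c_{i}Av_{i}=\sum_{i=1}^{n}c_{i}\lambda v_{i}=\lambda\sum_{i=1}^{n}c_{i}v_{i},$$

so every non-zero element of $V_\lambda$ satisfies $Av=\lambda v$, i.e. it is an eigenvector with eigenvalue $\lambda$.

If it helps to see the "pick $n$, pick the $s_i$, pick the $c_i$" construction concretely, here is a minimal Python/sympy sketch (the function name and the particular vectors are mine, purely for illustration): even when $S$ is infinite, each element of $\operatorname{span}(S)$ only ever uses finitely many members of $S$.

```python
from sympy import Matrix

def finite_combination(coeffs, vectors):
    """Return c_1*s_1 + ... + c_n*s_n for a finite selection from S."""
    return sum((c * v for c, v in zip(coeffs, vectors)),
               Matrix.zeros(vectors[0].rows, 1))

# Here n = 2: two vectors drawn from a (possibly infinite) set S,
# together with two field elements as coefficients.
s1, s2 = Matrix([1, 0, 0]), Matrix([1, 1, 0])
print(finite_combination([2, -3], [s1, s2]))  # Matrix([[-1], [-3], [0]])
```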

You can then apply Zorn's Lemma to find a basis for this subspace. That basis might itself be infinite, which would again give an infinite set, but possibly a less complicated one.

But in most applications you will deal with finite-dimensional vector spaces. In that case, here is how you proceed to find a basis of eigenvectors for the subspace.

Say $\lambda$ is an eigenvalue of $A$. Look at the matrix representation of $A$ (if $A$ is already a matrix, you just look at $A$ itself) and compute the null space of $A-\lambda I$. In practical terms, you write down the matrix $A-\lambda I$ and look for a basis of its solution space: you are looking for the $x\in V$ such that $(A-\lambda I)x=0$. This is precisely saying that you regard the rows of $A-\lambda I$ as equations and find the solutions of those equations. From the set of solutions you take a maximal linearly independent subset, and that will be your basis of eigenvectors.

If this sounds too complicated, here is an example. It is really very easy, and probably even taught in high school in an implicit way. Take the matrix

$$A=\begin{bmatrix} 1 & 2 & 0 \\ 0 & 3 & 2 \\ 0 & 0 & 1 \end{bmatrix}.$$

What are the eigenvalues of this matrix? They are precisely $1$ and $3$. (Here is a shortcut: the eigenvalues of a triangular matrix are its diagonal entries.)
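You can sanity-check this with sympy (a quick sketch, not part of the original answer); `eigenvals` returns each eigenvalue with its algebraic multiplicity:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [0, 3, 2],
            [0, 0, 1]])
# The diagonal entries are 1, 3, 1, so 1 appears with multiplicity 2.
print(A.eigenvals())  # {1: 2, 3: 1}
```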

Now say you want to compute the eigenspace corresponding to $\lambda=1$. So what you do is look at $A-\lambda I=A-I$:

$$A-I=\begin{bmatrix} 0 & 2 & 0 \\ 0 & 2 & 2 \\ 0 & 0 & 0 \end{bmatrix}.$$

What is the system of equations corresponding to this? That is, if you are trying to find $X=(x,y,z)^{T}$ such that $(A-I)X=0$, what equations must $x,y,z$ satisfy?

It is precisely $2y=0$ and $2y+2z=0$, which gives $y=0$ and $z=0$. So the eigenspace is nothing but the set of vectors of the form $\{(x,0,0):x\in\mathbb{F}\}$. If you watch closely, this is nothing but $\{x(1,0,0):x\in\mathbb{F}\}=\operatorname{span}\{(1,0,0)\}$. That is how it is done. Similarly you can compute the eigenspace of $\lambda=3$; see the check below.
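To verify both hand computations, here is a short sympy sketch (again my own illustration, following the null-space method described above):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2, 0],
            [0, 3, 2],
            [0, 0, 1]])

# Eigenspace for lambda = 1: basis {(1, 0, 0)}, matching the computation above.
print((A - 1 * eye(3)).nullspace())  # [Matrix([[1], [0], [0]])]

# Eigenspace for lambda = 3: solving (A - 3I)X = 0 gives x = y, z = 0.
print((A - 3 * eye(3)).nullspace())  # [Matrix([[1], [1], [0]])]
```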

Remember these tricks; they will help you a lot.
