[Math] Eigenvalue of linear operator iff eigenvalue of matrix representation.


I'm trying to prove the following theorem, which seems straightforward enough, but I'm confused about the wording and proving the converse:

Let $T$ be a linear operator on a finite-dimensional vector space $V$, and let $B$ be an ordered basis for $V$. Prove that $t$ is an eigenvalue of $T$ iff $t$ is an eigenvalue of the matrix representation $[T]_B$.

Here's my confusion: don't we have to have that V is a vector space consisting of column vectors in order for us to even speak of eigenvalues and eigenvectors of matrices? This isn't specified in the theorem.

Moreover, even if this were specified, I'm not sure how to get from the matrix multiplication form back to the linear operator form with my proof. Any suggestions?

Best Answer

There is no need to have column vectors or the like. Basically, what it's asking is the following: if you have an abstract vector space $V$ and a linear transformation $T:V\to V$, you can fix a basis $B$ of $V$. This basis gives you an isomorphism $V\cong F^n$, and under this isomorphism $T$ becomes a linear transformation of $F^n$, i.e. a matrix if you like. You are asked to show that $T$ has the eigenvalue $\lambda$ (i.e., $T-\lambda\,\mathrm{id}$ has a non-trivial kernel) if and only if that matrix has the same eigenvalue.
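To address the "how do I get back from matrix form to operator form" part of the question directly, the standard argument can be written out using the coordinate map $x \mapsto [x]_B$ (a sketch, not part of the original answer):

```latex
T(x) = \lambda x
\iff [T(x)]_B = \lambda [x]_B
\iff [T]_B\,[x]_B = \lambda\,[x]_B.
```

Since $x \mapsto [x]_B$ is an isomorphism, $x \neq 0$ if and only if $[x]_B \neq 0$, so $T$ has a nonzero eigenvector for $\lambda$ exactly when $[T]_B$ does. Both directions of the iff come for free from the chain of equivalences.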

My hint goes as follows: any two matrix realizations of the same linear transformation are conjugate (similar) to one another. Since having the eigenvalue $\lambda$ is clearly invariant under similarity, it suffices to prove this for a single well-chosen basis $B$. There is a pretty obvious choice that makes this easy: consider a basis containing $x$, where $x$ is an eigenvector for $\lambda$.
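The similarity-invariance in the hint is easy to check numerically. The sketch below (my own illustration, not from the answer; the matrices `A` and `P` are arbitrary choices) conjugates a matrix by an invertible change-of-basis matrix and confirms the eigenvalues are unchanged:

```python
import numpy as np

# Matrix of T in one basis; upper triangular, so its eigenvalues
# can be read off the diagonal: 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# An (arbitrary) invertible change-of-basis matrix.
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])

# Matrix of the same operator T in the new basis: similar to A.
B = np.linalg.inv(P) @ A @ P

# Similar matrices share eigenvalues (sort, since the order may differ).
print(np.sort(np.linalg.eigvals(A)))  # eigenvalues of A: 2 and 3
print(np.sort(np.linalg.eigvals(B)))  # same eigenvalues, new basis
```

This is exactly why the answer can pick whichever basis is most convenient: the eigenvalues of $[T]_B$ do not depend on $B$.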
