[Math] Proof that a is an eigenvalue of p(T) if and only if a = p(lambda) for some eigenvalue lambda of T

eigenvalues-eigenvectors, linear-algebra, polynomials, proof-verification

$\newcommand{\N}{\mathbb{N}}
\newcommand{\Z}{\mathbb{Z}}
\newcommand{\F}{\mathbb{F}}
\newcommand{\C}{\mathbb{C}}
\newcommand{\LM}{\mathcal{L}}$
Question: Suppose $\F = \C$, $T \in \LM(V)$, $p \in \mathcal{P}(\C)$, and $a \in \C$. Prove that $a$ is an eigenvalue of $p(T)$ if and only if $a = p(\lambda)$ for some eigenvalue $\lambda$ of $T$.

My proof so far:

Suppose $a=p(\lambda)$ for some eigenvalue $\lambda$ of $T$, and let $v \in V$ be a corresponding eigenvector, so $v \neq 0$ and $Tv = \lambda v$. Write $p(T)v=a_0Iv+a_1Tv+\cdots+a_nT^nv$, where $a_0,\ldots,a_n \in \C$ are the coefficients of $p$. Since $a=p(\lambda)$, we have
$a=a_0+a_1\lambda+\cdots+a_n\lambda^n$. Then
\begin{align*}
& (p(T) - aI)v=p(T)v-av\\
= {}& a_0Iv+a_1Tv+\cdots+a_nT^nv-(a_0+\cdots+a_n\lambda^n)v\\
= {} & (a_0Iv-a_0v)+(a_1Tv-a_1\lambda v)+\cdots+(a_nT^nv-a_n\lambda^nv)\\
= {}& 0
\end{align*}
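The cancellation in the last step uses the fact that each power of $T$ acts on the eigenvector $v$ by the corresponding power of $\lambda$; a short inductive sketch:
\begin{align*}
T^kv = T^{k-1}(Tv) = T^{k-1}(\lambda v) = \lambda\,T^{k-1}v = \cdots = \lambda^k v,
\end{align*}
so each term $a_kT^kv - a_k\lambda^kv$ vanishes, and since $v \neq 0$ the computation shows $p(T)v = av$.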
Therefore, $a$ is an eigenvalue of $p(T)$.

Suppose $a$ is an eigenvalue of $p(T)$ with corresponding eigenvector $v \neq 0$. Then
\begin{align*}
0 & = (p(T) - aI)v \\
& =(a_0I+a_1T+\cdots+a_nT^n-aI)v
\end{align*}

Distributing and rearranging,
\begin{align*}
a_0v+a_1Tv+\cdots+a_nT^nv & =av\\
(a_0-a)v+a_1Tv+\cdots+a_nT^nv & =0
\end{align*}

I see that $p(T)v = av$ is a scalar multiple of $v$, the vector that $p(T)-aI$ is applied to, so $\operatorname{span}(v)$ is invariant under $p(T)$. Is this sufficient to show that an eigenvalue of $T$ exists?

Best Answer

Your first direction is correct, but has a confusing typo: You started by saying "Suppose $p(a)=\lambda$" when you meant to say "Suppose $a=p(\lambda)$."

In your second direction, you can do this: Suppose $a$ is an eigenvalue of $p(T)$ with eigenvector $v$. Then indeed (as you started):
$(p(T) - aI)v = 0$.

However, you need to do more than this. Note that over $\C$ a polynomial always factors completely into linear factors over its roots. So for the polynomial $p(x)-a$, there is a number $c \neq 0$ and roots $r_1, r_2, \ldots, r_n$ such that:
$p(x)-a = c(x-r_1)(x-r_2)\cdots(x-r_n)$.
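For instance, with the illustrative choice $p(x) = x^2 + 1$ and $a = 5$ (numbers picked only to make the factorization concrete), this reads
\begin{align*}
p(x) - a = x^2 - 4 = (x-2)(x+2),
\end{align*}
so $c = 1$, $r_1 = 2$, $r_2 = -2$, and indeed $p(2) = p(-2) = 5 = a$.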

In particular, $p(r_i)-a=0$ for all roots $r_i$. Applying this polynomial form to the equation $(p(T)-aI)v=0$ gives:
$(p(T)-aI)v = c(T-r_1I)(T-r_2I)\cdots(T-r_nI)v = 0$.

Now you just need to show that one of the roots $r_i$ is an eigenvalue for $T$ (with eigenvector possibly different from $v$). This is similar to the standard proof that $T$ has an eigenvalue over the complex numbers, such as that given in the Sheldon Axler textbook Linear Algebra Done Right.
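One way to carry out that last step, sketched: if no $r_i$ were an eigenvalue of $T$, then each factor $T - r_iI$ would be injective, hence so would their composition, contradicting
\begin{align*}
c(T-r_1I)(T-r_2I)\cdots(T-r_nI)v = 0, \qquad v \neq 0.
\end{align*}
So some $T - r_jI$ fails to be injective, i.e. $r_j$ is an eigenvalue of $T$, and then $a = p(r_j)$ because $p(r_j) - a = 0$.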
