Exercise 5.A.11 in Linear Algebra Done Right, 4th edition, Sheldon Axler.



Suppose $V$ is finite-dimensional, $T \in \mathcal{L}(V)$, and $\alpha \in \mathbf{F}$. Prove that there exists $\delta > 0$ such that $T - \lambda I$ is invertible for all $\lambda \in \mathbf{F}$ such that $0 < |\alpha - \lambda| < \delta$.

Here, $\mathbf{F}$ denotes either the real numbers or the complex numbers. I'm not sure where to start with this exercise. Here are a few things I know, but don't know how to use for this problem:

  • $\lambda$ is not an eigenvalue of $T$ $\iff T - \lambda I$ is invertible
  • $T$ has at most $\dim V$ distinct eigenvalues.

Could someone provide a hint to get me started?

Best Answer

This has little to do with linear algebra. There are only finitely many eigenvalues, so if there are any eigenvalues distinct from $\alpha$, there is one such eigenvalue$~\mu$ with $|\alpha-\mu|$ minimal among them. Then if you take $\delta$ to be that minimal absolute value, there are by construction no eigenvalues$~\lambda$ with $0<|\alpha-\lambda|<\delta$. Of course, if there are no eigenvalues distinct from $\alpha$ at all, you can take whatever positive $\delta$ you like.
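In matrix terms, the construction above can be sketched numerically. This is a minimal illustration, not part of the proof: it assumes $T$ is given as a square NumPy array, and the function name `delta_for` is my own.

```python
import numpy as np

def delta_for(alpha, T):
    """Return a delta > 0 such that T - lam*I is invertible
    whenever 0 < |alpha - lam| < delta, following the answer's
    construction: delta is the minimal distance from alpha to an
    eigenvalue of T distinct from alpha."""
    eigenvalues = np.linalg.eigvals(T)
    # Distances from alpha to the eigenvalues distinct from alpha.
    gaps = [abs(alpha - mu) for mu in eigenvalues if not np.isclose(mu, alpha)]
    # If no eigenvalue is distinct from alpha, any positive delta
    # works; 1.0 is an arbitrary choice.
    return min(gaps) if gaps else 1.0
```

For example, with $T = \operatorname{diag}(1, 2, 3)$ and $\alpha = 1$, the eigenvalues distinct from $\alpha$ are $2$ and $3$, so $\delta = 1$: every $\lambda$ with $0 < |1 - \lambda| < 1$ misses all eigenvalues, and $T - \lambda I$ is invertible.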