If $\lambda$ is an eigenvalue of $T$, then $|\lambda| \leq n \max\{|\mathcal{M}(T)_{j,k}| : 1 \leq j,k \leq n\}$

Tags: eigenvalues-eigenvectors, linear-algebra, linear-transformations, real-analysis, upper-lower-bounds

The following exercise comes from "Linear Algebra Done Right", Sheldon Axler, 4th Edition, Section 5A, Exercise 16.

Suppose $v_1, \ldots, v_n$ is a basis of $V$ and $T \in \mathcal{L}(V)$. Prove that if $\lambda$ is an eigenvalue of $T$, then
$$
|\lambda| \leq n \max\{|\mathcal{M}(T)_{j,k}| : 1 \leq j,k \leq n\}
$$

where $\mathcal{M}(T)_{j,k}$ denotes the entry in row $j$, column $k$ of the matrix of $T$ with respect to the basis $v_1, \ldots, v_n$.
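As a quick empirical sanity check of the claimed bound (outside the exercise itself, and assuming NumPy is available), one can sample random matrices playing the role of $\mathcal{M}(T)$ and confirm that every eigenvalue satisfies $|\lambda| \leq n \max_{j,k} |\mathcal{M}(T)_{j,k}|$:

```python
import numpy as np

# Empirical check: for a random n x n matrix A (playing the role of M(T)),
# every eigenvalue lambda should satisfy |lambda| <= n * max_{j,k} |A_jk|.
rng = np.random.default_rng(0)
for _ in range(1000):
    n = int(rng.integers(2, 8))
    A = rng.uniform(-5.0, 5.0, size=(n, n))
    bound = n * np.max(np.abs(A))
    assert np.all(np.abs(np.linalg.eigvals(A)) <= bound + 1e-9)
print("bound held for all sampled matrices")
```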

My initial attempt at this exercise followed a proof similar to the Gershgorin circle theorem, but it doesn't lead me anywhere near the above result.

At this point in the text, norms have not been covered yet, so I cannot use them here. Could someone give a hint in the right direction?

Best Answer

Let $M = \max_{j,k} \vert T_{jk} \vert$, where $T_{jk} = \mathcal{M}(T)_{j,k}$ denotes the entries of the matrix of $T$ with respect to the given basis.

Take a vector $x \in V$ whose coordinates with respect to $v_1, \ldots, v_n$ are $(x_1, \ldots, x_n)'$.

Then

$$ (Tx)_j = \sum_{k = 1 }^n T_{jk} x_k.$$

By the triangle inequality and the definition of $M$ we obtain

$$ \vert (Tx)_j \vert \leq \sum_{k = 1}^n \vert T_{jk} x_k \vert \leq M \sum_{k = 1}^n \vert x_k \vert,$$

which holds for every $j = 1, \ldots, n$. As suggested in the comments, now choose $j$ so that $\vert x_j \vert$ is maximal among $\vert x_1 \vert, \ldots, \vert x_n \vert$; then $\sum_{k = 1}^n \vert x_k \vert \leq n \vert x_j \vert$, since each of the $n$ terms is at most $\vert x_j \vert$. Hence

$$ \vert (Tx)_j \vert \leq n M \vert x_j \vert.$$

Now suppose $x \neq 0$ is an eigenvector for the eigenvalue $\lambda$. Then $Tx = \lambda x$, so $(Tx)_j = \lambda x_j$ and we have

$$ \vert \lambda \vert \vert x_j \vert \leq n M \vert x_j \vert.$$

If $\vert x_j \vert \neq 0$ we can divide by it to obtain the inequality we wanted to prove. And $\vert x_j \vert = 0$ is impossible: since $j$ maximises $\vert x_j \vert$, every coordinate of $x$ would vanish, giving $x = 0$ and contradicting that $x$ is an eigenvector.
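For concreteness, here is the same chain of inequalities traced numerically on one hypothetical $3 \times 3$ matrix (again a NumPy sketch, not part of Axler's exercise): pick an eigenpair, locate the coordinate of maximal modulus, and check each step of the proof.

```python
import numpy as np

# Trace the proof's chain of inequalities on one concrete matrix:
# find an eigenpair (lam, x), pick j maximising |x_j|, check each step.
A = np.array([[2.0, -1.0, 0.5],
              [0.0,  3.0, 1.0],
              [1.0,  0.0, -2.0]])   # hypothetical matrix of T in some basis
n = A.shape[0]
M = np.max(np.abs(A))

lams, vecs = np.linalg.eig(A)
lam, x = lams[0], vecs[:, 0]        # one eigenpair: A @ x == lam * x
j = np.argmax(np.abs(x))            # coordinate of maximal modulus

# |(Ax)_j| = |lam| |x_j| <= M * sum_k |x_k| <= n * M * |x_j|
assert abs((A @ x)[j]) <= M * np.sum(np.abs(x)) + 1e-12    # triangle inequality
assert M * np.sum(np.abs(x)) <= n * M * abs(x[j]) + 1e-12  # each |x_k| <= |x_j|
assert abs(lam) <= n * M + 1e-12                           # divide by |x_j| > 0
print(f"|lambda| = {abs(lam):.4f} <= n*M = {n * M:.4f}")
```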
