Let $X,Y$ be real Banach spaces, and let $B(X,Y)$ be the space of bounded linear operators.
Given $T \in B(X,Y)$, the modulus of $T$ (also called the reduced minimum modulus) is defined to be
$$
\gamma(T):=\inf \{\, \|Tx\| : d(x,\ker T)=1 \,\}.
$$
It is known that if the image of $T$ is closed in $Y$, then $\gamma(T)>0$.
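By homogeneity, the definition can be restated as a global estimate: for every $x \in X$,
$$
\|Tx\| \ \ge\ \gamma(T)\, d(x,\ker T).
$$
(Indeed, if $d(x,\ker T)=t>0$ then $x/t$ satisfies $d(x/t,\ker T)=1$, and the case $x\in\ker T$ is trivial.)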
Suppose that:
- $T_n \in B(X,Y)$ is a sequence of operators with closed images, and $T_n \to T$ in the operator norm.
- $T$ has a closed image.
- $\dim \ker T_n=\dim \ker T< \infty$. (All the kernels are finite-dimensional and of the same dimension).
Is it true that $\gamma(T_n) \ge c $ for some $ c >0$ independent of $n$?
If the modulus were a continuous map $B(X,Y) \to \mathbb{R}$, we would have $\gamma(T_n) \to \gamma(T)>0$, so the answer would be positive. However, I am not sure the modulus is continuous. Nonetheless, I am interested in the weaker result: must the modulus along a convergent sequence be bounded away from zero?
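As a finite-dimensional sanity check (an illustration, not a proof): for a matrix acting between Euclidean spaces, $\gamma(T)$ equals the smallest nonzero singular value of $T$, so the claim can be probed numerically. The diagonal matrices below are a toy example of my own choosing in which all kernels are one-dimensional.

```python
import numpy as np

def gamma(T, tol=1e-10):
    """Reduced modulus of a matrix in the Euclidean norm:
    the smallest singular value exceeding tol (0.0 if none)."""
    s = np.linalg.svd(T, compute_uv=False)
    nonzero = s[s > tol]
    return nonzero.min() if nonzero.size else 0.0

# T has rank 2 and a 1-dimensional kernel; each T_n is a small
# perturbation with the same kernel, so dim ker T_n = dim ker T = 1.
T = np.diag([1.0, 1.0, 0.0])
gammas = [gamma(np.diag([1.0 + 1.0 / n, 1.0, 0.0])) for n in range(1, 50)]
print(gamma(T), min(gammas))  # the moduli stay bounded away from 0
```

Here $T_n \to T$ in operator norm and $\min_n \gamma(T_n) \ge \gamma(T) = 1$, consistent with a positive answer.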
Best Answer
First, observe that it suffices to show that no subsequence of $(T_n)$ satisfies $\gamma(T_{n_k})\to 0$: if the statement were false, we could extract such a subsequence, and running the argument below along it would produce a contradiction. (Since each $\gamma(T_n)>0$ individually, ruling out subsequences tending to $0$ forces $\inf_n \gamma(T_n)>0$.)
Now let us prove a lemma.

**Lemma.** If $(z_n)$ is a bounded sequence in $X$ with $T_nz_n\to 0$, then, after passing to a subsequence, $(z_n)$ converges to an element of $\ker T$.

*Proof.* Since $\|Tz_n\|\le\|T_nz_n\|+\|T-T_n\|\,\|z_n\|$ and $(z_n)$ is bounded, $Tz_n\to 0$. By definition of the modulus (and homogeneity), $\gamma(T)\,d(z_n,\ker T)\le\|Tz_n\|$, and $\gamma(T)>0$ because $T$ has closed image, so $d(z_n,\ker T)\to 0$. Choose $w_n\in\ker T$ with $\|z_n-w_n\|\to 0$; then $(w_n)$ is a bounded sequence in the finite-dimensional space $\ker T$, so a subsequence converges to some $w\in\ker T$, and along that subsequence $z_n\to w$. $\square$
Now let $r=\dim \ker T$. For each $n$, choose a basis $x^1_n,\dots,x^r_n$ of $\ker T_n$ that is "almost orthonormal" in the sense that each $x^i_n$ is a unit vector with $d(x^i_n,\operatorname{span}(x^1_n,\dots,x^{i-1}_n))\geq 1/2$; such a basis exists by Riesz's lemma. Since the $x^i_n$ are bounded and $T_nx^i_n=0$, the lemma lets us pass to a subsequence and assume that for each $i$, $(x^i_n)$ converges to some $x^i\in\ker T$. The limits $x^i$ are again almost orthonormal, hence linearly independent, and since $\dim\ker T=r$ they form a basis of $\ker T$.
Now suppose that $\gamma(T_n)\to 0$. Then we can choose $y_n\in X$ with $d(y_n,\ker T_n)=1$ for each $n$ but $T_ny_n\to 0$. Modifying $y_n$ by an element of $\ker T_n$, we may assume that $\|y_n\|\le 2$, so $(y_n)$ is bounded. By the lemma, we may pass to a subsequence and assume that $(y_n)$ converges to some $y\in\ker T$. Writing $y$ as a linear combination of $x^1,\dots,x^r$, we see that for large $n$, $y_n$ is close to the corresponding linear combination of $x^1_n,\dots,x^r_n$, which lies in $\ker T_n$. This contradicts the assumption that $d(y_n,\ker T_n)=1$.
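The last step can be made explicit: writing $y=\sum_{i=1}^r c_i x^i$, we get
$$
d(y_n,\ker T_n)\ \le\ \Big\|y_n-\sum_{i=1}^r c_i x^i_n\Big\|\ \le\ \|y_n-y\|+\sum_{i=1}^r |c_i|\,\|x^i-x^i_n\|\ \longrightarrow\ 0,
$$
since $\sum_{i=1}^r c_i x^i_n\in\ker T_n$.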
Thus no subsequence of $(\gamma(T_n))$ converges to $0$, and by the observation at the start this proves the claim: $\gamma(T_n)\ge c$ for some $c>0$ independent of $n$.