[Math] Are “most” operators on an infinite-dimensional complex Banach space “diagonalizable”

banach-spaces, fa.functional-analysis, operator-theory

This is true for finite-dimensional spaces: the diagonalizable operators on a finite-dimensional complex vector space contain a dense open set, and the nondiagonalizable operators have measure zero.
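As a finite-dimensional sanity check (my own illustration, not part of the question), one can sample random complex matrices and verify that they have distinct eigenvalues, which is sufficient for diagonalizability; the nondiagonalizable matrices form a measure-zero set, so a random sample should never hit them:

```python
import numpy as np

rng = np.random.default_rng(1)

def min_gap(A):
    # Smallest distance between two eigenvalues of A (at distinct indices).
    # A strictly positive gap means all eigenvalues are distinct,
    # which is sufficient for A to be diagonalizable.
    w = np.linalg.eigvals(A)
    diffs = np.abs(w[:, None] - w[None, :])
    np.fill_diagonal(diffs, np.inf)
    return diffs.min()

# Random complex 5x5 matrices have distinct eigenvalues with
# probability 1, so every sample should come out diagonalizable.
gaps = [min_gap(rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5)))
        for _ in range(200)]
assert min(gaps) > 1e-6
```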

To be precise, let $T$ be an operator on a complex Banach space $X$ which is not finite-dimensional. For each $\lambda \in \mathbb{C}$, let $V_\lambda \subseteq X$ be the subspace $\mathrm{ker}(\lambda I - T)$ on which $T$ acts by the scalar $\lambda$. Say that $T$ is diagonalizable if $\sum_\lambda V_\lambda$ is dense in $X$. Or provide a better definition if this one is deficient!
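To see the definition in action in a finite truncation (my own sketch): for a normal matrix, the eigenspaces $V_\lambda$ together span the whole space, so such a matrix is diagonalizable in the above sense:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6

# A Hermitian (hence normal) matrix: its eigenvectors form a basis,
# so the sum of the eigenspaces V_lambda is all of C^n.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = A + A.conj().T
w, V = np.linalg.eigh(H)  # columns of V are orthonormal eigenvectors
assert np.linalg.matrix_rank(V) == n  # the V_lambda span C^n
```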

I want to know to what extent the "typicality" of diagonalizable operators carries over to infinite dimensions. Are the diagonalizable operators dense? Are they open (or do they contain an open set)? Are they comeagre? Of course, this will probably depend on the Banach space and the operator topology. I suppose it's natural to consider just bounded operators, although I'd be interested in results about unbounded operators, too. Also, the question makes perfect sense for any topological vector space; I'm interested in non-Banach spaces, too.

I asked this question on math stackexchange, and Mateusz Wasilewski pointed out that the Weyl–von Neumann–Berg theorem shows that on separable Hilbert space, the "orthogonally diagonalizable" operators (where the $V_\lambda$ are required to be orthogonal) are dense among normal operators (in the norm topology).

Best Answer

Consider the right shift $R([x_1, x_2, \ldots]) = [0, x_1, x_2, \ldots]$ on $\ell^2$. I claim the open ball of radius $1/2$ about $R$ contains no diagonalizable operators.

Let $e_1 = [1,0,\ldots]$. Suppose $B$ is an operator with $\|B\| < \epsilon < 1/2$. For any $x$ we have $\|(R + B)x\| \ge \|Rx\| - \|Bx\| \ge (1-\epsilon)\|x\|$, and since the first coordinate of $Rx$ is $0$, $$ |\langle e_1, (R+B) x\rangle| = |\langle e_1, B x \rangle| \le \|B x\| \le \epsilon \|x\| \le \frac{\epsilon}{1-\epsilon} \|(R+B)x\|. $$ Since $\epsilon/(1-\epsilon) < 1$ while $\langle e_1, e_1\rangle = \|e_1\| = 1$, this implies that $e_1$ is not in the closure of $\mathrm{Ran}(R+B)$. Now any eigenvector of $R+B$ is in $\mathrm{Ran}(R+B)$ (note that $0$ is not an eigenvalue because $\|(R+B)x\| \ge (1-\epsilon)\|x\|$), so the span of the $V_\lambda$ for $R+B$ is not dense.
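The two inequalities in this argument can be checked numerically on a finite truncation (my own sketch, not part of the answer): view the right shift as an isometry from $\mathbb{C}^n$ into $\mathbb{C}^{n+1}$, so that $\|Rx\| = \|x\|$ holds exactly as in the infinite-dimensional case, and perturb it by a random $B$ with $\|B\| < \epsilon$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
eps = 0.4  # any eps < 1/2 works for the argument

# Truncated right shift as an (n+1) x n matrix: an isometry into C^(n+1),
# whose first row is zero, just like the shift on l^2.
R = np.zeros((n + 1, n))
R[1:, :] = np.eye(n)

# A perturbation with operator norm strictly below eps.
B = rng.standard_normal((n + 1, n))
B *= (0.99 * eps) / np.linalg.svd(B, compute_uv=False)[0]

for _ in range(100):
    x = rng.standard_normal(n)
    y = (R + B) @ x
    # ||(R+B)x|| >= ||Rx|| - ||Bx|| >= (1 - eps)||x||
    assert np.linalg.norm(y) >= (1 - eps) * np.linalg.norm(x) - 1e-12
    # |<e_1, (R+B)x>| = |<e_1, Bx>| <= eps/(1-eps) ||(R+B)x||
    assert abs(y[0]) <= (eps / (1 - eps)) * np.linalg.norm(y) + 1e-12
```

Both bounds hold with room to spare, which is exactly why $e_1$ stays a fixed distance away from the range of every such perturbation.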
