Functional Analysis – $c_{0}$ Has No Boundedly Complete Basis

banach-spaces, fa.functional-analysis

Recall that a basis $(x_{n})_{n}$ for a Banach space $X$ is called boundedly complete if for every scalar sequence $(a_{n})_{n}$ with $\sup_{n}\|\sum_{i=1}^{n}a_{i}x_{i}\|<\infty$, the series $\sum_{n=1}^{\infty}a_{n}x_{n}$ converges. It is well known that $c_{0}$ has no boundedly complete basis. My question asks for a quantitative version of this fact. More precisely, for a bounded sequence $(x_{n})_{n}$ in $X$, we set $$\textrm{ca}((x_{n})_{n})=\inf_{n}\sup_{k,l\geq n}\|x_{k}-x_{l}\|.$$ Then $(x_{n})_{n}$ is norm-Cauchy if and only if $\textrm{ca}((x_{n})_{n})=0$.
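
For instance, the unit vector basis $(e_{n})_{n}$ of $c_{0}$ together with the scalar sequence $a_{n}=1$ witnesses the well-known fact: the partial sums satisfy $$\Big\|\sum_{i=1}^{n}e_{i}\Big\|_{\infty}=\|(\underbrace{1,\dots,1}_{n},0,0,\dots)\|_{\infty}=1\quad\text{for every }n,$$ yet $\sum_{n=1}^{\infty}e_{n}$ does not converge in $c_{0}$, so $(e_{n})_{n}$ is not boundedly complete.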

Let $(x_{n})_{n}$ be a basis for a Banach space $X$. We set
$$\textrm{bc}((x_{n})_{n})=\sup\Big\{\textrm{ca}((\sum_{i=1}^{n}a_{i}x_{i})_{n})\colon
(\sum_{i=1}^{n}a_{i}x_{i})_{n}\subseteq B_{X}\Big\},$$
where $B_{X}$ is the closed unit ball of $X$. Clearly, $(x_{n})_{n}$ is boundedly complete if and only if $\textrm{bc}((x_{n})_{n})=0$.
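
To illustrate the definition with the example above: for the unit vector basis $(e_{n})_{n}$ of $c_{0}$ and $a_{n}=1$, the partial sums $S_{n}=\sum_{i=1}^{n}e_{i}$ lie in $B_{c_{0}}$ and $$\|S_{k}-S_{l}\|_{\infty}=\Big\|\sum_{i=l+1}^{k}e_{i}\Big\|_{\infty}=1\quad\text{for all }k>l,$$ so $\textrm{ca}((S_{n})_{n})=1$ and hence $\textrm{bc}((e_{n})_{n})\geq 1$. (This is only the lower bound; the equality mentioned below also needs the reverse inequality.)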

I have proved that the $\textrm{bc}$-values of the unit vector basis and the summing basis of $c_{0}$ are both equal to $1$. This leads to the following question.

Question. Is it true that $\textrm{bc}((x_{n})_{n})\geq 1$ for every basis $(x_{n})_{n}$ of $c_{0}$?

Thank you!

Best Answer

Proposition. Let $(x_n)$ be a basis for a Banach space $(X,\|\cdot\|_X)$ that is not boundedly complete. Then $bc(x_n) \ge 1$.

Proof: Let $(Z,\|\cdot\|_Z)$ be the Banach space of all scalar sequences $A=(a_n)$ for which $$ \|A\|_Z = \sup_N \|\sum_{n=1}^N a_n x_n\|_X < \infty. $$ For $N =1,2,\dots$, define the operator $R_N$ on $Z$ by letting the first $N-1$ coordinates of $R_N(A)$ be zero and the remaining coordinates agree with those of $A$ (so $R_1(A)= A$). Let $A$ be any element of $Z$ such that $$ !A! := \limsup_N \|R_N(A)\|_Z >0. $$ That such an $A$ exists is equivalent to the statement that $(x_n)$ is not boundedly complete.
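
As a concrete illustration of these objects (not needed for the proof): take $X=c_{0}$ with the unit vector basis $x_{n}=e_{n}$ and $a_{n}=1$. Then $$\|A\|_{Z}=\sup_{N}\Big\|\sum_{n=1}^{N}e_{n}\Big\|_{\infty}=1,\qquad \|R_{N}(A)\|_{Z}=\sup_{M\geq N}\Big\|\sum_{n=N}^{M}e_{n}\Big\|_{\infty}=1\ \text{ for every }N,$$ so $!A!=1>0$, reflecting the failure of bounded completeness.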

Now if $\|A\|_Z$ were less than or equal to $ !A!$, then we would be done. That need not be true, but to complete the proof it is enough to show that $$ \liminf_N \big(\|R_N(A)\|_Z -\,!R_N(A)!\big) \le 0, $$ which follows from the definition of $!\cdot!$ and the observation that $!R_N(A)! = !A!$ for every $N$: indeed, $\liminf_N \|R_N(A)\|_Z \le \limsup_N \|R_N(A)\|_Z = !A!$.
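
To spell out why $\|A\|_Z \le\, !A!$ would suffice (a step left implicit above): write $S_{N}=\sum_{n=1}^{N}a_{n}x_{n}$. For every $N$ and every $M>N$, $$\sup_{k,l\geq N}\|S_{k}-S_{l}\|_X\ \geq\ \sup_{j\geq M}\Big\|\sum_{n=M}^{j}a_{n}x_{n}\Big\|_X\ =\ \|R_{M}(A)\|_Z,$$ hence $\sup_{k,l\geq N}\|S_{k}-S_{l}\|_X \geq \limsup_{M}\|R_{M}(A)\|_{Z}=\,!A!$ for every $N$, and taking the infimum over $N$ gives $\textrm{ca}((S_{N})_{N})\geq\, !A!$. If $\|A\|_{Z}\leq\, !A!$, then the sequence $(S_{N}/\|A\|_{Z})_{N}$ lies in $B_{X}$ and has $\textrm{ca}\geq\, !A!/\|A\|_{Z}\geq 1$, whence $\textrm{bc}((x_{n})_{n})\geq 1$.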

Added 4/15/22.

Look at it this way. Since $!A! > 0$, we may divide by $!A!$ and assume WLOG that $!A! = 1$. Now maybe $\|R_1(A)\|_Z$ is much larger than one, but there is $N_0$ so that $\|R_N(A)\|_Z$ is less than $1+\epsilon$ for all $N\ge N_0$. By replacing $A$ with $R_{N_0}(A)$, we can assume WLOG that $\|R_N(A)\|_Z$ is less than $1+\epsilon$ for all $N$ ("WLOG" because $!A!=!R_N(A)!$ for all $N$). This means that ALL partial sums $\sum_{n=k}^j a_n x_n$ in $X$, with any starting point, have norm at most $1+\epsilon$, while arbitrarily far out there are partial sums $\sum_{n=k}^j a_n x_n$ whose norm in $X$ is at least $1-\epsilon$.
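
Spelling out the conclusion from here: rescale $A$ by $1/(1+\epsilon)$. Then every partial sum $S_{N}=\sum_{n=1}^{N}a_{n}x_{n}$ lies in $B_{X}$, while for every $N$ there are $l>k\geq N$ with $\|S_{l}-S_{k}\|_{X}\geq \frac{1-\epsilon}{1+\epsilon}$. Hence $$\textrm{ca}((S_{N})_{N})\geq \frac{1-\epsilon}{1+\epsilon}\qquad\text{and so}\qquad \textrm{bc}((x_{n})_{n})\geq \frac{1-\epsilon}{1+\epsilon}.$$ Since $\epsilon>0$ was arbitrary, $\textrm{bc}((x_{n})_{n})\geq 1$, proving the proposition.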