Functional Analysis – Are (x_n-x_{n-1}) and (x_1+…+x_n) Schauder Bases?

functional-analysis, schauder-basis

My friend and I were talking about challenging problems as we prepared for our finals. She suggested three intriguing ones that caught my interest, and I thought I would share my solutions to them. Feel free to propose alternative solutions.

Let $\{x_n\}_{n\in \mathbb{N}}$ be a Schauder basis in a Banach space $X$.

  1. Find a counterexample to the claim that $\{x_n-x_{n-1}\}_{n\in \mathbb{N}}$ is a Schauder basis.

  2. If $\lVert x_n \rVert=1$ and there exists $f\in X^*$ such that $f(x_n)=1$ for all $n$, prove $\{x_n-x_{n-1}\}_{n\in \mathbb{N}}$ is a Schauder basis.

  3. Take $c_0$, the space of sequences converging to zero with the $\lVert \cdot \rVert_\infty$ norm. If $\{x_n\}$ is the canonical basis, prove that $\{x_1+x_2+…+x_n\}_{n\in \mathbb{N}}$ is a Schauder basis.

Best Answer


Take any separable Hilbert space ($\ell^2$, for example) and a maximal orthonormal set (which must be countable, because our space is separable). This orthonormal set is a Schauder basis. Furthermore, the following series converges, since its coefficients are square-summable:

$$S=\sum_{j=1}^\infty \frac{x_j}{j} \leftarrow S_n=\sum_{j=1}^n \frac{x_j}{j}=\frac{1}{n}(x_n-x_{n-1})+\left(\frac{1}{n}+\frac{1}{n-1}\right) (x_{n-1}-x_{n-2})+...+\left(\frac{1}{n}+\frac{1}{n-1}+...+\frac{1}{2}\right)(x_2-x_1)+\left(\frac{1}{n}+\frac{1}{n-1}+...+\frac{1}{2}+1\right)x_1 $$

Suppose, by way of contradiction, that $\{x_n-x_{n-1}\}_{n\in\mathbb{N}}$ is a Schauder basis. Then the coordinate functional $e_1^*$ associated with the first basis element must be continuous, and we have:

$$\frac{1}{n}+\frac{1}{n-1}+...+\frac{1}{2}+1=e_1^*(S_n) \rightarrow e_1^*(S)<\infty$$

But the left-hand side is the harmonic number $H_n$, which diverges, a contradiction.
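This can be illustrated numerically. The sketch below (my own illustration, not part of the original argument) uses $\mathbb{R}^N$ with the standard basis as a stand-in for the orthonormal $x_n$: expanding $S_n$ in the difference system $\{x_1,\,x_2-x_1,\,x_3-x_2,\dots\}$ forces the first coefficient to be the harmonic number $H_n$, which blows up even though $S_n$ converges in $\ell^2$.

```python
def difference_basis_coeffs(alphas):
    """Given alphas with S = sum_k alpha_k x_k (finite sum), return c_1..c_n
    such that S = c_1 x_1 + sum_{m>=2} c_m (x_m - x_{m-1}).
    Matching the coefficient of x_k gives alpha_k = c_k - c_{k+1}
    (with c_{n+1} = 0), so c_k = sum_{j>=k} alpha_j by backward recursion."""
    n = len(alphas)
    c = [0.0] * (n + 1)              # c[n] plays the role of c_{n+1} = 0
    for k in range(n - 1, -1, -1):
        c[k] = alphas[k] + c[k + 1]
    return c[:n]

# For S_n = sum_{j<=n} x_j / j the first coefficient matches H_n (up to
# rounding), growing like log n while ||S - S_n|| tends to 0.
for n in (10, 100, 1000):
    c = difference_basis_coeffs([1.0 / j for j in range(1, n + 1)])
    H_n = sum(1.0 / j for j in range(1, n + 1))
    print(n, c[0], H_n)
```

The backward recursion is just the Abel summation in the display above, read coefficient by coefficient.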


  2. Suppose $\{x_n\}$ is a Schauder basis. Then for every $x\in X$ there are coefficients $\alpha_n$ such that:

$$x \leftarrow S_n=\sum_{j=1}^n \alpha_j x_j=\alpha_n(x_{n}-x_{n-1})+(\alpha_n+\alpha_{n-1})(x_{n-1}-x_{n-2})+...+(\alpha_n+\alpha_{n-1}+...+\alpha_1)(x_{1}) $$

This is a bit troubling because the coefficients all change as we increase $n$. But they change in a predictable fashion: each step adds one more $\alpha$ term to every coefficient. Furthermore, because $f$ is continuous and $f(x_k)=1$ for all $k$, the series $f(x)=f(\sum_n \alpha_n x_n)=\sum_n \alpha_n$ converges. Thus we are entitled to consider:

$$\hat{S}_n=\left(\sum_{k=1}^\infty \alpha_k\right)x_1+\left(\sum_{k=2}^\infty \alpha_k\right)(x_2-x_1)+...+\left(\sum_{k=n}^\infty \alpha_k\right)(x_n-x_{n-1})$$

Does $\hat{S}_n$ approach $x$? It does: using $\lVert x_n \rVert=1$ and the fact that the tail $\sum_{k=n}^\infty \alpha_k$ of a convergent series tends to zero,

$$\lVert \hat{S}_n-x\rVert=\left\lVert \alpha_1 x_1+\alpha_2x_2+...+\alpha_{n-1}x_{n-1}+\left(\sum_{k=n}^\infty \alpha_k\right)x_n-x\right\rVert\leq \lVert S_{n-1}-x\rVert+\left| \sum_{k=n}^\infty \alpha_k \right|\rightarrow 0$$
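The summation-by-parts identity driving this step can be sanity-checked numerically. The sketch below (an illustration of mine, with random vectors in $\mathbb{R}^d$ standing in for the $x_j$) verifies that $\sum_{j=1}^n \alpha_j x_j$ equals $(\alpha_n+\dots+\alpha_1)x_1+\sum_{m=2}^n(\alpha_n+\dots+\alpha_m)(x_m-x_{m-1})$:

```python
import random

random.seed(0)
n, d = 8, 5
xs = [[random.uniform(-1, 1) for _ in range(d)] for _ in range(n)]
a = [random.uniform(-1, 1) for _ in range(n)]

# Left-hand side: sum_{j=1}^n a_j x_j, coordinate by coordinate.
lhs = [sum(a[j] * xs[j][i] for j in range(n)) for i in range(d)]

def tail(m):
    # a_m + a_{m+1} + ... + a_n  (a is 0-indexed, m is 1-based)
    return sum(a[m - 1:])

# Right-hand side: (a_n+...+a_1) x_1 + sum_{m>=2} (a_n+...+a_m)(x_m - x_{m-1}).
rhs = [tail(1) * xs[0][i] for i in range(d)]
for m in range(2, n + 1):
    for i in range(d):
        rhs[i] += tail(m) * (xs[m - 1][i] - xs[m - 2][i])

assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
print("identity verified")
```

Replacing the finite tails $\alpha_n+\dots+\alpha_m$ by the infinite tails $\sum_{k=m}^\infty \alpha_k$ is exactly the passage from $S_n$ to $\hat{S}_n$ above.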

We still need to verify uniqueness of the representation. Suppose $\sum_{n=1}^\infty \alpha_n (x_n-x_{n-1})=\sum_{n=1}^\infty \beta_n (x_n-x_{n-1})$. Because each coordinate functional $x^*_n$ is continuous (indeed, the $x_n$ form a Schauder basis), we get $\alpha_n-\alpha_{n-1}=\beta_n-\beta_{n-1}$ for all $n\geq 2$. Summing these equations telescopes to:

$$\alpha_n-\alpha_1=\beta_n-\beta_1\quad \forall n\in \mathbb{N}$$

However, applying $f$ to both sums yields $\alpha_1=f(\sum_{n=1}^\infty \alpha_n (x_n-x_{n-1}))=f(\sum_{n=1}^\infty \beta_n (x_n-x_{n-1}))=\beta_1$, since $f(x_n-x_{n-1})=0$ for every $n\geq 2$, and this is enough to conclude that we really have a Schauder basis.


  3. Take $x\in c_0$ and write it as the following limit:

$$ x \leftarrow S_n=\sum_{j=1}^n \alpha_j x_j= \alpha_n(x_1+...+x_n)+(\alpha_{n-1}-\alpha_n)(x_1+...+x_{n-1})+...+(\alpha_2-\alpha_3)(x_1+x_2)+(\alpha_1-\alpha_2)x_1 $$

Define $\hat{S}_n$ to be equal to:

$$\hat{S}_n=(\alpha_{n}-\alpha_{n+1})(x_1+...+x_n)+(\alpha_{n-1}-\alpha_n)(x_1+...+x_{n-1})+...+(\alpha_2-\alpha_3)(x_1+x_2)+(\alpha_1-\alpha_2)x_1$$

But in this case, we have that:

$$\lVert x-\hat{S}_n\rVert_\infty=\left\lVert \sum_{j=n+1}^\infty \alpha_j x_{j}+\sum_{j=1}^n \alpha_{n+1}x_j \right\rVert_\infty=\sup_{j\geq n+1}|\alpha_j|\rightarrow 0$$
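The coordinate bookkeeping behind this display can be checked on a truncated model. The sketch below (my illustration, with $\alpha_j = 1/j$ as a concrete null sequence and a finite list of coordinates standing in for $c_0$) computes $\hat{S}_n$ and confirms that the sup-norm error is exactly $\sup_{j\geq n+1}|\alpha_j|$:

```python
N = 200                                  # truncation length (model assumption)
alpha = [1.0 / j for j in range(1, N + 1)]   # a sequence tending to zero

def s_hat(n):
    # S-hat_n = sum_{m=1}^n (alpha_m - alpha_{m+1})(x_1 + ... + x_m);
    # its j-th coordinate (j <= n) telescopes to alpha_j - alpha_{n+1},
    # and coordinates beyond n are zero.  (0-indexed: alpha[n] is alpha_{n+1}.)
    return [alpha[j] - alpha[n] if j < n else 0.0 for j in range(N)]

n = 10
err = max(abs(alpha[j] - s_hat(n)[j]) for j in range(N))
assert abs(err - max(abs(a) for a in alpha[n:])) < 1e-12
print(err)   # equals alpha_{n+1} = 1/11 here
```

Inside the model the error coordinates are $\alpha_{n+1}$ (for $j\leq n$) and $\alpha_j$ (for $j>n$), matching the middle expression of the display term by term.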

To prove uniqueness of representation, suppose $\sum_j (\alpha_j-\alpha_{j+1}) \hat{x}_j=\sum_j \beta_j \hat{x}_j$, where $\hat{x}_j=x_1+...+x_j$. There is $N_0$ such that for $N\geq N_0$ both partial sums are within $\varepsilon/2$ of the common limit; comparing their $j$-th coordinates then gives:

$$|\beta_j+\beta_{j+1}+...+\beta_N-\alpha_j+\alpha_{N+1}| \leq \left \lVert \sum_{m=1}^N (\alpha_{m}-\alpha_{m+1}) \hat{x}_m-\sum_{m=1}^N \beta_m \hat{x}_m \right \rVert_\infty<\varepsilon \quad \forall N \geq N_0$$

Because $\alpha_{N+1}\rightarrow 0$ and $\varepsilon$ is arbitrary, letting $N\rightarrow\infty$ forces, for every $j$:

$$\sum_{k=j}^\infty \beta_k =\alpha_j$$

In other words, $\beta_j=\alpha_j-\alpha_{j+1}$, and uniqueness follows.
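The tail identity $\sum_{k=j}^\infty \beta_k=\alpha_j$ with $\beta_j=\alpha_j-\alpha_{j+1}$ is a telescoping sum, which a quick truncated computation confirms (again with the illustrative choice $\alpha_j=1/j$; the discrepancy is just the truncation tail $\alpha_{N+1}$):

```python
N = 10000
alpha = [1.0 / j for j in range(1, N + 2)]           # alpha_1 .. alpha_{N+1}
beta = [alpha[j] - alpha[j + 1] for j in range(N)]   # beta_j = alpha_j - alpha_{j+1}

for j in (0, 4, 49):                                  # 1-based j = 1, 5, 50
    tail_sum = sum(beta[j:])                          # telescopes to alpha_j - alpha_{N+1}
    assert abs(tail_sum - alpha[j]) < 1e-3            # off only by alpha_{N+1} ~ 1e-4
print("tails match")
```

So recovering the $\alpha_j$ from the $\beta_j$, and vice versa, is a stable, purely telescoping operation, which is what the uniqueness argument above exploits.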
