[Math] Prove that $\ell ^1(\mathbb{N})$ is a Banach space.

banach-spaces, lp-spaces, real-analysis

I'm trying to prove that $\ell^1(\mathbb{N}) := \left\{ (x_n)_{n=1}^{\infty} : \sum\limits_{n=1}^{\infty}\left|x_n\right| < \infty \right\} $, the space of all absolutely summable sequences over the field $\mathbb{C}$, is a Banach space with respect to the norm $\left|\left|(x_n)\right|\right|_{1} := \sum\limits_{n=1}^{\infty}\left|x_n\right|$.

Proof thus far:

Let $\left(X^{(l)}\right)_{l=1}^{\infty} \subset \ell^1(\mathbb{N})$ denote a Cauchy sequence, where each term $X^{(l)} = \left(x^{(l)}_n\right)_{n=1}^{\infty}$ is itself an absolutely summable sequence.

That is, $\forall ~ \epsilon > 0, \exists ~ N \in \mathbb{N} : \left|\left| X^{(l)} - X^{(m)}\right|\right|_{1} < \epsilon \quad \forall ~ l,m \ge N $.

By definition of the norm, $\left|\left| X^{(l)} - X^{(m)}\right|\right|_{1} = \sum\limits_{k=1}^{\infty} \left|x^{(l)}_k - x^{(m)}_k\right| < \epsilon$ for all $l, m \ge N$.

So, $\left| \sum\limits_{k=1}^{\infty} \left(x^{(l)}_k - x^{(m)}_k\right) \right| \leq \sum\limits_{k=1}^{\infty} \left|x^{(l)}_k - x^{(m)}_k\right| < \epsilon$.

We then have $\forall ~ \epsilon > 0, \exists ~ N \in \mathbb{N} : \left| \sum\limits_{k=1}^{\infty} \left(x^{(l)}_k - x^{(m)}_k\right)\right| < \epsilon \quad \forall ~ l,m \ge N $.

Each coordinate of the Cauchy sequence gives a Cauchy sequence in $\mathbb{C}$: for fixed $k$, $\left|x^{(l)}_k - x^{(m)}_k\right| \le \left|\left| X^{(l)} - X^{(m)}\right|\right|_{1} < \epsilon$, so each $\left(x^{(l)}_k\right)_{l=1}^{\infty}$ must converge (since $\mathbb{C}$ is a Banach space).

I'm not sure where to go from here, though: how to relate this back to the original Cauchy sequence in $\ell^1(\mathbb{N})$ to show that it converges, or even whether what I've done so far is correct.

Best Answer

Here's the start of the answer. Let $\displaystyle \left\lbrace ( x^{(j)}_n )_{n = 1}^{\infty} \right\rbrace_{j = 1}^{\infty}$ be the given Cauchy sequence of $L^1 (\mathbb{N})$ (usually written $\ell^1$) sequences, so that

$$\forall \epsilon > 0, \exists \, J :\, j, k \geq J \text{ implies } \Vert(x^{(j)}_n) - (x^{(k)}_n)\Vert_1 = \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert < \epsilon.$$

Claim: We know what the limit sequence must be, termwise.

Proof: Fixing a specific index $m$, we clearly have that $\vert x^{(j)}_m - x^{(k)}_m \vert \leq \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert$, so that we conclude:

$$\forall \epsilon > 0, \exists \, J :\, j, k \geq J \text{ implies } \vert x^{(j)}_m - x^{(k)}_m \vert \leq \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert < \epsilon.$$

Hence, $(x_m^{(j)})_{j = 1}^{\infty}$ is a Cauchy sequence of complex numbers and therefore converges. We define the limiting value as $$x^{(\infty)}_m := \lim_{j \to \infty} x_m^{(j)}.$$

Of course, we can repeat for each $m \in \mathbb{N}$. Therefore, our goal is to show that the termwise/"pointwise" limit $(x_n^{(\infty)})_{n = 1}^{\infty}$ is an $L^1(\mathbb{N})$ sequence. EDIT Also, we need to show that $\lbrace (x^{(j)}) \rbrace_{j = 1}^{\infty}$ converges to it. TIDE $\square$
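To visualize the construction (an illustrative diagram, not part of the original argument): each row below is one of the given sequences, and the candidate limit is read off down the columns.

$$\begin{array}{ccccc} (x^{(1)}_n): & x^{(1)}_1 & x^{(1)}_2 & x^{(1)}_3 & \cdots \\ (x^{(2)}_n): & x^{(2)}_1 & x^{(2)}_2 & x^{(2)}_3 & \cdots \\ (x^{(3)}_n): & x^{(3)}_1 & x^{(3)}_2 & x^{(3)}_3 & \cdots \\ \vdots & \downarrow & \downarrow & \downarrow & \\ (x^{(\infty)}_n): & x^{(\infty)}_1 & x^{(\infty)}_2 & x^{(\infty)}_3 & \cdots \end{array}$$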

The remainder is a fairly standard "split the error into digestible pieces each less than $\displaystyle \frac{\epsilon}{2}$ or $\displaystyle \frac{\epsilon}{3}$ or whatever the fraction is," similar to the proof that the uniform limit of continuous functions is continuous. EDIT This paragraph is too simplistic. See below. TIDE Do you require further assistance, or can you take it from here?

P.S.: This method does not really generalize that well outside sequence spaces. The general proof of completeness of $L^1(\mathbb{R})$ requires some use of the Dominated Convergence Theorem, and the fact that completeness is equivalent to "absolute convergence of series implies convergence of series."
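For reference, that equivalence can be stated as follows (a standard fact about normed spaces, quoted here without proof): a normed vector space $X$ is complete if and only if, for every sequence $(v_n)$ in $X$,

$$\sum_{n = 1}^{\infty} \Vert v_n \Vert < \infty \implies \sum_{n = 1}^{\infty} v_n \text{ converges in } X.$$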


EDIT: The middle of the proof. Our goal here is to show that the "limit object" is in the space of $\ell^1$ sequences.

Lemma 1: $M: = \sup_j \Vert (x_n^{(j)}) \Vert_1 < \infty$. (i.e., "Cauchy sequences are bounded").

Proof: Fix $\epsilon = 1$. Then by Cauchy-ness, $$\exists J: j, k \geq J \, \text{ implies } \, \Vert (x_n^{(j)}) - (x_n^{(k)}) \Vert_1 < 1.$$ Fixing $j = J$, then we have that for all $k \geq J$, $$ \begin{align} \Vert (x^{(k)}_n) \Vert_1 & \leq \Vert (x_n^{(k)}) - (x_n^{(J)}) \Vert_1 + \Vert (x_n^{(J)}) \Vert_1 \\ & \leq 1 + \Vert (x_n^{(J)}) \Vert_1 \end{align}$$ Thus, it is clear that the larger of $\displaystyle \max_{1 \leq k < J} \Vert (x_n^{(k)}) \Vert_1$ and $1 + \Vert (x_n^{(J)}) \Vert_1 $ is an upper bound on all $ \Vert (x_n^{(k)}) \Vert_1.$ Hence, the supremum is finite. $\square$

Our next trick is to show that all the partial sums are uniformly bounded.

Claim: For all $N \in \mathbb{N}$, $$S_N := \sum_{n = 1}^N \vert x_n^{(\infty)} \vert \leq M + 1.$$

Proof: Fix $N \in \mathbb{N}$. By the Triangle Inequality, for any choice of $j$, $$ \sum_{n = 1}^N \vert x_n^{(\infty)}\vert \leq \sum_{n = 1}^N \vert x_n^{(\infty)} - x_n^{(j)} \vert + \sum_{n = 1}^N \vert x_n^{(j)}\vert.$$

We need to choose $j$ so that both sums are bounded.

The first sum: Note that for all $n \in \lbrace 1, 2, 3, \dotsc, N \rbrace$, we know that $\displaystyle x^{(\infty)}_n = \lim_{j \to \infty} x_n^{(j)}$. Hence, fixing $\displaystyle \epsilon = \frac{1}{N}$,

$$ \exists J_n : j \geq J_n \text{ implies that } \vert x_n^{(\infty)} - x_n^{(j)} \vert < \frac{1}{N}.$$ Taking $\displaystyle J^* := \max_{1 \leq n \leq N} J_n$, we have that $$ j \geq J^* \text{ implies } \sum_{n = 1}^N \vert x_n^{(\infty)} - x_n^{(j)} \vert < \sum_{n = 1}^N \frac{1}{N} = 1. $$ We will, in fact, for convenience take $j = J^*$, and the first sum is bounded by $1$.

The second sum: Clearly $$\sum_{n = 1}^N \vert x_n^{(J^*)} \vert \leq \sum_{n = 1}^{\infty} \vert x_n^{(J^*)} \vert = \Vert (x_n^{(J^*)}) \Vert_1.$$ Yet this sum, by Lemma 1, is bounded by $M$.

Combining the above results, we get that $S_N \leq 1 + M$, as required. $\square$

Claim: $(x_n^{(\infty)})$ is in $L^1(\mathbb{N})$.

Proof Sketch: Clearly $S_N$ is increasing in $N$, yet bounded above independently of $N$. As soon as you see that $S_N$ converges, then by definition of infinite sums of real numbers, $\sum_{n = 1}^{\infty} \vert x_n^{(\infty)} \vert$ exists and equals $\displaystyle \lim_{N \to \infty} S_N$. So $(x_n^{(\infty)})$ will be in $L^1(\mathbb{N})$.

This just boils down to showing that $\displaystyle \lim_{N \to \infty} S_N = \sup_N S_N ( < \infty)$, which is a basic exercise in limits of real numbers. I will not go into details here unless requested. $\square$
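In case it is wanted, the omitted exercise can be written out in two lines (a standard supremum argument, added here for completeness): let $S := \sup_N S_N$, which is finite by the previous claim. Given $\epsilon > 0$, by definition of the supremum there exists $N_0$ with $S_{N_0} > S - \epsilon$, and since $S_N$ is increasing,

$$S - \epsilon < S_{N_0} \leq S_N \leq S \quad \text{for all } N \geq N_0,$$

so $S_N \to S$ as $N \to \infty$.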

Therefore, the desired "limit object" is in the space!


Now we must check that $\lbrace (x_n^{(j)}) \rbrace_{j = 1}^{\infty}$ actually converges to it. To do so, we need another lemma, showing that as $j$ gets large, the "tails" of the sums are uniformly small.

Lemma 2: Fix $\epsilon > 0$, and fix $J = J(\epsilon)$ such that $$j, k \geq J(\epsilon) \text{ implies } \Vert(x^{(j)}_n) - (x^{(k)}_n)\Vert_1 = \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(k)}_n \vert < \frac{\epsilon}{2}.$$

Then there exists $N = N(\epsilon, J(\epsilon))$ such that for all $j \geq J(\epsilon)$, \begin{equation}\sum_{n = N + 1}^{\infty} \vert x^{(j)}_n \vert < \epsilon. \tag{*}\end{equation}

Proof: We argue by contradiction. Suppose that the above conclusion fails. Then for all $N$, there exists $k \geq J(\epsilon)$ such that $\sum_{n = N + 1}^{\infty} \vert x^{(k)}_n \vert \geq \epsilon$.

Now, for $j = J(\epsilon)$, the sequence $(x^{(J(\epsilon))}_n)$ is in $L^1(\mathbb{N})$, so there exists $N_1$ such that $$\sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n \vert < \frac{\epsilon}{2}.$$

Yet by the failure of the above condition, letting $N = N_1$, there exists $k \geq J(\epsilon)$ such that $$\sum_{n = N_1 + 1}^{\infty} \vert x^{(k)}_n \vert \geq \epsilon$$

Therefore, we see that $$\begin{align} \sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert &\geq \sum_{n = N_1 + 1}^{\infty} \left( \vert x^{(k)}_n \vert - \vert x^{(J(\epsilon))}_n \vert \right) \\ & = \sum_{n = N_1 + 1}^{\infty} \vert x^{(k)}_n \vert - \sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n \vert\\ & > \epsilon - \frac{\epsilon}{2} = \frac{\epsilon}{2}. \end{align}$$

Yet $$\sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert \leq \sum_{n = 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert \, = \Vert (x^{(J(\epsilon))}_n) - (x^{(k)}_n) \Vert_1,$$ and by definition of $J(\epsilon)$, since $j = J(\epsilon)$ and $k$ are both at least $J(\epsilon)$, $$\Vert (x^{(J(\epsilon))}_n) - (x^{(k)}_n) \Vert_1 < \frac{\epsilon}{2}.$$

So in short, $$\frac{\epsilon}{2} < \sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n - x^{(k)}_n \vert < \frac{\epsilon}{2}.$$ Contradiction. $\square$
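(As an aside, not part of the original answer: Lemma 2 also admits a short direct proof. Choose $N_1$, as above, so that the tail of $(x^{(J(\epsilon))}_n)$ past $N_1$ sums to less than $\frac{\epsilon}{2}$. Then for all $j \geq J(\epsilon)$,

$$\sum_{n = N_1 + 1}^{\infty} \vert x^{(j)}_n \vert \leq \sum_{n = N_1 + 1}^{\infty} \vert x^{(j)}_n - x^{(J(\epsilon))}_n \vert + \sum_{n = N_1 + 1}^{\infty} \vert x^{(J(\epsilon))}_n \vert < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon,$$

since the first sum is at most $\Vert (x^{(j)}_n) - (x^{(J(\epsilon))}_n) \Vert_1 < \frac{\epsilon}{2}$.)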

[Recommendation: Now you are in a position to do an $\displaystyle \frac{\epsilon}{2}$-style proof. Try doing it yourself before finishing the answer.]

Claim: $(x_n^{(j)})$ converges to $(x_n^{(\infty)})$ in $\Vert \cdot \Vert_1$.

Proof: Let $\epsilon > 0$. By Lemma 2, with $\displaystyle \frac{\epsilon}{4}$ in place of $\epsilon$, there exists $N_1$ such that for all $j \geq J(\epsilon /4)$, $$\sum_{n = N_1 + 1}^{\infty} \vert x^{(j)}_n \vert < \frac{\epsilon}{4}.$$

Since $(x_n^{(\infty)})$ is in $L^1$, there exists $N_2$ such that $$\sum_{n = N_2 + 1}^{\infty} \vert x^{(\infty)}_n \vert < \frac{\epsilon}{4}.$$

Therefore, for all $j \geq J(\epsilon/4)$, and for $N = \max \lbrace N_1, N_2 \rbrace$, $$\begin{align} \sum_{n = N + 1}^{\infty} \vert x^{(j)}_n - x^{(\infty)}_n \vert &\leq \sum_{n = N + 1}^{\infty} \left( \vert x^{(j)}_n \vert + \vert x^{(\infty)}_n \vert \right) \\ & = \sum_{n = N + 1}^{\infty} \vert x^{(j)}_n \vert + \sum_{n = N + 1}^{\infty} \vert x^{(\infty)}_n \vert\\ & \leq \sum_{n = N_1 + 1}^{\infty} \vert x^{(j)}_n \vert + \sum_{n = N_2 + 1}^{\infty} \vert x^{(\infty)}_n \vert\\ & < \frac{\epsilon}{4} + \frac{\epsilon}{4} = \frac{\epsilon}{2}. \end{align}$$

Moreover, since $x_n^{(j)} \to x_n^{(\infty)}$ for each fixed $n$ as $j \to \infty$, by similar logic to the "The first sum" section above, there exists $J_2$ such that $j \geq J_2$ implies that $$\sum_{n = 1}^N \vert x^{(j)}_n - x^{(\infty)}_n \vert < \frac{\epsilon}{2}.$$

Therefore, for $j \geq J^* = \max \lbrace J(\epsilon/4), J_2 \rbrace$, $$\begin{align} \Vert (x^{(j)}) - (x^{(\infty)}) \Vert_1 & = \sum_{n = 1}^{\infty} \vert x^{(j)}_n - x^{(\infty)}_n \vert \\ & = \sum_{n = 1}^{N} \vert x^{(j)}_n - x^{(\infty)}_n \vert + \sum_{n = N + 1}^{\infty} \vert x^{(j)}_n - x^{(\infty)}_n \vert \\ & < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon. \end{align}$$ This works for all $\epsilon > 0$, so we have convergence. $\square$
