Real Analysis – Inductive Proof for the Heine-Borel Theorem in $\mathbb{R}^n$

induction, metric-spaces, real-analysis, solution-verification

The Heine-Borel theorem for $\mathbb{R}^n$ states that a subset $K\subset\mathbb{R}^n$ is closed and bounded if and only if it is compact. (A set is said to be compact if every open cover of the set has a finite subcover.) The standard method to show that closed- and boundedness implies compactness starts by enclosing $K$ in a box, and continually bisecting it until some or other contradiction is obtained. I have an alternative approach, but I am not entirely sure it works, and I would like to know whether it could be extended to an arbitrary metric space (by replacing closed and bounded with sequentially compact). Beyond that, I think it's interesting that an inductive proof exists, so here it is:

Firstly, it suffices to prove that every closed ball centered at $0$ is compact: a closed and bounded set $K$ can always be enclosed in such a closed ball $U(r)$ of radius $r$, and appending the open set $\mathbb{R}^n \setminus K$ to any open cover of $K$ produces a cover of $U(r)$; extracting a finite subcover of this new cover and discarding $\mathbb{R}^n \setminus K$ then yields a finite subcover of the original cover of $K$.
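In symbols (writing $G$ for an arbitrary open cover of $K$ and $G''$ for a finite subfamily of $G$; the letter $G''$ is used only in this remark): since $K$ is closed, $\mathbb{R}^n \setminus K$ is open, and

$$K \subseteq U(r) \subseteq \Bigl(\bigcup G\Bigr) \cup \bigl(\mathbb{R}^n \setminus K\bigr), \qquad U(r) \subseteq \Bigl(\bigcup G''\Bigr) \cup \bigl(\mathbb{R}^n \setminus K\bigr) \;\Longrightarrow\; K \subseteq \bigcup G''.$$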

The base case is trivial since $\mathbb{R}^{0}=\{0\}$ has only compact subsets. If we then assume that this theorem holds in $\mathbb{R}^k$, we will show that it holds in $\mathbb{R}^{k+1}$.

Now let $U(r)\subseteq \mathbb{R}^{k+1}$ be the closed ball of radius $r$ centered at the origin, and let $G$ be an open cover of $U(r)$. We say that $l\in \mathbb{R}^+$ is reached if there is some finite subset $G'$ of $G$ such that

$$U(l) \subseteq \bigcup G' $$

Now define $w$ as the supremum of those $l\leq r$ which are reached. Assume that $w<r$; we will derive a contradiction. Since

$$\partial U(w) \subseteq U(r) \subseteq \bigcup G $$

we have that $\partial U(w)$ is covered by $G$. But $\partial U(w)$ is a $k$-dimensional sphere, so its upper (inclusive) hemisphere is homeomorphic to the closed ball in $\mathbb{R}^k$, which is compact by the induction hypothesis; hence there is a finite subset of $G$ which covers the upper hemisphere of $\partial U(w)$. The same is of course true for the lower hemisphere, so there is a finite subset of $G$, call it $G_0$, which covers $\partial U(w)$. It is then easy to show that $G_0\cup G'$, where $G'$ is a suitable finite subset of $G$, covers an open ball of radius larger than $w$ (the proof of this is included at the end), and thus it covers a closed ball of radius larger than $w$, which is a contradiction, since $w$ was assumed to be the supremum of the reached radii.
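For concreteness, one explicit homeomorphism (a standard choice, not the only one) between the upper (inclusive) hemisphere and the closed ball of radius $w$ in $\mathbb{R}^k$ is vertical projection:

$$\{x \in \partial U(w) : x_{k+1} \geq 0\} \to \{y \in \mathbb{R}^k : \lVert y \rVert \leq w\}, \qquad (x_1,\dots,x_k,x_{k+1}) \mapsto (x_1,\dots,x_k),$$

with continuous inverse $y \mapsto \left(y, \sqrt{w^2 - \lVert y \rVert^2}\right)$.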

My questions are: is this a known proof, and is it correct?

Proof that an open ball of radius larger than $w$ is covered by a finite subset of $G$:

To see why, it is sufficient to show that there is some $\delta>0$ such that for any $x\in\partial U(w)$, the open ball of radius $\delta$ around $x$ is contained in some member of $G_0$. Suppose this is not true. Then for every $\delta>0$ there is some $x\in\partial U(w)$ such that the open ball of radius $\delta$ around $x$ is not contained in any member of $G_0$; taking in particular $\delta_n=\frac{1}{n}$, we obtain a sequence $(x_n)$ with elements in $\partial U(w)$. By sequential compactness (in a metric space, compactness implies sequential compactness), some subsequence of $(x_n)$ converges to a point $x_0\in\partial U(w)$. This yields a contradiction: since $G_0$ is an open cover of $\partial U(w)$, the point $x_0$ lies in some member of $G_0$, which contains an open ball of some radius $\delta_0>0$ around $x_0$. Eventually the terms $x_n$ of that subsequence lie within $\frac{\delta_0}{2}$ of $x_0$ and satisfy $\frac{1}{n}<\frac{\delta_0}{2}$, so the open ball of radius $\frac{1}{n}$ around such an $x_n$ is contained in the ball of radius $\delta_0$ around $x_0$, and hence in a member of $G_0$, contradicting the choice of $x_n$.

So there is some $\delta>0$ such that for every point of $\partial U(w)$, the open ball of radius $\delta$ around that point is contained in some member of $G_0$. A routine application of the triangle inequality then shows that for any $|\varepsilon|\leq \frac{\delta}{2}$ we have

$$\partial U(w+\varepsilon) \subseteq \bigcup G_0$$
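One way to make the "routine application of the triangle inequality" explicit (shrinking $\delta$ if necessary so that $\delta < 2w$, which ensures $x \neq 0$ below): if $\lVert x \rVert = w+\varepsilon$ with $|\varepsilon| \leq \frac{\delta}{2}$, then $x' = \frac{w}{\lVert x \rVert}\,x$ lies on $\partial U(w)$ and

$$\lVert x - x' \rVert = \bigl|\,\lVert x \rVert - w\,\bigr| = |\varepsilon| \leq \frac{\delta}{2} < \delta,$$

so $x$ belongs to the open ball of radius $\delta$ around $x'$, which is contained in some member of $G_0$.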

Since $w$ is the supremum of the reached radii and $w-\frac{\delta}{2}<w$, some radius greater than $w-\frac{\delta}{2}$ is reached, so there is a finite subset $G'\subseteq G$ such that

$$U\left(w-\frac{\delta}{2}\right)\subseteq \bigcup G' $$

So $G'\cup G_0$ covers $U\left(w+\frac{\delta}{2}\right)$ (every point of this ball either has norm at most $w-\frac{\delta}{2}$ or lies on $\partial U(w+\varepsilon)$ for some $|\varepsilon|\leq\frac{\delta}{2}$), which is the contradiction we required in the original proof.

Best Answer

You correctly state

The standard method to show that closed- and boundedness implies compactness starts by enclosing $K$ in a box, and continually bisecting it until some or other contradiction is obtained.

Your alternative proof is absolutely correct. Its essential part is the "Proof that an open ball of radius larger than $w$ is covered by a finite subset of $G$". This is based on the fact that in a metric space compactness is equivalent to sequential compactness. Although this is true, it seems a bit unsatisfactory because you invoke an argument of a different nature (not an open-cover argument). We shall see later that this is essentially a variant of the tube lemma.

The tube lemma is easily proved by considering open covers. It is used to prove (inductively) that finite products of compact spaces are compact.
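For reference, a standard formulation of the tube lemma, in the form it is used below (the letters $K$, $X$, $N$, $x_0$, $W$ are placeholders rather than notation from the proofs above), is: if $K$ is a compact space, $X$ is any space, and $N \subseteq K \times X$ is an open set containing the slice $K \times \{x_0\}$, then there is an open neighborhood $W$ of $x_0$ in $X$ such that

$$K \times W \subseteq N.$$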

A variant of your proof is to show that the cubes $[-r,r]^n$ are compact. Step 1 is to prove that $[-r,r]$ is compact; step 2 is to use induction.

There are various proofs of the compactness of closed intervals $[a,b]$, and one of them uses your "reached" argument, which is very easy in this case. See How to prove every closed interval in R is compact?
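A rough sketch of that one-dimensional argument (only an outline; the details are in the linked question): given an open cover $G$ of $[a,b]$, set

$$w = \sup\{\, l \in [a,b] : [a,l] \text{ is covered by a finite subset of } G \,\}.$$

Some member of $G$ contains an interval $(w-\eta, w+\eta)$ around $w$, and adjoining that member to a finite subcover of $[a, w-\tfrac{\eta}{2}]$ covers $[a, \min(w+\tfrac{\eta}{2}, b)]$; this forces $w = b$ and shows that $b$ itself is "reached".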

Let us modify your proof and avoid the use of a sequential compactness argument.

$\partial U(w)$ is a $k$-dimensional sphere, and you correctly argue that it is compact by the induction hypothesis (it is the union of the upper and lower hemispheres, each of which is homeomorphic to the closed ball in $\mathbb R^k$). Thus $\partial U(w)$ is covered by a finite subset $G_0$ of $G$. The union of the elements of $G_0$ is an open neighborhood $U$ of $\partial U(w)$. We wish to show that $U$ contains a spherical shell $S(w,\delta) = \{ x \in \mathbb R^{k+1} \mid w -\delta < \lVert x \rVert < w + \delta \}$. If we know this, we pick $l$ such that $w -\delta < l < w$. Then $U(l)$ is covered by a finite subset $G_l$ of $G$. The set $G' = G_0 \cup G_l$ is finite and covers the open ball of radius $w+\delta$, and in particular $U\left(w+\frac{\delta}{2}\right)$, which contradicts the definition of $w$.

How to find $\delta$? Consider the homeomorphism $$h : \mathbb R^{k+1} \setminus \{0\} \to S^k \times (0,\infty), h(x) = \left(\frac{x}{\lVert x \rVert}, \lVert x \rVert\right)$$ whose inverse is given by $$h^{-1}(y,t) = ty.$$ Then $h(U)$ is an open neighborhood of $S^k \times \{w\}$. By the tube lemma there is $\delta > 0$ such that $V = S^k \times (w-\delta, w+\delta) \subset h(U)$. Then $h^{-1}(V) \subset U$ is the desired spherical shell $S(w,\delta)$.
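The last step can also be checked by hand (shrinking $\delta$ if necessary so that $\delta < w$, which keeps the interval inside $(0,\infty)$): since $\lVert t y \rVert = t$ for $y \in S^k$ and $t > 0$,

$$h^{-1}(V) = \{\, t y : y \in S^k,\ w-\delta < t < w+\delta \,\} = \{\, x \in \mathbb R^{k+1} : w-\delta < \lVert x \rVert < w+\delta \,\} = S(w,\delta).$$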