Best approximation to reflexive subspace – converse

functional-analysis, normed-spaces, reflexive-space

Let $X$ be a normed vector space.

I know that if $X$ is reflexive, then for every normed space $Y$ containing $X$ as a subspace, every $y \in Y$ has a best approximation in $X$ (which is to say there exists $x\in X$ such that $d(y, X) = \|x-y\|$).

I need to prove the converse of the above fact. To be more precise: if for every normed space $Y$ with $X \subset Y$, every $y \in Y$ has a best approximation in $X$, is it true that $X$ is reflexive?

Best Answer

$\newcommand{\dist}{\text{dist}} \newcommand{\R}{\mathbb R} \newcommand{\diam}{\text{diam}} \newcommand{\and}{\quad \text{and} \quad} $The conjecture proposed by the OP is true: every non-reflexive space may be embedded into a larger space $Y$ in such a way that the minimization problem $$ \|y-x_0\| = \inf\{\|y-x\| : x \in X\} $$ admits no solution for some $y$ in $Y$. Proving this conjecture obviously entails constructing the larger space $Y$, so it is necessarily a somewhat long argument.

Lemma. Let $X$ be a normed space and let $C$ be any nonempty convex subset of $X$ whose diameter satisfies $$ \diam(C)\leq 2. $$ Then $X\oplus \R$ becomes a normed space if equipped with the norm $$ \|(x,a)\| = |a| + \dist(x,aC),\quad \forall x\in X,\quad \forall a\in \R. $$ In addition, the correspondence $x\mapsto (x, 0)$ defines an isometry from $X$ onto a closed hyperplane of $X\oplus \R$.
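To get a feel for this norm, here is a minimal concrete instance (an illustration only, not part of the argument): take $X = \R$ and $C = \{1\}$, a convex set of diameter $0$. The recipe gives $$ \|(x,a)\| = |a| + \dist(x, a\{1\}) = |a| + |x-a|, $$ which is easily checked to be a norm on $\R\oplus\R$, and it restricts to $\|(x,0)\| = |x|$ on the hyperplane $a=0$, as the Lemma asserts.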

Proof. It is obvious that $\|(x,0)\| = \|x\|$, so the isometry claim is clear. We leave it to the reader to prove that $$ \|\lambda (x, a)\| = |\lambda |\|(x, a)\|, \and \|(x, a)\| = 0 \Rightarrow (x, a)=0, $$ and we concentrate instead on proving the triangle inequality $$ \|(x_1+x_2,a_1+a_2)\| \leq \|(x_1,a_1)\| + \|(x_2,a_2)\|, \tag 1 $$ which we will first verify in the special case that $a_1$ and $a_2$ have the same sign. If both $a_1$ and $a_2$ are zero, the claim follows trivially, so we may assume that $a_1+a_2\neq 0$.
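For completeness, here is the routine computation behind the homogeneity claim left to the reader: for $\lambda \neq 0$ one has $$ \dist(\lambda x, \lambda a C) = \inf_{y\in C}\|\lambda x-\lambda a y\| = |\lambda |\inf_{y\in C}\|x-ay\| = |\lambda |\dist(x,aC), $$ whence $\|\lambda (x,a)\| = |\lambda a| + \dist(\lambda x, \lambda aC) = |\lambda |\|(x,a)\|$, the case $\lambda =0$ being immediate. Definiteness is similar: $\|(x,a)\|=0$ forces $a=0$, and then $0 = \dist(x, 0\,C) = \dist(x,\{0\}) = \|x\|$.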

Fixing any $\varepsilon >0$, choose $y_i$ in $C$, for each $i=1,2$, such that $$ \|x_i-a_iy_i\|<\dist(x_i,a_iC)+\varepsilon . $$ Since $a_1$ and $a_2$ have the same sign and $a_1+a_2\neq 0$, the coefficients below are nonnegative and sum to $1$, so $$ y:= \frac{a_1}{a_1+a_2}y_1 + \frac{a_2}{a_1+a_2}y_2 \tag 2 $$ is a convex combination of $y_1$ and $y_2$ and hence lies in $C$. Therefore $$ \begin{aligned} \dist(x_1+x_2,(a_1+a_2)C) &\leq \|x_1+x_2-(a_1+a_2)y\| \\ &= \|x_1+x_2-(a_1y_1+a_2y_2)\| \\ &\leq \|x_1-a_1y_1\| + \|x_2-a_2y_2\| \\ &< \dist(x_1,a_1C) + \dist(x_2,a_2C) + 2\varepsilon . \end{aligned} $$ Since $\varepsilon $ is arbitrary, combining this with the fact that $|a_1+a_2| \leq |a_1| +|a_2|$ proves (1) in this case.

Let us now prove (1) in the remaining case that $a_1$ and $a_2$ have opposite signs. Unlike in the previous case, the argument employing (2) does not work here because $y$ need not lie in $C$. It is at this point that the bound on the diameter of $C$ becomes relevant.

Interchanging $(x_1,a_1)$ and $(x_2,a_2)$ if necessary, we may assume that $|a_1|\geq |a_2|$. Furthermore, since we already know that $\|\cdot\|$ is homogeneous, we may multiply both $(x_1,a_1)$ and $(x_2,a_2)$ by $-1/a_2$ and hence assume that $a_2=-1$. Finally, given that $a_1$ is now positive with $a_1\geq 1$, we may write $a_1=1+b$, with $b\geq 0$. The two elements we are working with may therefore be represented by $$ (x_1,1+b)\and (-z_2,-1), $$ where we made a last-minute change of variable, namely $x_2=-z_2$. Unraveling (1) under these assumptions, we see that we need to prove that $$ \dist(x_1-z_2,bC)\leq 2 + \dist(x_1,(1+b)C)+ \dist(z_2,C). \tag 3 $$
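Spelling out the unraveling (a step left implicit above): with these normalizations, (1) reads $$ \|(x_1-z_2,\,b)\| \leq \|(x_1,\,1+b)\| + \|(-z_2,\,-1)\|, $$ that is, $$ b + \dist(x_1-z_2,bC) \leq (1+b) + \dist(x_1,(1+b)C) + 1 + \dist(-z_2,-C). $$ Since $\dist(-z_2,-C) = \dist(z_2,C)$, cancelling $b$ from both sides yields precisely (3).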

Fixing any $\varepsilon >0$, choose $y_1$ and $y_2$ in $C$ such that $$ \|x_1-(1+b)y_1\|< \dist(x_1,(1+b)C) +\varepsilon , $$ and $$ \|z_2-y_2\|< \dist(z_2,C) +\varepsilon . $$ Then $$ \begin{aligned} \dist(x_1-z_2,bC) &\leq \|x_1-z_2 - by_1\| \\ &= \|x_1-(1+b)y_1- (z_2 -y_2) + y_1 - y_2\| \\ &\leq \|x_1-(1+b)y_1 \|+\|z_2 - y_2\| + \|y_1 - y_2\| \\ &\leq \dist(x_1,(1+b)C) + \dist(z_2,C) +2\varepsilon + \diam(C). \end{aligned} $$ Since $\varepsilon $ is arbitrary and $\diam(C)\leq 2$, we see that (3) is satisfied. QED


This said, let us be given a non-reflexive space $X$. By Smulian's Theorem (see below) choose a nested sequence $C_1 \supseteq C_2 \supseteq C_3 \supseteq \cdots$ of nonempty closed bounded convex subsets with empty intersection. By scaling everything by the same fixed factor we may assume that each $C_n$ is contained in the unit ball of $X$, and hence, in particular, that $\diam(C_n)\leq 2$.

For each $n$ let $\|\cdot\|_n$ be the norm on $X\oplus \R$ constructed in the Lemma relative to $C_n$, and define a new norm on $X\oplus \R$ by $$ \|(x, a)\| = \sum_{n=1}^\infty \frac{\|(x, a)\|_n}{2^n}. $$
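A quick check that this series converges (implicit in the original construction): since each $C_n$ is contained in the unit ball, for every $n$ we have $$ \|(x,a)\|_n = |a| + \dist(x,aC_n) \leq |a| + \|x\| + |a| = \|x\| + 2|a|, $$ so the sum is dominated by $(\|x\|+2|a|)\sum_{n=1}^\infty 2^{-n} = \|x\|+2|a|$, and $\|\cdot\|$, being a convergent sum of (weighted) norms, is itself a norm.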

Since $\|(x, 0)\|_n = \|x\|$, for every $n$ and every $x\in X$, and since $\sum_{n=1}^\infty 1/2^n=1$, we see that $\|(x, 0)\| = \|x\|$, so $X$ also embeds into $X\oplus \R$ relative to the above norm.

We now claim that there is no point in $X$ (or rather in its canonical image in $X\oplus \R$) minimizing the distance to the vector $u:= (0,-1)$. To see this, notice that for every $x$ in $X$ one has $$ \|x-u\|_n = \|(x, 0)-(0, -1)\|_n = \|(x, 1)\|_n = 1 + \dist(x,C_n) \geq 1, $$ with equality iff $x\in C_n$. Therefore $$ \|x-u\| \geq \sum_{n=1}^\infty \frac 1{2^n} = 1, $$ so we see that $\dist(u,X)\geq 1$. On the other hand, if $x\in C_k$, then $x$ lies in $C_n$ for every $n\leq k$, whence $$ \|x-u\| = 1 + \sum_{n=k+1}^\infty \frac{\dist(x,C_n) }{2^n} \leq 1 + \frac 2{2^k}, $$ because both $x$ and each $C_n$ lie in the unit ball, so $\dist(x,C_n)\leq 2$. This can be made as close as desired to $1$ by taking $k$ large, meaning that in fact $\dist(u,X)=1$.

The punch line is then that there is no $x$ in $X$ such that $$ \|x-u\| = 1 + \sum_{n=1}^\infty \frac{\dist(x,C_n) }{2^n} = 1, $$ since such an $x$ would have to satisfy $\dist(x,C_n)=0$ for every $n$, hence lie in every (closed) $C_n$, while we know that $\bigcap_nC_n$ is empty.


Not many people read Russian, so here is the statement:

Theorem (Smulian [1]). A normed space $X$ is reflexive iff every nested sequence $C_1 \supseteq C_2 \supseteq C_3 \supseteq \cdots$ of nonempty closed bounded convex subsets of $X$ has nonempty intersection.
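To illustrate the non-reflexive direction with a standard example (not taken from Smulian's paper): in $X = c_0$ the sets $$ C_n = \{x\in c_0 : \|x\|_\infty \leq 1,\ x_1=\cdots =x_n=1\} $$ are nonempty, closed, bounded, convex and nested, yet their intersection is empty, since any point of the intersection would have all coordinates equal to $1$ and hence could not converge to $0$.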

[1] V.L. Smulian, On the principle of inclusion in the space of the type (B), Mat. Sb. (N.S.) 5 (1939), 317–328.
