Metrics vs. Norms (Fréchet spaces, Banach spaces, etc.)

topological-vector-spaces

While I understand on paper the fundamental differences between Fréchet spaces and Banach spaces, I'm struggling to truly internalize/digest what their respective differences mean in layman's terms.


  1. Fréchet Spaces:

These are locally convex spaces that are complete with respect to some translation-invariant metric; let's call it $\text{dist}(u,v)$.

The obvious interpretation of completeness is that whenever the points of a sequence start bunching together according to the metric (a Cauchy sequence), they must actually be converging to a "point" already in the space. In this way we have some sense of "distance" in our topological space. Translation-invariance just means that "shifting the two points around" by the same displacement won't change how "far apart" they are.


  2. Banach Spaces:

These are also locally convex spaces, complete with respect to a translation-invariant metric. However, these have the added condition that the metric is induced by a norm; let's call it $\parallel u \parallel$.

Intuitively, a norm to me means we have some notion for the "length" of a vector in our space. Moreover, any normed space leads to a metric simply by taking the "length" of the difference between two "points":
$$d(u,v) = \parallel\!u-v\!\parallel.$$

This means we interpret two "points" as getting "close" if the "length" of their difference is getting small. This makes a lot of sense, and meshes well with our previous ideas of "closeness" from Fréchet spaces.
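The norm-to-metric construction above is easy to check concretely. Here is a small sketch in Python using the Euclidean norm on $\mathbb{R}^3$ (my choice of example space), showing both the induced metric and its translation invariance:

```python
import math

# A norm on R^3 (the Euclidean norm), used only to *define* a metric:
def norm(u):
    return math.sqrt(sum(c * c for c in u))

def dist(u, v):
    # the metric induced by the norm: the "length" of the difference
    return norm([a - b for a, b in zip(u, v)])

u, v = [1.0, 2.0, 3.0], [1.0, 2.0, 5.0]
print(dist(u, v))                       # 2.0
# translation invariance: shifting both points by the same w changes nothing
w = [10.0, -4.0, 0.5]
shifted = dist([a + c for a, c in zip(u, w)], [b + c for b, c in zip(v, w)])
print(shifted)                          # 2.0 again
```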


So now we come to the part that I'm struggling with. Obviously we might be tempted to force a norm upon a Fréchet space by trying to "reverse" our notions of "distance" and "length":
$$\text{norm}(u) := \text{dist}(u,0).$$
At first glance this doesn't seem so bad since we do have two of three norm properties:

  1. Subadditivity:

$$\begin{align} \text{norm}(u+v) &:= d(u+v,0) \\ &\leq d(u,0) + d(u+v,u), \quad \text{(triangle inequality)} \\ &= d(u,0) + d(v,0), \quad \text{(translation invariance of metric)} \\ &=: \text{norm}(u) + \text{norm}(v) \end{align}$$

  3. Point-separating:

$$\text{norm}(u) := \text{dist}(u,0) = 0 \iff u = 0, \quad \text{(axiom of metrics)}$$

However we run into trouble when we consider the second property:

  2. Absolutely homogeneous:

$$\text{norm}(\alpha u) := \text{dist}(\alpha u,0) \neq |\alpha|\, \text{dist}(u,0) =: |\alpha|\,\text{norm}(u) \quad \text{(in general)}$$
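One concrete way to see the failure of homogeneity: even on $\mathbb{R}$ itself, the translation-invariant metric $d(u,v)=\min(1,|u-v|)$ induces the usual topology, yet its "faux norm" $\min(1,|u|)$ is not homogeneous. A minimal sketch (this particular bounded metric is my toy choice, not from the question):

```python
# A translation-invariant metric on R that is NOT induced by a norm:
# d(u, v) = min(1, |u - v|), a toy stand-in for Fréchet-type metrics.
def dist(u, v):
    return min(1.0, abs(u - v))

def faux_norm(u):
    return dist(u, 0.0)

u, alpha = 0.6, 2.0
scaled = faux_norm(alpha * u)            # min(1, 1.2) = 1.0
expected = abs(alpha) * faux_norm(u)     # 2 * 0.6     = 1.2
print(scaled, expected)                  # homogeneity fails: 1.0 != 1.2
```

The clipping at 1 is exactly what kills homogeneity: once a vector's "distance from the origin" saturates, scaling it further doesn't scale the faux norm.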


So my question is how should I make sense of this on a gut level?

Translation invariance means that we always have $d(u,v) = d(u-v,0)$. So how can it be that $u$ and $v$ are getting "close" (i.e. $d(u,v)<\epsilon$) in a Fréchet space, but their difference vector isn't "small" in any norm-like sense? What is going wrong here that keeps us from interpreting "distance from the origin" and "length" as the same thing?

And the fact that a locally convex space's topology is generated by a convex, balanced, absorbent basis seems like a perfect setting to talk about how scaling vectors affects their "length"; yet this scaling idea apparently does not play nicely with metrics in general. What is the disconnect between scaling vectors and the "distance" between them?

Any intuition on these ideas would be greatly appreciated. I've almost got the intuition for these kinds of abstract spaces that I want, but I feel like their cumbersome definitions are obscuring a bit of the "bigger" picture for me.

Edit: Thanks, Henno Brandsma, for pointing out that we have the zero equality as well.

Best Answer

In a metric space, $d(u,0)=0$ implies $u=0$ (an axiom of metrics), so property 3 is satisfied for this "faux norm" coming from a translation-invariant metric on a linear space.

So defining $\|x\|=d(x,0)$ fails to be a norm only because of the essential homogeneity property $\|t\cdot x\|=|t|\,\|x\|$ of norms.

An example of a Fréchet space that is not normable is $\mathbb{R}^{\mathbb{N}}$ in its standard complete metric (inducing the product topology): $$d((x_n), (y_n))=\sum_{n=0}^\infty \frac{1}{2^n} \min(1,|x_n - y_n|)$$

It's easy to see that $d$ is translation-invariant, and its induced "faux-norm" $d(x,0)$ does not obey property 2, but of course does obey 1 and 3.
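The failure of homogeneity for this metric can be checked numerically. A sketch (sequences modelled as functions $n \mapsto x_n$, and the infinite sum truncated, both my modelling choices; the tail of the series beyond the truncation is at most $2^{1-\text{terms}}$, so the error is negligible):

```python
def dist(x, y, terms=60):
    # the standard product metric on R^N, truncated to `terms` coordinates
    return sum(min(1.0, abs(x(n) - y(n))) / 2.0**n for n in range(terms))

zero = lambda n: 0.0
u = lambda n: 0.5 if n == 0 else 0.0       # the sequence (1/2, 0, 0, ...)

fn_u = dist(u, zero)                       # faux norm of u
fn_3u = dist(lambda n: 3.0 * u(n), zero)   # faux norm of 3u
print(fn_u, fn_3u)   # 0.5 and 1.0 (the min(1, .) clips 1.5), not 3 * 0.5 = 1.5
```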

In general, when looking at these different types of linear spaces, it helps to compare concrete examples. For these product topologies, note that not all coordinates behave the same: in the metric, displacements in high coordinates contribute less to the metric distance. Because basic open sets essentially depend on only finitely many coordinates, this is unavoidable, and any metric inducing the product topology will have this effect. (Here I chose the standard metric induced by the seminorms $\|(x_n)\|_m=|x_m|$, $m \in \Bbb N$, in fact.)
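The uneven weighting of coordinates is easy to see numerically: the same "unit bump" placed in a higher coordinate sits ever closer to the origin. A sketch under the same modelling assumptions as before (sequences as functions, truncated sum):

```python
def dist(x, y, terms=60):
    # standard product metric on R^N, truncated; tail error is at most 2**(1 - terms)
    return sum(min(1.0, abs(x(n) - y(n))) / 2.0**n for n in range(terms))

zero = lambda n: 0.0

def unit(m):
    # e_m: the sequence with 1 in coordinate m and 0 elsewhere
    return lambda n: 1.0 if n == m else 0.0

distances = {m: dist(unit(m), zero) for m in (0, 5, 20)}
print(distances)   # 1.0, 2**-5, 2**-20: identical bumps, shrinking distances
```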

There is no norm that can induce the same topology on $\mathbb{R}^{\mathbb{N}}$, because all open neighbourhoods of $0$ are unbounded (in the topological-vector-space sense), whereas in a normed space, (absolute) homogeneity is used in a strong way to show that all open balls are bounded in that sense.
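This unboundedness is also easy to witness concretely: a point with an enormous entry in a high coordinate still lies in a tiny metric ball around the origin, so no ball around $0$ can be bounded. A sketch with the same truncated metric as above:

```python
def dist(x, y, terms=60):
    # standard product metric on R^N, truncated; tail error is at most 2**(1 - terms)
    return sum(min(1.0, abs(x(n) - y(n))) / 2.0**n for n in range(terms))

zero = lambda n: 0.0
# A point with a billion in coordinate 10 is still within 2**-10 of the origin,
# so the metric ball of radius 0.001 around 0 contains arbitrarily "large" vectors:
spike = lambda n: 1e9 if n == 10 else 0.0
d = dist(spike, zero)
print(d)   # min(1, 1e9) / 2**10 = 2**-10, about 0.00098
```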
