On compatible and natural norms (Exercise 2.7.12b in Kreyszig’s Functional Analysis text)

functional-analysis, linear-algebra, normed-spaces, supremum-and-infimum

$
\newcommand{\nc}{\newcommand}
\nc{\C}{\mathbb{C}}
\nc{\F}{\mathbb{F}}
\nc{\R}{\mathbb{R}}
\nc{\a}{\alpha}
\nc{\n}[1]{\left \Vert #1 \right \Vert}
\nc{\abs}[1]{\left \vert #1 \right \vert}
\nc{\set}[1]{\left \{ #1 \right \}}
\nc{\0}{\vec 0}
\nc{\ds}{\displaystyle}
$
The Problem: This is Exercise $12$ of Section $2.7$ in Kreyszig's textbook Introductory Functional Analysis with Applications, which I'm self-studying and working through over the summer. In summary, the problem is this:

Let $\F \in \left\{\R,\C\right\}$. We know a matrix $A \in \F^{r \times n}$ with $A := (\a_{i,j})_{1 \le i \le r, 1\le j \le n}$ defines a linear operator $\F^n \to \F^r$.

Suppose any norm $\n\cdot_1$ is given on $\F^n$, and any norm $\n\cdot_2$ is given on $\F^r$. We know there are many norms on the space $\F^{r \times n}$.

A norm $\n \cdot$ on $\F^{r \times n}$ is said to be compatible with $\n \cdot_1$ and $\n \cdot_2$ if
$$
\n{Ax}_2 \le \n{A} \cdot \n{x}_1 \; \forall x \in \F^n
$$

We may define an operator norm by
$$
\n A := \sup_{0 \ne x \in \F^n} \frac{\n{Ax}_2}{\n{x}_1}
$$

Showing this is a norm is straightforward, as is showing that it is always compatible with $\n \cdot_1$ and $\n \cdot_2$. We call this the "natural norm" defined by the pair $\n \cdot_1$ and $\n \cdot_2$.
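(For completeness, the compatibility check really is a one-liner: for $x \ne \0$,

```latex
\n{Ax}_2
= \frac{\n{Ax}_2}{\n{x}_1}\,\n{x}_1
\le \left( \sup_{0 \ne y \in \F^n} \frac{\n{Ay}_2}{\n{y}_1} \right) \n{x}_1
= \n{A} \cdot \n{x}_1,
```

and the case $x = \0$ holds trivially since both sides vanish.)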

The Goal: Given the norms
$$
\n{x}_1 := \max_{1 \le j \le n} \abs{ \xi_j } \qquad
\n{y}_2 := \max_{1 \le i \le r} \abs{ \eta_i }
$$

(where $x := (\xi_j)_{j=1}^n \in \F^n$ and $y := (\eta_i)_{i=1}^r \in \F^r$), show that the natural norm induced by the pair is
$$
\n{A} = \max_{1 \le i \le r} \sum_{j=1}^n \abs{ \a_{i,j} }
$$
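As a quick numerical sanity check (not a proof), here is a minimal Python/NumPy sketch. Since $\n\cdot_1$ and $\n\cdot_2$ are both max norms, they correspond to NumPy's $\infty$-norm; the matrix size and sample count below are arbitrary choices. Random sampling can only witness the "$\le$" half of the claim, which matches the half proved below:

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 3, 5
A = rng.standard_normal((r, n))   # an arbitrary real r x n example

# Claimed natural norm: the maximum absolute row sum of A.
row_sum_norm = np.abs(A).sum(axis=1).max()

# Monte Carlo estimate of  sup_{x != 0} ||Ax||_inf / ||x||_inf.
best = 0.0
for _ in range(20_000):
    x = rng.standard_normal(n)
    ratio = np.linalg.norm(A @ x, np.inf) / np.linalg.norm(x, np.inf)
    best = max(best, ratio)

# Every sampled ratio stays below the row-sum value.
print(best, "<=", row_sum_norm)
```

Random $x$ typically get close to, but do not exceed, the row-sum value; making the estimate actually attain it is exactly the hard half of the exercise.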


My Work So Far:

So far, I've managed to get what I feel is halfway there:

Note that for any $x := (\xi_j)_{j=1}^n \in \F^n \setminus \set \0$,
$$
\n{Ax}_2
= \max_{1 \le i \le r} \abs{ \sum_{j=1}^n \a_{i,j} \xi_j }
\le \max_{1 \le i \le r} \sum_{j=1}^n \abs{\a_{i,j} } \abs{\xi_j }
\le \left( \max_{1 \le k \le n} \abs{\xi_k} \right) \max_{1 \le i \le r} \sum_{j=1}^n \abs{\a_{i,j} }
= \n{x}_1 \cdot \max_{1 \le i \le r} \sum_{j=1}^n \abs{\a_{i,j} }
$$

which gives us
$$
\frac{\n{Ax}_2}{\n{x}_1} \le \max_{1 \le i \le r} \sum_{j=1}^n \abs{\a_{i,j} }
$$

Since only the left-hand side depends on $x$, taking the supremum over the desired $x$ maintains the inequality:
$$
\n{A} := \sup_{\substack{x \in \F^n \\ x \ne \0}} \frac{\n{Ax}_2}{\n{x}_1} \le \max_{1 \le i \le r} \sum_{j=1}^n \abs{\a_{i,j} }
$$

This gives us half of what we need, and I want to show the reverse half of the inequality. However, I'm struggling to do so.

Applying some basic definitions gets me to (for arbitrary nonzero $x_0 := (\xi_j)_{j=1}^n \in \F^n$)
$$
\sup_{\substack{x \in \F^n \\ x \ne \0}} \frac{\n{Ax}_2}{\n{x}_1}
\ge \frac{\n{Ax_0}_2}{\n{x_0}_1}
= \frac{\ds \max_{1 \le i \le r} \abs{ \sum_{j=1}^n \a_{i,j} \xi_j }}{\ds \max_{1 \le k \le n} \abs{\xi_k} }
$$

I can't see an easy or meaningful way to increase the denominator, or to bring it inside the sum. (Bringing it inside the sum gives $\abs{\xi_j}/\max_k \abs{\xi_k} \le 1$, which bounds things in the wrong direction for the lower bound we need.) I'm also not sure how to handle the numerator; I could make it smaller, but then I lose the max function that the final expression needs, so I don't think that's the right idea either…


In Short: Does anyone have any ideas as to how I might prove
$$
\n{A} := \sup_{\substack{x \in \F^n \\ x \ne \0}} \frac{\n{Ax}_2}{\n{x}_1} \ge \max_{1 \le i \le r} \sum_{j=1}^n \abs{\a_{i,j} }?
$$

Thanks in advance for any hints and suggestions!

Best Answer

Using "some basic definitions" will give you a result that holds for generic choices of $x$. But the supremum doesn't arise from the usual sort of $x$; it arises from very special ones (namely, those that maximize the term in the supremum).

So you'll need to think about which $x$ make the expression $\frac{\|Ax\|_2}{\|x\|_1}$ as large as possible. Since you're having trouble working with the denominator, looking for $x$ with an unusually simple $\|x\|_1$ might be a good place to start…
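If it helps to experiment before writing the proof: one way to probe this hint numerically (for real $A$; in the complex case you'd use unimodular entries instead of $\pm 1$) is to restrict attention to vectors all of whose entries have absolute value $1$, so that $\n{x}_1 = 1$ and the ratio reduces to $\n{Ax}_2$. A small brute-force sketch with an arbitrary test matrix, deliberately stopping short of the argument itself:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
r, n = 3, 4
A = rng.standard_normal((r, n))   # a small real test matrix

# Conjectured natural norm: maximum absolute row sum.
row_sum_norm = np.abs(A).sum(axis=1).max()

# Sign vectors: every entry is +1 or -1, so ||x||_1 = max_j |xi_j| = 1
# and the ratio ||Ax||_2 / ||x||_1 is just ||Ax||_inf.
best = max(np.linalg.norm(A @ np.array(s), np.inf)
           for s in itertools.product([-1.0, 1.0], repeat=n))

print(best, row_sum_norm)   # the two values agree up to rounding
```

Observing which sign vector attains the maximum, and how it relates to the entries of a particular row of $A$, should suggest the choice of $x_0$ for the lower bound.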
