“Normalizing” a sum in the Cauchy–Schwarz Master Class

inequality, summation

I've been going through The Cauchy–Schwarz Master Class by J. Michael Steele, and early on he makes the following manipulation:

$$\sum_{k=1}^\infty |a_k b_k| \leq \frac{1}{2} \sum_{k=1}^\infty a_k^2 + \frac{1}{2} \sum_{k=1}^\infty b_k^2. \quad \quad (1.6)$$

A Test of Strength:

Any time one meets a new inequality, one is almost duty bound to test the strength of that inequality. Here that obligation boils down to asking how close the new additive inequality comes to matching the quantitative estimates that one finds from Cauchy's inequality.

The additive bound $(1.6)$ has two terms on the right-hand side, and Cauchy's inequality has just one. Thus, as a first step, we might look for a way to combine the two terms of the additive bound $(1.6)$, and a natural way to implement this idea is to normalize the sequences $\{a_k\}$ and $\{b_k\}$ so that each of the right-hand sums is equal to one.

Thus if neither of the sequences is made up of all zeros, we can introduce new variables

$$
\hat{a_k} = a_k \big / \left ( \sum_j a_j^2 \right )^{\frac{1}{2}}
\quad \text{ and } \quad
\hat{b_k} = b_k \big / \left ( \sum_j b_j^2 \right )^{\frac{1}{2}},
$$

which are normalized in the sense that

$$
\sum_{k=1}^\infty \hat{a_k}^2 =
\sum_{k=1}^\infty \left \{ a_k^2 \big / \left ( \sum_j a_j^2 \right ) \right \} = 1
$$

and

$$
\sum_{k=1}^\infty \hat{b_k}^2 =
\sum_{k=1}^\infty \left \{ b_k^2 \big / \left ( \sum_j b_j^2 \right ) \right \} = 1.
$$

I don't really understand what's going on here. Google was unhelpful in explaining what the $\sum_j$ notation means, and I'm not sure in what sense this formula is normalized. If anyone can explain what is happening here, that would be super helpful.

Best Answer

"Normalized" in this context means "length $1$". So if you think of our sequences as infinite length vectors:

$$(a_1, a_2, \ldots ) \text{ and } (b_1, b_2, \ldots)$$

we can normalize them by looking at the new vectors

$$ \frac{(a_1, a_2, \ldots)}{\lVert (a_1, a_2, \ldots) \rVert} \text{ and } \frac{(b_1, b_2, \ldots)}{\lVert (b_1, b_2, \ldots) \rVert}. $$

Here, intuitively, we divide by the length of our vector in order to make it have length $1$.

Now recall that $\lVert(a_1, a_2, \ldots)\rVert = \sqrt{\sum_{j=1}^\infty a_j^2}$, and we see that the new $\hat{a_k}$s are exactly the entries of this new (normalized) vector. That is, $\hat{a_k} = \frac{a_k}{\lVert (a_1, a_2, \ldots) \rVert}$.
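To make this concrete, here is a toy example with a finite vector and made-up numbers (not from the book), just to show the same normalization in action:

$$ \lVert (3, 4) \rVert = \sqrt{3^2 + 4^2} = 5, \qquad \frac{(3, 4)}{\lVert (3, 4) \rVert} = \left( \tfrac{3}{5}, \tfrac{4}{5} \right), \qquad \left( \tfrac{3}{5} \right)^2 + \left( \tfrac{4}{5} \right)^2 = \tfrac{9}{25} + \tfrac{16}{25} = 1. $$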

Notice this really does normalize our vectors in the sense that they end up having length $1$. This is what the author shows at the bottom of your screenshot:

$$ \lVert (\hat{a_1}, \hat{a_2}, \ldots) \rVert = \left \lVert \frac{(a_1, a_2, \ldots)}{\lVert (a_1, a_2, \ldots) \rVert} \right \rVert = \frac{\lVert (a_1, a_2, \ldots) \rVert}{\lVert (a_1, a_2, \ldots) \rVert} = 1. $$

The author just writes this out explicitly (and uses the fact that $\sqrt{1} = 1$).

As for $\displaystyle \sum_j a_j^2$, that just means "sum over all values of $j$". Life is too short to write $\displaystyle \sum_{j=1}^\infty$ all the time, so a lot of authors will just write $\displaystyle \sum_j$ instead, and trust the reader to work out what the endpoints of the sum should be. For this particular sum, it's $1$ to $\infty$, since we're summing over every entry of our infinite vector.
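In case it helps to see where the normalization is headed (this is a sketch of the step Steele takes next, not something in your screenshot): apply the additive bound $(1.6)$ to the normalized sequences. Since both right-hand sums equal $1$,

$$ \sum_{k=1}^\infty |\hat{a_k} \hat{b_k}| \leq \frac{1}{2} \sum_{k=1}^\infty \hat{a_k}^2 + \frac{1}{2} \sum_{k=1}^\infty \hat{b_k}^2 = \frac{1}{2} + \frac{1}{2} = 1. $$

Multiplying both sides by $\left( \sum_j a_j^2 \right)^{1/2} \left( \sum_j b_j^2 \right)^{1/2}$ (assumed finite and nonzero) turns the left-hand side back into $\sum_k |a_k b_k|$ and gives Cauchy's inequality, $\sum_k |a_k b_k| \leq \left( \sum_k a_k^2 \right)^{1/2} \left( \sum_k b_k^2 \right)^{1/2}$. So normalizing is exactly what lets the two terms of $(1.6)$ collapse into one.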


I hope this helps ^_^
