Functional Analysis – Natural Proof of the Cauchy-Schwarz Inequality

functional-analysis, inequality

Most proofs of the Cauchy-Schwarz inequality on a pre-Hilbert space use the fact that if a quadratic polynomial with real coefficients takes positive values everywhere on the real line, then its discriminant is negative (e.g., Conway, A Course in Functional Analysis).
I think this is somewhat tricky.
Moreover, I often forget how the proof goes when the pre-Hilbert space is defined over the field of complex numbers.
Is there a more natural proof (and hence one that is easier to remember) based on a completely different idea?

Best Answer

There is also an approach by "amplification" which is really cool. The exact same trick works to prove Hölder's inequality, and it is, more generally, a very important principle for improving inequalities.

It goes like this: We start out with $$\langle a-b,a-b\rangle\ge 0$$ for $a,b$ in your inner product space, with $a\not=0$ and $b\not=0$. Expanding, this implies $$2\langle a,b\rangle\le \langle a, a\rangle + \langle b, b\rangle$$ Now notice that the left-hand side is invariant under the scaling $a\mapsto \lambda a$, $b\mapsto \lambda^{-1}b$ for $\lambda>0$, so applying the same inequality to $\lambda a$ and $\lambda^{-1}b$ gives $$2\langle a,b\rangle \le \lambda^2 \langle a,a\rangle + \lambda^{-2}\langle b, b\rangle$$ Now look at the right-hand side as a function of the real variable $\lambda$ and find the optimal value of $\lambda$ using calculus (set the derivative to $0$):

$$\lambda^2=\sqrt{\frac{\langle b,b\rangle}{\langle a,a\rangle}}$$
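
As a quick sanity check (my own addition, not part of the original argument), a computer algebra system can redo this calculus step; the sketch below uses sympy, writing $A=\langle a,a\rangle$ and $B=\langle b,b\rangle$ as placeholder symbols:

```python
# Sketch of the calculus step with sympy (A = <a,a>, B = <b,b> are placeholder symbols):
# minimize lam**2 * A + lam**(-2) * B over lam > 0.
import sympy as sp

lam, A, B = sp.symbols('lam A B', positive=True)
rhs = lam**2 * A + lam**(-2) * B          # right-hand side as a function of lam

crit = sp.solve(sp.diff(rhs, lam), lam)   # set the derivative to 0
lam_opt = crit[0]                         # the unique positive critical point

print(sp.simplify(lam_opt**2))              # sqrt(B)/sqrt(A), i.e. sqrt(B/A)
print(sp.simplify(rhs.subs(lam, lam_opt)))  # 2*sqrt(A)*sqrt(B)
```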

Plugging this value of $\lambda^2$ in, we obtain

$$2\langle a,b\rangle\le \sqrt{\langle a,a\rangle}\sqrt{\langle b,b\rangle}+\sqrt{\langle a,a\rangle}\sqrt{\langle b,b\rangle}$$

i.e.

$$\langle a,b\rangle\le\sqrt{\langle a,a\rangle}\sqrt{\langle b,b\rangle}$$

(Here the inner product is taken to be real; over $\mathbb{C}$ the same computation bounds $2\operatorname{Re}\langle a,b\rangle$, and applying the result with $b$ replaced by $e^{i\theta}b$, where $\theta$ is chosen so that $\langle a,e^{i\theta}b\rangle=|\langle a,b\rangle|$, gives $|\langle a,b\rangle|\le\sqrt{\langle a,a\rangle}\sqrt{\langle b,b\rangle}$.)

Notice how we took a trivial observation and "optimized" the expression by exploiting scaling invariance.
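
For completeness, here is a small numerical illustration (again my own addition, assuming the standard dot product on $\mathbb{R}^n$ with numpy): at the optimal $\lambda$, the amplified right-hand side collapses to $2\sqrt{\langle a,a\rangle}\sqrt{\langle b,b\rangle}$, and the scaled inequality holds for random vectors.

```python
# Numerical sanity check (a sketch, assuming the usual dot product on R^n).
import numpy as np

rng = np.random.default_rng(0)
a, b = rng.standard_normal(5), rng.standard_normal(5)

A, B = a @ a, b @ b                  # <a,a> and <b,b>
lam2 = np.sqrt(B / A)                # optimal lambda^2 from the calculus step

lhs = 2 * (a @ b)                    # 2<a,b>
rhs = lam2 * A + B / lam2            # amplified right-hand side at the optimum

print(lhs <= rhs)                                     # the scaled inequality holds
print(np.isclose(rhs, 2 * np.sqrt(A) * np.sqrt(B)))   # and equals 2*sqrt(<a,a>)*sqrt(<b,b>)
```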
