Proof of Young’s Inequality – Calculus and Real Analysis

calculus, real-analysis

The following problem is from Spivak's Calculus.

Suppose that $f$ is a continuous increasing function with $f(0)=0$. Prove that for $a,b \gt 0$ we have Young's inequality

$$ ab \le \int_0^a f(x)\,dx + \int_0^b f^{-1}(x)\,dx, $$

and that equality holds if and only if $b=f(a)$.
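(For concreteness, here is a quick numerical check of the inequality and of the equality case $b=f(a)$; the choice $f(x)=x^3$ with $a=2$, $b=3$ is my own illustration, not part of the problem.)

```python
# Numerical sanity check of Young's inequality for an assumed example:
# f(x) = x**3, so f^{-1}(y) = y**(1/3); a = 2, b = 3 are illustrative values.
from scipy.integrate import quad

f = lambda x: x**3
finv = lambda y: y ** (1.0 / 3.0)
a, b = 2.0, 3.0

rhs = quad(f, 0, a)[0] + quad(finv, 0, b)[0]
print(a * b, rhs)        # 6.0 <= 7.245..., so the inequality holds

# Equality case: take b = f(a)
b_eq = f(a)
rhs_eq = quad(f, 0, a)[0] + quad(finv, 0, b_eq)[0]
print(a * b_eq, rhs_eq)  # 16.0 and 16.0 (up to quadrature error)
```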

It is enough to consider the case $f(a) \gt b$ and to show that strict inequality holds in this case.

I've tried proving this using the theorem
$$ \int_a^b f^{-1} = b\,f^{-1}(b) - a\,f^{-1}(a) - \int_{f^{-1}(a)}^{f^{-1}(b)} f, $$

but I got stuck along the way.
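(The quoted formula itself can be checked numerically; the choice $f(x)=x^3$ on $[a,b]=[1,8]$ below is my own illustration, not from Spivak.)

```python
# Numerical check of the inverse-function integration formula quoted above,
# using the assumed example f(x) = x**3 on [a, b] = [1, 8].
from scipy.integrate import quad

f = lambda x: x**3
finv = lambda y: y ** (1.0 / 3.0)
a, b = 1.0, 8.0

lhs = quad(finv, a, b)[0]
rhs = b * finv(b) - a * finv(a) - quad(f, finv(a), finv(b))[0]
print(lhs, rhs)  # both equal 11.25 up to quadrature error
```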

How can I show this rigorously using the definition or properties of integrals? Any hints, suggestions, or solutions would be appreciated.

Best Answer

Assuming $f(a)>b$, we have
$$ \color{red}{\int_{0}^{a} f(x)\,dx} = \mu\left(\{(x,y)\in[0,a]\times[0,f(a)]: 0\leq y\leq f(x)\}\right) $$
and
$$ \color{blue}{\int_{0}^{b} f^{-1}(y)\,dy} = \mu\left(\{(x,y)\in[0,a]\times[0,b]: 0\leq x\leq f^{-1}(y)\}\right). $$
Every point $(x,y)$ of $[0,a]\times[0,b]$ satisfies $y\leq f(x)$ or $x\leq f^{-1}(y)$ (if $y>f(x)$, then $f^{-1}(y)>x$), so the two regions together cover the rectangle, and the sum of the two integrals is at least $\mu\left([0,a]\times[0,b]\right)=ab$. Moreover, since $f$ is continuous and $f(a)>b$, the red region contains a set of positive measure above the line $y=b$, so the sum strictly exceeds $ab$.
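As a small sanity check of the covering step (my own illustration, not part of the original answer), the grid test below uses the assumed example $f(x)=x^3$ with $a=2$, $b=3$, so that $f(a)>b$:

```python
# Illustration of the covering argument: every point of the rectangle
# [0, a] x [0, b] lies in the red region (under the graph of f) or in the
# blue region (left of the graph of f^{-1}).
# The choice f(x) = x**3, a = 2, b = 3 is an assumed example with f(a) > b.
import numpy as np

f = lambda x: x**3
finv = lambda y: y ** (1.0 / 3.0)
a, b = 2.0, 3.0

xs = np.linspace(0.0, a, 201)
ys = np.linspace(0.0, b, 201)
X, Y = np.meshgrid(xs, ys)

covered = (Y <= f(X)) | (X <= finv(Y))  # membership in the red or blue region
print(covered.all())                     # True: the union covers the rectangle
```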

By the way, it is a lot easier just to draw a picture:

[Figure: graph of an increasing $f$, with the area under the curve ($\int_0^a f$) and the area to its left ($\int_0^b f^{-1}$) together covering the rectangle $[0,a]\times[0,b]$.]
