[Math] Derivative inequality for a twice continuously differentiable function.

calculus, continuity, derivatives, real-analysis

This is a question from a past exam. I thought that this was easy, but found no way of solving it.

Let $f:\mathbb R\rightarrow\mathbb R$ be twice continuously differentiable with $f''(x)\gt0$ for all $x\in\mathbb R$. Show that
(a) if $a\lt b$, then $f'(a)\lt\dfrac{f(b)-f(a)}{b-a}\lt f'(b)$.
(b) if $a\lt x\lt b$, then $f(x)\lt f(a)+\dfrac{f(b)-f(a)}{b-a}(x-a)$.
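For concreteness (just a sanity check, not part of the exam problem), with $f(x)=x^2$, $a=0$, $b=1$ the two claims read
$$f'(0)=0\lt\frac{f(1)-f(0)}{1-0}=1\lt f'(1)=2,\qquad x^2\lt x\ \text{ for } 0\lt x\lt1,$$
both of which clearly hold.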

For $(a)$, I thought that the mean-value theorem would suffice, but then I realized that if $\dfrac{f(b)-f(a)}{b-a}=f'(a)$, the strict inequality would fail. How can I rule this case out?
For $(b)$, I also tried the mean-value theorem, but to no avail. I then tried rewriting the inequality as $\dfrac{f(x)-f(a)}{x-a}\lt\dfrac{f(b)-f(a)}{b-a}$, but I got nothing out of this expression either, though I suspect it has something to do with the solution.
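Spelling out that rewriting: since $x-a\gt0$, subtracting $f(a)$ from both sides of $(b)$ and dividing by $x-a$ gives the equivalence
$$f(x)\lt f(a)+\frac{f(b)-f(a)}{b-a}(x-a)\iff\frac{f(x)-f(a)}{x-a}\lt\frac{f(b)-f(a)}{b-a},$$
so $(b)$ says exactly that the chord slope from $a$ to $x$ is smaller than the chord slope from $a$ to $b$.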
Thanks for any help in advance.

Best Answer

Hint: If $f$ is convex, then $\displaystyle g:x \mapsto \frac{f(x)-f(a)}{x-a}$ is monotonically non-decreasing on $(a,+\infty)$.

Let $a<x<y$. There exists $\lambda \in (0,1)$ such that $x= (1-\lambda)a+\lambda y$, hence, using the convexity of $f$ in the inequality, $\displaystyle g(x)= \frac{1}{\lambda}\cdot\frac{f((1-\lambda)a+\lambda y)-f(a)}{y-a} \leq \frac{1}{\lambda}\cdot\frac{(1-\lambda)f(a)+\lambda f(y) -f(a)}{y-a} = \frac{f(y)-f(a)}{y-a} = g(y)$.
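Here is one possible way to finish from the hint (a sketch, using the standard facts that $f''\gt0$ makes $f$ strictly convex and $f'$ strictly increasing, so the function $g$ above is in fact strictly increasing).

For $(b)$: take $y=b$ in the monotonicity of $g$. For $a\lt x\lt b$,
$$\frac{f(x)-f(a)}{x-a}\lt\frac{f(b)-f(a)}{b-a},$$
and multiplying by $x-a\gt0$ gives $f(x)\lt f(a)+\dfrac{f(b)-f(a)}{b-a}(x-a)$.

For $(a)$: since $f'(a)=\lim_{x\to a^+}g(x)$ and $g$ is increasing, $f'(a)\le g(x)\lt g(b)=\dfrac{f(b)-f(a)}{b-a}$ for any $x\in(a,b)$. Writing $h(x)=\dfrac{f(b)-f(x)}{b-x}$ (notation introduced just for this sketch), the same argument shows $h$ is strictly increasing on $(-\infty,b)$, and its left-hand limit at $b$ is $f'(b)$, so for $a\lt x\lt b$,
$$\frac{f(b)-f(a)}{b-a}=h(a)\lt h(x)\le\lim_{t\to b^-}h(t)=f'(b).$$

Alternatively, for $(a)$ alone: the mean value theorem gives some $c\in(a,b)$ with $f'(c)=\dfrac{f(b)-f(a)}{b-a}$, and since $f'$ is strictly increasing, $a\lt c\lt b$ forces $f'(a)\lt f'(c)\lt f'(b)$; this also resolves the worry in the question about the difference quotient equaling $f'(a)$.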