The Mean Value Theorem (Increasing at a Point)

calculus, continuity, derivatives, real-analysis

As I was going over some important topics in Calculus (by Salas), I came across the following theorem:

"Suppose that $\ f(x)\ $ differentiable at $\ x=c,\ $ if $\ f'(c)>0,\ $ then

$f (c − h) < f (c) < f (c + h)\ $
for all positive $\ h\ $ sufficiently small. "

But sometimes we cannot find such an $h$ at a point even though the derivative there is positive; for example:

$$f(x)=\begin{cases} x+2x^2\sin(1/x) & x\neq 0\\ 0 & x=0 \end{cases}$$
The derivative at $0$ equals $1$; however, it does not seem possible to find any open interval around zero satisfying the condition, since the derivative fluctuates between negative and positive values near zero, and it seems the condition can hold only if the function is increasing near the point. So, did I make a mistake, and what is the correct understanding?
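
To make the oscillation concrete, here is a small numerical sketch (plain Python, just my own check of the claims above, not from Salas): the difference quotients at $0$ approach $1$, while $f'$ equals $-1$ at $x=1/(2\pi k)$ and $3$ at $x=1/((2k+1)\pi)$, points arbitrarily close to $0$.

```python
import math

# f from the question: f(x) = x + 2x^2 sin(1/x) for x != 0, and f(0) = 0
def f(x):
    return x + 2 * x**2 * math.sin(1 / x) if x != 0 else 0.0

# closed form of f'(x) for x != 0, by the product and chain rules
def fprime(x):
    return 1 + 4 * x * math.sin(1 / x) - 2 * math.cos(1 / x)

# the difference quotient at 0 approaches 1, consistent with f'(0) = 1
for h in (1e-2, 1e-4, 1e-6):
    print(h, (f(h) - f(0)) / h)

# yet f' takes both signs at points arbitrarily close to 0:
# f'(1/(2*pi*k)) = -1 and f'(1/((2k+1)*pi)) = 3 for every positive integer k
for k in (10, 100, 1000):
    print(fprime(1 / (2 * math.pi * k)), fprime(1 / ((2 * k + 1) * math.pi)))
```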

Best Answer

It is true that $f$ is not increasing or decreasing in any interval containing the origin, but that is not what the theorem claims.

We can verify the conclusion of the theorem in this case directly: for $0 < |x| < 1/2$ we have $$ 1+2x \sin(1/x) \ge 1-2|x| > 0 \, , $$ so that $f(x) = x\bigl(1+2x\sin(1/x)\bigr)$ has the same sign as $x$, and therefore $f(-h) < f(0) = 0 < f(h)$ for $0 < h < 1/2$.
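
For anyone who wants to see this numerically, here is a minimal Python sketch of the same spot check (the function f below simply transcribes the definition from the question):

```python
import math

# same f as in the question
def f(x):
    return x + 2 * x**2 * math.sin(1 / x) if x != 0 else 0.0

# spot-check f(-h) < f(0) < f(h) for a range of small positive h < 1/2
hs = [m * 10**(-k) for k in range(1, 8) for m in (1, 2, 3, 4)]
print(all(f(-h) < f(0) < f(h) for h in hs))  # expected: True
```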

In other words: for fixed $c$, the statements

$f(c-h) < f(c) < f(c+h)$ for sufficiently small positive $h$.

and

$f$ is increasing on $[c-h, c+h]$ for sufficiently small positive $h$.

are different. The former holds if $f'(c) > 0$, but not necessarily the latter.
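
As a minimal illustration of the difference, here is a Python sketch (using the same $f$ and a sample point $1/(2\pi k)$ chosen only for convenience): at $x_k = 1/(2\pi k)$ we have $f'(x_k) = -1$, so $f$ decreases locally there, which rules out monotonicity on any interval around $0$, even though $f(-h) < f(0) < f(h)$ still holds.

```python
import math

# same f as in the question
def f(x):
    return x + 2 * x**2 * math.sin(1 / x) if x != 0 else 0.0

# Near x_k = 1/(2*pi*k) we have sin(1/x_k) = 0 and cos(1/x_k) = 1, so
# f'(x_k) = 1 + 4*x_k*sin(1/x_k) - 2*cos(1/x_k) = -1: f decreases locally there.
k = 100                       # any large k gives a point as close to 0 as we like
a = 1 / (2 * math.pi * k)     # a point where f' = -1
b = a + 1e-8                  # a slightly larger point (step chosen for illustration)
print(a < b and f(a) > f(b))  # expected: True, so f is not increasing on [-b, b]
```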