[Math] Show that if f is differentiable and f'(x) ≥ 0 on (a, b), then f is strictly increasing

real-analysis

Show that if $f$ is differentiable and $f'(x) \ge 0$ on $(a, b)$, then $f$ is strictly increasing, provided there is no subinterval $(c, d)$ with $c < d$ on which $f'$ is identically zero.
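(As a sanity check on the hypotheses, here are two examples I keep in mind; they are just my own illustrations, not part of the exercise:)

```latex
% f(x) = x^3 is strictly increasing even though f'(0) = 0,
% so f' >= 0 need not be strictly positive at every point.
f(x) = x^{3}, \qquad f'(x) = 3x^{2} \ge 0, \qquad f'(0) = 0.
% By contrast, any f that is constant on some subinterval (c,d) of (a,b)
% has f' >= 0 but is not strictly increasing; the extra hypothesis
% rules out exactly this case.
```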

So far I'm trying to do this by contradiction:

Suppose not; that is, suppose we have a function $f$ where $f'(x) \ge 0$ on $(a,b)$, where $f'$ is not identically $0$ on any subinterval of $(a,b)$, and $f$ is not strictly increasing. Since $f$ is not strictly increasing, there exist $x_1$ and $x_2$ with $a < x_1 < x_2 < b$ and $f(x_1) = f(x_2)$. Then for all $y \in [x_1, x_2]$, $f(y) = f(x_2)$, which means that $f$ is constant and $f'(y) = 0$.

Since $f'(y) = 0$ for all $y \in [x_1, x_2]$, this means $f'$ is identically $0$ on a subinterval, which is a contradiction. Thus $f$ is strictly increasing. $\square$

I'm not sure if there is a better way to do this but any help or comments would be appreciated!

Best Answer

Your idea of going by contradiction is a good one, but it is not carried out all that well. In particular, your argument that there exist $x_1 < x_2$ such that $f(x_1) = f(x_2)$ is fine, but it is not clear how you conclude from that that $f$ is constant on $[x_1, x_2]$.

My advice is either to use Rolle's theorem on $[x_1, x_2]$, or to restart your proof: write out exactly what "not strictly increasing" means, and go with the classics by employing Lagrange's mean value theorem.
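For reference, here is roughly how the mean value theorem route could be laid out (a sketch only; filling in the details, in particular the weak-monotonicity step, is left to you):

```latex
\begin{proof}[Sketch via the mean value theorem]
Since $f' \ge 0$ on $(a,b)$, the mean value theorem shows $f$ is (weakly) increasing:
for $u < v$ in $(a,b)$ there is $\xi \in (u,v)$ with $f(v) - f(u) = f'(\xi)(v-u) \ge 0$.

Suppose $f$ is not strictly increasing, i.e.\ there are $x_1 < x_2$ in $(a,b)$ with
$f(x_1) \ge f(x_2)$. Weak monotonicity forces $f(x_1) = f(x_2)$, and then for every
$y \in [x_1, x_2]$ we get $f(x_1) \le f(y) \le f(x_2) = f(x_1)$, so $f$ is constant on
$[x_1, x_2]$. Hence $f' \equiv 0$ on $(x_1, x_2)$, contradicting the hypothesis, and
therefore $f$ is strictly increasing.
\end{proof}
```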