[Math] Prove $f$ is strictly increasing over an interval containing $a$

calculus

Suppose $f$ is differentiable over an open interval containing $a$. Suppose $f'(a) > 0$ and $f'$ is continuous at $a$. Prove $f$ is strictly increasing over an interval containing $a$.

Choose a sufficiently small interval $J = (b,c)$ so that $b < a < c$. Then by Rolle's Theorem there exists $c' \in J$ such that $f'(c') = 0$. Thus the function went from increasing to decreasing, or from decreasing to increasing, at some point.

Does this complete the proof or do I need to be more rigorous?

Best Answer

Since $f'$ is continuous at $a$ and $f'(a) > 0$, there is a $\delta>0$ such that $f'>0$ in the neighbourhood $(a-\delta, a+\delta)$. This is a general property of continuous functions and is not difficult to prove.
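
For completeness, here is one way to see it: apply the definition of continuity of $f'$ at $a$ with $\varepsilon = f'(a)/2 > 0$ to get a $\delta > 0$ such that
$$|x - a| < \delta \implies |f'(x) - f'(a)| < \frac{f'(a)}{2} \implies f'(x) > f'(a) - \frac{f'(a)}{2} = \frac{f'(a)}{2} > 0.$$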

Suppose $f$ is not strictly increasing in $(a-\delta, a+\delta)$; from this supposition, we will derive a contradiction. If $f$ is not strictly increasing, then there exist $x,y \in (a-\delta, a+\delta)$ with $y>x$ such that $f(y) \leq f(x)$, i.e. $$\frac{f(y) - f(x)}{y-x} \leq 0$$

By the mean value theorem, we thus have some $c \in (x,y) \subset (a-\delta, a+\delta)$ such that $f'(c) = \frac{f(y) - f(x)}{y-x} \leq 0$. However, this is absurd, since we have already demonstrated that $f'>0$ in $(a-\delta, a+\delta)$. We therefore conclude, by contradiction, that $f$ is strictly increasing in $(a-\delta, a+\delta)$.
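
The same mean value theorem step can also be phrased directly, without contradiction: for any $x, y \in (a-\delta, a+\delta)$ with $x < y$, the mean value theorem gives some $c \in (x,y)$ with
$$f(y) - f(x) = f'(c)\,(y - x) > 0,$$
since $f'(c) > 0$ and $y - x > 0$; hence $f(y) > f(x)$, and $f$ is strictly increasing in $(a-\delta, a+\delta)$.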
