Using the Mean Value Theorem, show that if $f'(x) > 0$ $\forall x \in (a, b)$ then $f$ is increasing on $(a, b)$

calculus, derivatives, functions, proof-verification

Using the Mean Value Theorem, show that if $f'(x) > 0$ $\forall x \in (a, b)$ then $f$ is increasing on $(a, b)$.


The Mean Value Theorem states: a function $f$ which is continuous on the closed interval $[a, b]$ $^{\textbf{(1)}}$ and differentiable on the open interval $(a, b)$ $^{\textbf{(2)}}$ has at least one value $c$ with $a < c < b$ where $f'(c) = \dfrac{f(b) - f(a)}{b - a}$.
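
As a quick sanity check of the statement (my own example, not part of the proof below): take $f(x) = x^2$ on $[0, 2]$. The theorem guarantees some $c \in (0, 2)$ with
$$f'(c) = \frac{f(2) - f(0)}{2 - 0} = \frac{4 - 0}{2} = 2,$$
and indeed $c = 1$ works, since $f'(x) = 2x$.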

Set $f(x)$ to be some function such that $f'(x) > 0$ $\forall x \in (a, b) \implies f(x)$ is continuous $\forall x \in [a, b]$ $^{\textbf{(1)}}$ and differentiable $\forall x \in (a, b)$ $^{\textbf{(2)}}$. Hence, $\exists c: f'(c) = \dfrac{f(b) - f(a)}{b - a} \implies f'(c)(b - a) + f(a) = f(b) \implies f(b) > f(a) \therefore f$ is increasing on $(a, b)$.


Is this proof sufficient?

Best Answer

No, your proof only proves that $f(b) > f(a)$. For example, the function $f(x) = x^2$ on the interval $(-1, 2)$ also satisfies the condition that $f(2) > f(-1)$, but $f$ is clearly not increasing on all of $(-1, 2)$.
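
To see it concretely, filling in the numbers for this counterexample: for two points inside the interval, say $-0.9 < -0.1$,
$$f(-0.9) = 0.81 > 0.01 = f(-0.1),$$
so $f$ fails to be increasing on $(-1, 2)$ even though the endpoint values satisfy $f(2) > f(-1)$.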

You need to prove that, for any pair $x,y\in(a,b)$, if $x<y$, then $f(x)<f(y)$.
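
One way to finish along those lines, still using the Mean Value Theorem (a sketch of the intended argument): fix any $x, y \in (a, b)$ with $x < y$. Then $f$ is continuous on $[x, y]$ and differentiable on $(x, y)$, so the theorem applied to $[x, y]$ (rather than to $[a, b]$) gives some $c \in (x, y)$ with
$$f(y) - f(x) = f'(c)\,(y - x) > 0,$$
since $f'(c) > 0$ and $y - x > 0$. Hence $f(x) < f(y)$, and as $x < y$ were arbitrary points of $(a, b)$, $f$ is increasing on $(a, b)$.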