[Math] Strictly increasing and strictly decreasing derivatives

calculus, multivariable-calculus, proof-writing

How do I prove this? We don't have that strictly increasing implies $f' > 0$, but we do have the converse ($f' > 0$ implies strictly increasing); we also have Rolle's Theorem, the Mean Value Theorem, and the fact that if $f'$ exists then $f$ is continuous:

Suppose $f'$ exists on $(a, b)$ and $f'(x) \neq 0$ for all $x \in (a, b)$. Prove
that either $f'(x) > 0$ for all $x \in (a, b)$ or $f'(x) < 0$ for all $x \in (a, b)$.

My attempts don't seem rigorous.

Best Answer

I'm assuming the conclusion is supposed to be that $f'(x)$ is either always positive or always negative (rather than $f(x)$), since the conclusion is false for $f(x)$: if it were true for some function $f$, then a suitable vertical translation would produce a function which is sometimes positive and sometimes negative on $(a,b)$, without changing the derivative.

Assume there are points $d,e$ with $a\lt d\lt e\lt b$ and $f'(d)\lt 0 \lt f'(e)$.

Since $f$ is differentiable on $(a,b)$, it is continuous on $[d,e]\subset(a,b)$, so by the Extreme Value Theorem $f$ achieves a maximum and a minimum on $[d,e]$. The minimum is achieved at some $c\in[d,e]$; if $c$ were in the open interval $(d,e)$, it would be a critical point ($f'(c)=0$). By hypothesis there are no critical points, so the minimum must be achieved at $d$ or at $e$.

Show that the fact that $f'(d)\lt 0$ implies that $f$ cannot achieve its minimum over $[d,e]$ at $d$. One can do this by using the definition of derivative as a limit.

Added. Here is how one can prove this without trying to show that $f$ is decreasing on a neighborhood of $d$: by definition, $$\lim_{h\to 0}\frac{f(d+h)-f(d)}{h} = f'(d)\lt 0.$$ In particular, $$\lim_{h\to 0^+}\frac{f(d+h)-f(d)}{h} = f'(d)\lt 0.$$ By definition of the limit, for every $\epsilon\gt 0$ there exists $\delta\gt 0$ such that if $0\lt h\lt \delta$, then $\left|\frac{f(d+h)-f(d)}{h} - f'(d)\right|\lt \epsilon$. Take $\epsilon= |f'(d)|/2$, and let $\delta_0$ be the corresponding $\delta$. Then for any $h$ with $0\lt h\lt \delta_0$ we have $$\left|\frac{f(d+h)-f(d)}{h} - f'(d)\right| \lt \frac{|f'(d)|}{2}.$$ Since $f'(d)\lt 0$, this forces $$\frac{f(d+h)-f(d)}{h}\lt f'(d)+\frac{|f'(d)|}{2} = \frac{f'(d)}{2}\lt 0.$$ Since $h\gt 0$, that means that $f(d+h)-f(d)\lt 0$, i.e. $f(d+h)\lt f(d)$. This holds for all $h$ with $0\lt h\lt \delta_0$. That means that $f(d)$ is strictly larger than all values of $f$ on $(d,d+\delta_0)$, so $f(d)$ cannot be the minimum on $[d,d+\delta_0)$, and so cannot be the minimum on $[d,e]$.
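
For concreteness, here is the same estimate carried out on a specific function (the function $f(x) = x^2 - x$ and the point $d = 0$ are my own illustration, not part of the problem): here $f'(0) = -1 \lt 0$, and with $\epsilon = |f'(0)|/2 = 1/2$ one may take $\delta_0 = 1/2$, since for $0 \lt h \lt 1/2$, $$\frac{f(0+h)-f(0)}{h} = \frac{h^2 - h}{h} = h - 1 \lt -\frac{1}{2} \lt 0,$$ so $f(h) \lt f(0)$ for all such $h$, and $f(0)$ cannot be the minimum of $f$ on any interval $[0, e]$ with $e \gt 0$.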

Likewise, show that the fact that $f'(e)\gt 0$ implies that $f$ cannot achieve its minimum over $[d,e]$ at $e$. This contradiction arises from the assumption that we can find points $d$ and $e$ with $f'(d)\lt 0 \lt f'(e)$. Replacing $f$ with $-f$ shows you cannot find points $d$ and $e$ with $f'(d)\gt 0 \gt f'(e)$ either. Since $f'$ never vanishes and cannot take both a negative and a positive value on $(a,b)$ in either order, it must be either positive at every point of $(a,b)$ or negative at every point of $(a,b)$.
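
As a side remark (this example is my addition, not part of the argument above), the hypothesis that $f'$ never vanishes cannot be dropped: for $f(x) = x^2$ on $(-1,1)$ we have $$f'(x) = 2x,$$ which is negative on $(-1,0)$ and positive on $(0,1)$; the conclusion fails exactly because $f'(0) = 0$ supplies the critical point that the argument above rules out.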