Does $f'(x)\neq 0$ already imply that a function $f:\mathbb{R}\to\mathbb{R}$ is monotonic?

derivatives, real-analysis

I am considering a differentiable function $f: I\to\mathbb{R}$, where $I\subseteq\mathbb{R}$ is a connected interval, with the condition $f'(x)\neq 0$ for every $x\in I$.

Now I am asking myself how "messy" a derivative can look under these circumstances.

I want to prove that such a function is already strictly monotonic (either increasing or decreasing).

The example

$f:\mathbb{R}\setminus\{0\}\to\mathbb{R}, x\mapsto 1/x$ shows that the connectedness of the domain is necessary.
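To make this explicit: the derivative of this example never vanishes,
$$f'(x)=-\frac{1}{x^2}\neq 0\quad\text{for all }x\neq 0,$$
yet $f(-1)=-1<1=f(1)$ while $f(1)=1>\tfrac{1}{2}=f(2)$, so $f$ is strictly decreasing on each of $(-\infty,0)$ and $(0,\infty)$ but not monotonic on its whole (disconnected) domain.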

I tried to prove it like this.

Let $a,b\in I$ with $a<b$. Now I want to show that $f(a)<f(b)$ or $f(a)>f(b)$.

Using the mean-value theorem, I have

$\frac{f(a)-f(b)}{a-b}=f'(\xi)\neq 0$ for some $\xi\in(a,b)$. Hence this quotient is positive or negative.

Then $f(a)-f(b)=f'(\xi)(a-b)$ is positive or negative, i.e. $f(a)\neq f(b)$.

The problem is that I would have to prove that the sign of this difference is the same for every pair $a,b\in I$: once positive (or negative), always positive (or negative). This is not immediate.

I cannot come up with a counterexample to this, but it looks as if this should require $f$ to be continuously differentiable: deducing from $f'(x)\neq 0$ that $f'$ is always positive or always negative seems like an intermediate-value kind of argument. I wonder whether one can waive this additional condition when the domain is a connected interval.

In other words:

How messy can a derivative really look?

Thanks in advance.

Best Answer

Continuous differentiability is not necessary; the answer to your title question is yes. Of course, without continuous differentiability we cannot invoke the standard intermediate value theorem on the derivative $f'$, because $f'$ is not assumed continuous. On the other hand, derivatives do have the intermediate-value property; this is known as Darboux's theorem.
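For reference, the statement being used here is: if $f$ is differentiable on $[a,b]$ and $y$ lies strictly between $f'(a)$ and $f'(b)$, then
$$\exists\,\xi\in(a,b)\ \text{such that}\ f'(\xi)=y,$$
even if $f'$ is nowhere near continuous.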

Suppose for contradiction there exist $a,b\in I$ with $a<b$ such that $f'(a)$ and $f'(b)$ have opposite signs. Since $I$ is assumed connected, the entire interval $[a,b]$ lies in $I$. Hence, because $0$ lies strictly between $f'(a)$ and $f'(b)$, Darboux's theorem yields a point of $(a,b)$ at which $f'$ takes the value $0$; but this contradicts the hypothesis.

Hence we have either $f'>0$ on $I$, or $f'<0$ on $I$. In the first case $f$ is strictly increasing (by the mean value theorem), while in the second case $f$ is strictly decreasing.
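As for how messy $f'$ can really look: it need not be continuous, even under your hypothesis. A standard example (a variant of the classic $x^2\sin(1/x)$), taken on $I=(-\tfrac12,\tfrac12)$, is
$$f(x)=\begin{cases}2x+x^2\sin(1/x), & x\neq 0,\\ 0, & x=0,\end{cases}\qquad f'(x)=\begin{cases}2+2x\sin(1/x)-\cos(1/x), & x\neq 0,\\ 2, & x=0.\end{cases}$$
Here $f'(x)\geq 2-2|x|-1=1-2|x|>0$ on $I$, so the hypothesis holds and $f$ is strictly increasing, yet $f'$ has no limit at $0$ because of the $\cos(1/x)$ term. So the derivative can oscillate badly while never vanishing; what it can never do, by the argument above, is change sign.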
