Let $f$ be continuous. If $f(x) = 0 \implies f$ is strictly increasing at $x$, then $f$ has at most one root.

continuity, real-analysis, roots, solution-verification

This is similar to this question I asked yesterday. I just need someone to check my proof (or offer an alternative proof) of the following statement:

Let $f : \mathbb R \rightarrow \mathbb R$ be a continuous function. If $f(x) = 0 \implies f$ is strictly increasing on an open neighbourhood of $x$, then $f$ has at most one root.
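
To be explicit about the hypothesis: by "strictly increasing on an open neighbourhood of $x$" I mean that there is a $\delta > 0$ such that
$$s, t \in (x - \delta, x + \delta) \ \text{ and } \ s < t \implies f(s) < f(t).$$
In particular, if $f(x) = 0$ this forces $f < 0$ on $(x - \delta, x)$ and $f > 0$ on $(x, x + \delta)$, which is the sign information I use below.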

Here's my attempt at a proof by contradiction.

Case 1. Let $x_1 < x_2$ be two roots with no other root in $(x_1,x_2)$. Since $f$ is strictly increasing on a neighbourhood of each root, we can find $\delta > 0$ such that, for $i = 1, 2$, $f > 0$ on $(x_i, x_i+\delta)$ and $f < 0$ on $(x_i-\delta, x_i)$.
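
Spelling this out: shrinking $\delta$ if necessary, we may assume $\delta < \frac{x_2 - x_1}{2}$, so that the two intervals above are disjoint and contained in $(x_1, x_2)$. We can then pick points
$$a \in (x_1, x_1+\delta), \qquad b \in (x_2-\delta, x_2),$$
with $a < b$, $f(a) > 0$ and $f(b) < 0$.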

Using the intermediate value theorem we can then find another root $c$ strictly between $x_1$ and $x_2$, a contradiction.

Case 2. By the first part we can always find a root of $f$ between any two given roots of $f$. Let $x_1 < x_2$ be two roots.

We will show that $f = 0$ on $(x_1,x_2)$, which contradicts the fact that $f$ is strictly increasing at its roots.

Let $\tilde x \in (x_1,x_2).$

Define $$x_1' = \sup \{ x \in [x_1,\tilde x] : f(x) = 0 \}$$
$$x_2' = \inf\{x \in [\tilde x,x_2]: f(x) = 0\}.$$

Since $x_i'$ is the $\inf$ (or $\sup$) of a nonempty bounded set (the first set contains $x_1$ and the second contains $x_2$), we can find a sequence of roots which converges to $x_i'$, so by continuity of $f$ we have $f(x_1') = f(x_2') = 0.$
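
Explicitly, for each $i$ there are roots $r_n$ with $r_n \to x_i'$ (take $r_n = x_i'$ itself if $x_i'$ lies in the set), and then
$$f(x_i') = \lim_{n \to \infty} f(r_n) = 0.$$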

Clearly $x_1' \leq \tilde x \leq x_2'$, so we need only consider the following two cases:

  • If $\tilde x = x_1'$ or $\tilde x = x_2'$ then $f(\tilde x) = 0.$

  • If $\tilde x \in (x_1',x_2')$ then, since $x_1'$ and $x_2'$ are roots, we can find another root $c$ in $(x_1',x_2')$. If $\tilde x \leq c$ then $c$ is a root in $[\tilde x, x_2]$ with $c < x_2'$, which contradicts the definition of $x_2'$; similarly, if $c \leq \tilde x$ then $c$ is a root in $[x_1, \tilde x]$ with $c > x_1'$, which contradicts the definition of $x_1'$.

The second case is therefore impossible, so we must have $f(\tilde x) = 0.$

Therefore $f$ cannot have more than one root: otherwise $f$ would be equal to $0$ on an interval, which contradicts the fact that $f$ is strictly increasing on a neighbourhood of each of its roots.

Best Answer

It looks correct, except for one thing. In Case 1, you wrote “Let $x_1<x_2$ be two roots with no other root in $(x_1,x_2)$.” What you wrote after that is fine. But then, in Case 2, you wrote “By the first part we can always find a root of $f$ between any two given roots of $f$”. But in Case 1 you had an extra assumption, namely that there is no root between $x_1$ and $x_2$. So, you cannot apply Case 1 to any two given roots of $f$.

My suggestion then is this: do your proof in two steps:

  1. prove that between any two distinct roots of $f$ there has to be another root;
  2. use this to prove what you want to prove.
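
Concretely (just restating your argument in that order), step 1 would be the claim
$$f(x_1) = f(x_2) = 0 \ \text{ and } \ x_1 < x_2 \implies \exists\, c \in (x_1, x_2) \ \text{ with } \ f(c) = 0,$$
which you can prove by contradiction exactly as in your Case 1 (assume $(x_1,x_2)$ contains no root and apply the sign argument and the intermediate value theorem), and step 2 is then essentially your Case 2.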