[Math] Between $2$ consecutive roots of $f'$, there is at least one root of $f$

algebra-precalculus, calculus, functions, real-analysis

Prove that between $2$ consecutive roots of $f'$, there is at least one root of $f$.

I understand that a root of $f'$ represents an extreme point. But, for example, $f(x) = \sin(x)+2$ has no roots, while its derivative, $\cos(x)$, has infinitely many consecutive roots.
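To spell the counterexample out (just the two observations above written as formulas):

$$\cos x = 0 \iff x = \frac{\pi}{2} + k\pi,\ k\in\mathbb{Z}, \qquad \text{while} \qquad \sin x + 2 \ge 1 > 0 \ \text{for all } x\in\mathbb{R},$$

so $f'(x) = \cos x$ has infinitely many pairs of consecutive roots, yet $f$ never vanishes.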

Ok, while I was writing this, I realized that it's not "at least" but "at most": between $2$ consecutive roots of $f'$ there is at most one root of $f$.

So, I understand that between $2$ consecutive maximum points, for example, there can be one root, but if the function tries to come back and make another root between the two maximum points, it has to create another local extremum between them. But how do I write this mathematically?

Let me try:

By Rolle's theorem, between $2$ consecutive roots $a$ and $b$ of $f$ (so $f(a) = f(b) = 0$), there must be a point $c\in\, ]a,b[$ where $f'(c) = 0$, i.e. a critical point.
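For reference, the statement of Rolle's theorem being invoked here (the standard hypotheses, nothing beyond what the problem assumes about $f$):

$$f \ \text{continuous on } [a,b],\ \text{differentiable on } ]a,b[,\ \ f(a)=f(b) \;\Longrightarrow\; \exists\, c\in\, ]a,b[ \ \text{such that } f'(c)=0.$$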

Or maybe, can I say the following:

Between two roots of $f'$, say $f'(m) = f'(n) = 0$, by Rolle's theorem we have:

$$\exists\, c\in\, ]m,n[ \ \text{such that } f''(c) = 0$$

so there's a maximum point between the roots, but I don't know how to prove that this is the only maximum point, or that this maximum point leads to only one root of $f$.

Best Answer

Suppose that $f'(a)=f'(b)=0$ with $a<b$ and that $f'\neq 0$ on $]a,b[$. If $f$ had two distinct roots $x<y$ in $]a,b[$, then, since $f(x)=f(y)=0$, Rolle's theorem would give a point $c\in\, ]x,y[\, \subset\, ]a,b[$ with $f'(c)=0$, contradicting $f'\neq 0$ on $]a,b[$. Hence $f$ has at most one root in $]a,b[$.
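A compact restatement of what this argument proves, with "consecutive" encoded as $f'\neq 0$ on the open interval between the two roots of $f'$:

$$f'(a)=f'(b)=0,\quad a<b,\quad f'(t)\neq 0 \ \text{for all } t\in\, ]a,b[ \;\;\Longrightarrow\;\; f \ \text{has at most one root in } ]a,b[.$$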