[Math] Use the Mean Value Theorem to prove that if a continuous and differentiable function $f$ has two roots, then $f'$ has at least one root

calculus

Use the Mean Value Theorem to prove that if a continuous and differentiable function $f$ has two roots, then $f'$ has at least one root.

My thought process so far:

The Mean Value Theorem tells us that if a function $f$ is continuous on an interval $[a,b]$ and differentiable on $(a,b)$, then there is at least one $c\in (a,b)$ where

$$f'(c)=\frac{f(b)-f(a)}{b-a}.$$
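As a quick sanity check of the statement above (a worked example not in the original question), take $f(x)=x^2$ on $[0,2]$:

```latex
\[
\frac{f(2)-f(0)}{2-0} = \frac{4-0}{2} = 2,
\qquad
f'(c) = 2c = 2 \;\Longrightarrow\; c = 1 \in (0,2),
\]
```

so the theorem correctly produces a point $c$ in the open interval where the tangent slope matches the secant slope.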

If $f$ has two roots, it means that $f$ takes the value $0$ at two points.

If $f$ is continuous on an interval $[a,b]$ and has two roots in it, this should mean that there is a point $c\in (a,b)$ where $f'(c)=0$.

I'm thinking about approaching this with a proof by contradiction and assuming the opposite: $f(x)=0$ at two points, but $f'$ never takes the value $0$ (I'm not sure if this is the best way, though).

I'm getting stuck here and I'm not quite sure how the MVT applies to this task. At first, I was thinking of writing out the general form of functions that have two roots, for example $ax^2+bx+c$, but doesn't the degree of a polynomial only tell you the maximum number of roots it can have?

Best Answer

Let $f(x)$ have two roots at, say, $x_1$ and $x_2$ with $x_1 < x_2$. It thus follows that $f(x_1)=f(x_2)=0$.

Since $f$ is continuous on $[x_1,x_2]$, differentiable on $(x_1,x_2)$, and $f(x_1)=f(x_2)$, Rolle's Theorem gives a number $c_1$ in $(x_1,x_2)$ such that $f'(c_1)=0$.

If you want to use the MVT directly (Rolle's Theorem is the special case of the MVT where the endpoint values agree), apply it on $[x_1,x_2]$: there is a $c_1\in(x_1,x_2)$ with $f'(c_1)=\frac{f(x_2)-f(x_1)}{x_2-x_1}=\frac{0-0}{x_2-x_1}=0$.
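To see the argument in action, here is a concrete illustration (my own example, not part of the proof) with $f(x)=x^2-1$:

```latex
% f(x) = x^2 - 1 has roots x_1 = -1 and x_2 = 1, so f(-1) = f(1) = 0.
\[
f'(c_1) = \frac{f(1)-f(-1)}{1-(-1)} = \frac{0-0}{2} = 0
\quad\Longrightarrow\quad
2c_1 = 0 \;\Longrightarrow\; c_1 = 0 \in (-1,1).
\]
```

The MVT (equivalently, Rolle's Theorem) guarantees such a $c_1$ exists between the two roots; here it is $c_1=0$, where the parabola has its horizontal tangent.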