Is this proof of Darboux's Theorem valid?

calculus, continuity, derivatives, fake-proofs, real-analysis

Let $f:[a,b]\to \Bbb R$ satisfy the following:

(i) $f$ is differentiable on $[a,b],$

(ii) $f'(a)=\alpha$ and $f'(b)=\beta$ with $\alpha\neq\beta,$ and

(iii) $\gamma$ is a given real number with $\gamma\in (\alpha,\beta).$

Then there exists at least one value of $x$, say $\xi$, between $a$ and $b$ such that $f'(\xi)= \gamma.$

I tried to prove the above theorem popularly called "Darboux's Theorem" as follows:

Define $F(x)=f(x) - \gamma(x - a)$, so that $F'(x)= f'(x) - \gamma.$ Since $F'(x)$ exists on $[a, b],$ $F(x)$ is continuous there and must attain its lower bound $m$ at some point $\xi$ of the interval. So $\xi$ will be a point of local extremum of $F$, which means $F'(\xi)=0\implies f'(\xi)=\gamma.$ This proves the theorem.


I want to know whether the above proof is valid. I ask because almost all the books I have gone through so far do this step in a more roundabout way, as below.

The part where I wrote:

So $\xi$ will be a point of local extremum of $F$, which means $F'(\xi)=0\implies f'(\xi)=\gamma.$

is done like this in the books:

But since $F'(a)=f'(a)-\gamma < 0$ and $F'(b)=f'(b)-\gamma> 0,$ $\xi$ cannot be $a$ or $b.$ This is because of the following theorem:

According as $f'(c)$ is positive or negative, $f(x)$ is increasing or decreasing in a suitably restricted neighborhood of $c.$

Hence at some point $\xi$ between $a$ and $b,$ $F(x)$ attains its lower bound $m$ and $F'(\xi)=0.$ Hence $f'(\xi)=\gamma$ for $a < \xi< b.$ This proves the result.

But I feel we don't need to do it in such a complicated way. Since nearly all the books that I have gone through handle it in the way above, i.e. by invoking the theorem:

According as $f'(c)$ is positive or negative, $f(x)$ is increasing or decreasing in a suitably restricted neighborhood of $c$

I am confused about whether my proof works at all.

Best Answer

Your proof cannot be correct, because it does not use condition (iii), namely that $\gamma\in (\alpha,\beta)$. So let us consider an example which shows that the theorem is false without that condition, and check where your proof fails.

The function $f:[0, 1] \to \Bbb R$, $f(x) = x^2$, satisfies conditions (i) and (ii) with $\alpha = f'(0) = 0 \ne \beta = f'(1) = 2$. Its derivative $f'(x) = 2x$ takes exactly the values in $[0, 2]$ on that interval. So what if $\gamma = 3$, say?

Your proof considers the function $F(x)=f(x) - \gamma(x - a) = x^2 - 3x$, $F'(x) = 2x - 3$. Here is a graph of that function:

[Graph of $F(x) = x^2 - 3x$ on $[0, 1]$]

It is correct that $F$ attains its minimum at some point $\xi \in [0, 1]$. If $\xi$ is an interior point of the interval, then one can conclude that $F'(\xi) = 0$. But that is not necessarily the case. In our example the minimum is attained at $\xi=1$, and $F'(1) = -1 < 0$. That is why your proof is not valid.
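For a quick concrete check, here is a minimal numerical sketch of this counterexample (assuming NumPy; the grid resolution is arbitrary) that locates the minimizer of $F$ on a fine grid of $[0,1]$:

```python
import numpy as np

# Counterexample: f(x) = x^2 on [0, 1] with gamma = 3,
# which lies OUTSIDE (f'(0), f'(1)) = (0, 2).
a, b, gamma = 0.0, 1.0, 3.0

def F(x):
    return x**2 - gamma * (x - a)   # F(x) = f(x) - gamma*(x - a) = x^2 - 3x

def F_prime(x):
    return 2.0 * x - gamma          # F'(x) = f'(x) - gamma = 2x - 3

xs = np.linspace(a, b, 100_001)     # fine grid of [0, 1]
xi = xs[np.argmin(F(xs))]           # approximate minimizer of F on [a, b]

print(f"minimizer xi = {xi:.4f}")        # prints 1.0000: the right endpoint
print(f"F'(xi)       = {F_prime(xi):.4f}")  # prints -1.0000, not 0
```

The minimizer lands on the right endpoint $\xi = 1$, which is exactly where the interior-extremum argument breaks down.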

The “real” proof uses $\alpha < \gamma < \beta$, which means that $F'(a) < 0$ and $F'(b) > 0$, and concludes that the minimum is not attained at an endpoint of the interval. The rest of the proof is as in your argument.
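For completeness, that endpoint exclusion can be written out as a one-sided difference-quotient estimate. Since
$$F'(a) \;=\; \lim_{h\to 0^{+}} \frac{F(a+h)-F(a)}{h} \;<\; 0,$$
the quotient is negative for all sufficiently small $h>0$, i.e. $F(a+h) < F(a)$, so $F(a)$ is not the minimum value $m$. Symmetrically, $F'(b) > 0$ gives $F(b-h) < F(b)$ for small $h>0$, so the minimum is not attained at $b$ either. Hence $\xi \in (a,b)$, and there $F'(\xi)=0$ applies.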

Summary: In your proof, you have used the following theorem:

Let $c$ be an interior point of the interval $I$ at which $f:I\to \Bbb R$ has a relative extremum. If the derivative of $f$ at $c$ exists, then $f'(c) = 0.$

It is true that $a\leq \xi\leq b$, but the theorem can only be applied if $a< \xi < b$, which we cannot guarantee in your argument. If $\xi = a$ or $\xi = b$, the theorem does not apply.
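The reason the interior hypothesis matters is that the proof of that theorem compares two one-sided difference quotients: for a minimizer $\xi \in (a,b)$ and small $h>0$,
$$\frac{F(\xi+h)-F(\xi)}{h} \;\ge\; 0 \qquad\text{and}\qquad \frac{F(\xi-h)-F(\xi)}{-h} \;\le\; 0,$$
so letting $h \to 0^{+}$ gives $F'(\xi) \ge 0$ and $F'(\xi) \le 0$, hence $F'(\xi) = 0$. At an endpoint only one of the two quotients is available, so one only gets a one-sided inequality, exactly as in the example above, where $F'(1) = -1 \le 0$.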
