Understanding a proof of the general derivative test via the Taylor formula.

derivatives · real-analysis · taylor-expansion

I'm trying to understand a proof of the general derivative test which states:

Let $f:[a,b]\rightarrow \mathbb{R}$ be a function that is $n$ times differentiable, and let $x_0 \in\,]a,b[$ be a point with $f^{\prime}(x_0)=0, f^{\prime\prime}(x_0)=0, \dots, f^{(n-1)}(x_0)=0$ but $f^{(n)}(x_0)\neq 0$.

It follows, that:

If $n$ is odd, $f$ does not have a local extremum at $x_0$.

If $n$ is even, $f$ has a local extremum at $x_0$ (a maximum if $f^{(n)}(x_0)<0$ and a minimum if $f^{(n)}(x_0)>0$).

$\underline{\text{Proof}}$ (From the Taylor-Formula):

$f(x)=f(x_0)+\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n+R_n(x,x_0)$ with $\lim_{x\to x_0 }\frac{R_n(x,x_0)}{(x-x_0)^n}=0$. (Which we can assume to have been proven).

It follows, that for sufficiently small differences between $x$ and $x_0$:

$|\frac{R_n(x,x_0)}{(x-x_0)^n}|<\frac{1}{2}|\frac{f^{(n)}(x_0)}{n!}|$ (This is the first argument I can't understand).

That's why $\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n+R_n(x,x_0)$ has the same (negative/positive) sign as $\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n$ (Now I get completely lost).

With odd $n$, $\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n+R_n(x,x_0)$ changes its sign at $x_0$, which is why we have no local extremum.

With even n, we have

$(x-x_0)^n \geq 0$, so $f(x) \geq f(x_0)$ if $f^{(n)}(x_0)>0$, giving a minimum,

or $f(x) \leq f(x_0)$ if $f^{(n)}(x_0)<0$, giving a maximum.

Now, I'm really not sure whether there's an error in the script I'm reading or whether I'm just missing something. Since I can't find this proof anywhere else, I wanted to ask whether anyone has hints, suggestions on where to look, or a direct explanation.

Best Answer

The idea is the following: look at the definition of an extremum. A point $x_0$ is an extremum if it is either a local minimum or a local maximum.

In other words, in a very small neighbourhood around $x_0$, the quantity $f(x) - f(x_0)$ has the same sign: if it is positive, then $f(x_0)$ is a minimum (because that's the same as saying that $f(x) \geq f(x_0)$ in the neighbourhood); otherwise it is a maximum. If either of these happens, then $x_0$ is an extremum.

Therefore, to show that a point is (or is not) an extremum, one merely needs to study the quantity $f(x) - f(x_0)$ in a small enough neighbourhood of $x_0$. We must essentially study the sign of $f(x) - f(x_0)$: if it changes sign, then $x_0$ is not an extremum. If it doesn't change sign, then depending upon which sign it takes we'd have a local minimum or maximum.
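As a quick numerical sanity check of that strategy (my own illustration, not part of the original proof; the function choices and step size are arbitrary), one can look at the sign of $f(x) - f(x_0)$ just left and right of $x_0$ for two model functions:

```python
# f(x) = x**3 has f'(0) = f''(0) = 0, f'''(0) = 6 != 0  (n = 3, odd),
# f(x) = x**4 has f'(0) = ... = f'''(0) = 0, f''''(0) = 24 != 0  (n = 4, even).

def sign_profile(f, x0, h=1e-3):
    """Return the signs of f(x) - f(x0) just left and right of x0."""
    left = f(x0 - h) - f(x0)
    right = f(x0 + h) - f(x0)
    to_sign = lambda v: "+" if v > 0 else "-" if v < 0 else "0"
    return to_sign(left), to_sign(right)

print(sign_profile(lambda x: x**3, 0.0))  # ('-', '+'): sign change, no extremum
print(sign_profile(lambda x: x**4, 0.0))  # ('+', '+'): no sign change, minimum
```

The cubic changes sign at $x_0 = 0$ (no extremum), while the quartic does not (a minimum), exactly as the test predicts.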

And so, on to the study of $f(x) - f(x_0)$.


You have mentioned that you know Taylor's theorem. Therefore, under the given conditions on the derivatives, if I write the expansion $f(x) = f(x_0) + \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n + R_n(x,x_0)$, you should be comfortable with each of the terms, and acknowledge the fact that $\lim_{x \to x_0} \frac{R_n(x,x_0)}{(x-x_0)^n} = 0$.

Therefore, $f(x) - f(x_0) = \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n + R_n(x,x_0)$. The left hand side is what we want to study: with this equality we may now study the sign changes of the right hand side!

For convenience, let us perform the following rewrite of the right hand side: $$ \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n + R_n(x,x_0) = (x-x_0)^n\left(\frac{f^{(n)}(x_0)}{n!}+\frac{R_n(x,x_0)}{(x-x_0)^n}\right) $$

The sign of the product is the product of the signs! Therefore, we are reduced to studying the signs of the two factors on the right hand side.


For $(x-x_0)^n$, we note that the sign depends on the parity of $n$: if $n$ is even, then this is always positive (for $x \neq x_0$). If $n$ is odd, then this is positive if and only if $x > x_0$. To finish the analysis, note that every neighbourhood of $x_0$ contains points on both sides of $x_0$: so the conclusion is "if $n$ is odd, this expression changes sign; if $n$ is even, it does not".
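A tiny illustration of this parity observation (my own, with $x_0 = 0$ and the sample points $\pm 0.5$ chosen arbitrarily):

```python
# Sign of (x - x0)**n on either side of x0 = 0, for odd and even n.
for n in (3, 4):
    left = (-0.5) ** n    # a point with x < x0
    right = (0.5) ** n    # a point with x > x0
    print(n, left > 0, right > 0)
# n = 3 (odd):  left negative, right positive -> sign change
# n = 4 (even): both positive                 -> no sign change
```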


For the other factor, we note that $\lim_{x \to x_0} \frac{R_n(x,x_0)}{(x-x_0)^n}=0$ by the Taylor expansion. Adding in the $f^{(n)}$ part (it's just a constant depending on $x_0$), we get: $$ \lim_{x \to x_0} \left(\frac{f^{(n)}(x_0)}{n!} + \frac{R_n(x,x_0)}{(x-x_0)^n}\right) = \frac{f^{(n)}(x_0)}{n!} $$

So we have a limit. Which is excellent, and you'll see why.

In your head, imagine a wild sequence, but still one converging to a positive number, say $0.01$. Can you prove rigorously why it must be positive after some time?

The answer would roughly be this: if the sequence converges to $0.01$, then we can get it as close to $0.01$ as we want by going far enough up the sequence! And once the terms are very, very close to $0.01$, they must certainly be positive, because going far enough up the sequence dragged them to this side of $0$.

The same thing happens here: note that $\frac{f^{(n)}(x_0)}{n!}$ is not zero, so it has a definite sign. The limit ensures that the quantity whose limit is being taken will also have that same sign once $x$ is close enough to $x_0$ (the analogue of going far enough up the sequence).

The proof of this fact I leave you to decipher from the other answer or from the explanation above. Or fine: take $\epsilon = \frac 12\left|\frac{f^{(n)}(x_0)}{n!}\right|$ and see what you get from the definition of the limit.
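Written out, that $\epsilon$ argument (the very step you flagged as the first one you couldn't understand) runs as follows:

```latex
\text{Take } \epsilon = \frac12\left|\frac{f^{(n)}(x_0)}{n!}\right| > 0.
\text{ Since } \lim_{x\to x_0}\frac{R_n(x,x_0)}{(x-x_0)^n}=0,
\text{ there is } \delta>0 \text{ such that}
\left|\frac{R_n(x,x_0)}{(x-x_0)^n}\right|
  < \frac12\left|\frac{f^{(n)}(x_0)}{n!}\right|
\quad\text{whenever } 0<|x-x_0|<\delta.
\text{For such } x, \text{ the reverse triangle inequality gives}
\left|\frac{f^{(n)}(x_0)}{n!}+\frac{R_n(x,x_0)}{(x-x_0)^n}\right|
\ge \left|\frac{f^{(n)}(x_0)}{n!}\right|
  - \left|\frac{R_n(x,x_0)}{(x-x_0)^n}\right|
> \frac12\left|\frac{f^{(n)}(x_0)}{n!}\right| > 0,
\text{and since the sum lies within } \frac12\left|\frac{f^{(n)}(x_0)}{n!}\right|
\text{ of } \frac{f^{(n)}(x_0)}{n!},
\text{ it has the same sign as } f^{(n)}(x_0).
```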

To conclude, the second factor takes on the same sign as $f^{(n)}(x_0)$ (since $n!$ is always positive) in a small enough neighbourhood of $x_0$. So it won't change sign!


And now we can finish :

  • If $n$ is odd, then the first factor changes sign but the second doesn't: this means that in total there is a sign change at $x_0$, so $x_0$ is not an extremum.

  • If $n$ is even, then neither factor changes sign, so $f(x) - f(x_0)$ doesn't change sign, giving that $x_0$ is an extremum. Of course, $f(x) - f(x_0)$ then has the same sign as $f^{(n)}(x_0)$: so if $f^{(n)}(x_0)$ is positive, this is positive and we have a minimum. Else, we have a maximum!
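The finished test can be sketched programmatically. This is my own illustrative implementation (not from the answer): the caller supplies the derivative functions $f', f'', \dots$, and the routine finds the first one that is nonzero at $x_0$ and classifies the point by its parity and sign:

```python
def derivative_test(derivatives, x0):
    """Classify x0 from a list of derivative functions [f', f'', ...].

    General derivative test: find the first n with f^(n)(x0) != 0;
    odd n means no extremum, even n gives a minimum (positive value)
    or a maximum (negative value).
    """
    for n, fn in enumerate(derivatives, start=1):
        val = fn(x0)
        if val != 0:
            if n % 2 == 1:          # odd n: (x - x0)**n changes sign
                return "no extremum"
            return "local minimum" if val > 0 else "local maximum"
    return "inconclusive"           # all supplied derivatives vanish

# f(x) = x**4 at x0 = 0: f' = 4x^3, f'' = 12x^2, f''' = 24x, f'''' = 24
quartic = [lambda x: 4*x**3, lambda x: 12*x**2, lambda x: 24*x, lambda x: 24.0]
print(derivative_test(quartic, 0.0))  # 'local minimum' (n = 4 even, 24 > 0)

# f(x) = x**3 at x0 = 0: f' = 3x^2, f'' = 6x, f''' = 6
cubic = [lambda x: 3*x**2, lambda x: 6*x, lambda x: 6.0]
print(derivative_test(cubic, 0.0))    # 'no extremum' (n = 3 odd)
```

Passing the derivatives explicitly keeps the sketch self-contained; in practice you would compute them symbolically or by hand.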