For any $n\geq 2$ degree polynomial $f$, local extrema have a left/right neighborhood that is strictly increasing/decreasing or decreasing/increasing

calculus, polynomials, solution-verification

I want to prove that for every $2^{\text{nd}}$ degree or larger polynomial $f$, every local minimum/maximum point $a$ has the following property:

There is some $\mu_L \gt 0$ such that on the interval $[a-\mu_L,a]$, $f$ is strictly decreasing/increasing (for min/max, respectively)

There is some $\mu_R \gt 0$ such that on the interval $[a,a+\mu_R]$, $f$ is strictly increasing/decreasing (for min/max respectively).
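To make the statement concrete, here is a minimal illustration (my own example, not part of the claim): for $f(x) = x^2$ with local minimum $a = 0$, any choice of $\mu_L, \mu_R \gt 0$ works:

$$
f(x) = x^2,\qquad f \text{ strictly decreasing on } [-\mu_L, 0],\qquad f \text{ strictly increasing on } [0, \mu_R].
$$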

I could only prove this by contradiction. Any suggestions on easier methods would be greatly appreciated.


My argument will make use of the following claims which have straightforward proofs:

If $f$ is a $1^{\text{st}}$ degree or larger polynomial, then for any $x_1 \lt x_2$ in $\mathbb R$, there are $y_1 \neq y_2$ in $[x_1,x_2]$ such that $f(y_1) \neq f(y_2)$. $(\dagger)$

That is to say, a $1^{\text{st}}$ degree or larger polynomial has no constant segments.
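A possible one-line proof of $(\dagger)$ (my own sketch): if $f$ were constant, say equal to $c$, on $[x_1,x_2]$, then

$$
f\big|_{[x_1,x_2]} \equiv c \;\Longrightarrow\; f - c \text{ has infinitely many roots} \;\Longrightarrow\; f \equiv c \;\Longrightarrow\; \deg f = 0,
$$

contradicting $\deg f \geq 1$.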

If $f$ is a $1^{\text{st}}$ degree or larger polynomial, then $f$ has at most finitely many local extrema. $(\dagger \dagger)$
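A possible justification of $(\dagger\dagger)$ (again my own sketch): every local extremum of a polynomial is a critical point, and $f'$ is a nonzero polynomial of degree $\deg f - 1$, so

$$
\#\{\text{local extrema of } f\} \;\leq\; \#\{x \in \mathbb R : f'(x) = 0\} \;\leq\; \deg f - 1 \lt \infty.
$$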


Without loss of generality, I will assume that $a$ is a local minimum point to the $2^{\text{nd}}$ degree or larger polynomial $f$ and demonstrate that there must be an interval $[a, a +\mu_R]$ where $f$ is strictly increasing (the proof of the strictly decreasing left interval works similarly, and, by extension, so does the proof for when $a$ is a local maximum point).

By definition, $a$ being a local minimum point means there is a neighborhood $(a-\delta, a+ \delta)$, with $\delta \gt 0$, such that $f(x) \geq f(a)$ for every $x \in (a-\delta, a+ \delta)$. Call this property $\text{def}_1$.

Suppose, for contradiction, that no strictly increasing interval $[a,a+\mu_R]$ exists. This means that for any $x \gt a$, there are $y_1 \lt y_2$ in $[a,x]$ such that $f(y_1) \geq f(y_2)$. By $(\dagger\dagger)$, there are only finitely many other local extrema, so all such local extremum points can be listed, as can their distances from $a$. Choose the local extremum point $b$ closest to $a$ and let $\delta' = |a-b|$ (if $a$ is the only local extremum, simply take $\delta' = \delta$). Let $\delta^* = \min\left(\delta,\delta' \right)$, where $\delta$ refers to $\text{def}_1$.

Choose an arbitrary $x \in (a, a+\delta^*)$. By assumption, there are $y_1 \lt y_2$ in $[a,x]$ such that $f(y_1) \geq f(y_2)$. Because $y_1,y_2 \in [a,x] \subset [a,a+\delta^*)$, we have $f(y_1) \geq f(a)$ and $f(y_2) \geq f(a)$. First, suppose $f(y_1)=f(y_2)=f(a)$.

Although $y_1=a$ is a possibility, $y_2 \gt a$. Further, $y_2 \lt a+\delta'$, so no local extremum other than $a$ lies in $(a,y_2]$. Because $f$ cannot take on values less than $f(a)$ on the interval $[a,y_2]$, $f$ is either constant across this interval or transiently takes on values larger than $f(a)$ before returning to $f(y_2)$. However, by $(\dagger)$, $f$ has no constant segments. Therefore, $f$ must transiently increase after the point $a$ before eventually dropping back down to $f(y_2)=f(a)$ as it approaches $y_2$. But this implies a change in the sign of the slope, which means a local maximum occurs strictly between $a$ and $y_2$, a contradiction, since no other local extremum lies in $(a,y_2]$.
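The step "a change in the sign of the slope forces a local maximum" can be made precise with the extreme value theorem (my own filling of this gap): $f$ is continuous on $[a,y_2]$, satisfies $f \geq f(a)$ there, is not constant by $(\dagger)$, and has $f(a)=f(y_2)$, so its maximum over $[a,y_2]$ is attained at an interior point:

$$
f(c) = \max_{t \in [a,y_2]} f(t) \gt f(a) = f(y_2) \quad \text{for some } c \in (a,y_2),
$$

and such a $c$ is a local maximum of $f$ lying in $(a,y_2)$, which is exactly what the argument rules out.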

Now suppose instead that $f(y_2) \gt f(a)$. By assumption, $f(y_1) \geq f(y_2)$, which means $a \lt y_1 \lt y_2$. If $f(y_1)=f(y_2)$, then either $f$ decreases transiently after $y_1$ before increasing as it approaches $y_2$, or $f$ transiently increases after $y_1$ before decreasing as it approaches $y_2$ (or some combination of the two); remember, constant segments are ruled out. Either way, there is a change in the sign of the slope, which can only happen at a local extremum between $y_1$ and $y_2$, by the same reasoning as above. This is a contradiction, because $a \lt y_1 \lt y_2 \lt a+\delta'$.

If $f(y_1) \gt f(y_2)$, a similar contradiction arises: $f(a) \lt f(y_1)$ and $f(y_1) \gt f(y_2)$, so somewhere strictly between $a$ and $y_2$ the sign of the slope must change, producing a local maximum there. $\quad \square$


I feel like there must be an easier way to prove this, so please feel free to contribute.

Best Answer

One can argue as follows: If $a$ is a local minimum of $f$ (a polynomial of degree $\ge 2$) then $f'(a) = 0$. The non-constant polynomial $f'$ has only finitely many zeros, so there is an $r > 0$ such that $f'(x) \ne 0$ for all $x \in [a-r, a+r] \setminus \{ a \}$.

Since the derivative is also continuous and has no zeros on $(a, a+r]$, it keeps a constant sign there: either $f'(x) > 0$ on $(a, a+r]$ or $f'(x) < 0$ on $(a, a+r]$, which means that $f$ is strictly increasing or strictly decreasing on $[a, a+r]$. Because we assumed that $a$ is a local minimum, only the first case is possible.
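To spell out why a one-signed derivative gives strict monotonicity (a standard mean value theorem step, not stated explicitly in the answer): for any $a \leq x_1 \lt x_2 \leq a+r$,

$$
f(x_2) - f(x_1) = f'(\xi)\,(x_2 - x_1) \quad \text{for some } \xi \in (x_1,x_2) \subseteq (a, a+r],
$$

so $f(x_2) - f(x_1)$ has the same sign as $f'$ on $(a,a+r]$.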

The same reasoning shows that $f$ is strictly decreasing on $[a-r, a]$.

Remark: The same conclusion holds for any differentiable function $f$ if $f$ has a minimum at $a$ and $f'$ has an isolated zero at $a$ (i.e. there is an open interval $I$ containing $a$ such that $f'(x) \ne 0$ for all $x \in I \setminus \{ a \}$). The continuity of $f'$ is not needed because derivatives have the intermediate value property.
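To see why the isolated-zero hypothesis cannot simply be dropped, a standard counterexample (my addition, not part of the answer) is

$$
f(x) = \begin{cases} x^2\left(2 + \sin\tfrac{1}{x}\right), & x \neq 0,\\[2pt] 0, & x = 0, \end{cases}
$$

which is differentiable with a strict minimum at $0$, yet $f'$ changes sign in every interval $(0,\mu_R)$, so $f$ is not monotone on any $[0,\mu_R]$.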
