Application of Mean-Value Theorem

Tags: derivatives, real-analysis

I wish to prove that the following equation $$x^2 + 3x = \cos(x)$$

has precisely two roots. Since this is equivalent to the quadratic equation $$x^2 + 3x - \cos(x) = 0,$$

let $f(x) = x^2 + 3x - \cos(x)$. Because of the quadratic term, this equation has precisely two roots, which is equivalent to the statement: $f(x)$ intersects the horizontal axis at two points. Let $a, b$ denote those points, so that $f(a) = f(b) = 0$.

$f$ is continuous and differentiable everywhere, in particular continuous on $[a,b]$ and differentiable on $(a,b)$. According to the mean value theorem, there exists a point $c\in (a,b)$ such that $$f'(c)=\frac{f(b)-f(a)}{b-a}=\frac{0-0}{b-a}=0.$$

We can then apply Rolle's theorem to conclude the proof. Is it necessary in this case to restrict the interval $[a,b]$ to some appropriate sub-interval of $\Bbb{R}$? I was thinking of $[-\pi, \pi]$ initially, but then didn't see the point of it.

Best Answer

One thing I shall point out: $x^2 + 3x - \cos(x) = 0$ is NOT a quadratic equation. Thus, your statement:

Because of the quadratic term, this equation has precisely two roots

is incorrect.

To convince you better, consider the "quadratic equation": $x^2 - 2^x = 0.$ This happens to have $3$ roots. ($2$ and $4$ are clear roots. I leave it to you to show the existence of a negative root.)
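If it helps, here is a quick numerical sketch (not part of any proof) that locates the negative root by bisection; the names `g` and `bisect` are just illustrative.

```python
# Numerical sanity check: x**2 - 2**x = 0 has the obvious roots 2 and 4,
# plus a root in (-1, 0), since g(-1) > 0 while g(0) < 0.

def g(x):
    return x**2 - 2**x

def bisect(h, lo, hi, tol=1e-12):
    """Plain bisection; assumes h(lo) and h(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(lo) * h(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(g(2), g(4))        # 0 0  -- the two obvious roots
print(bisect(g, -1, 0))  # roughly -0.7666... -- the negative root
```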

Moreover, it isn't even necessary that an actual quadratic equation has two (distinct real) roots. For example, consider $x^2 = 0$ (a repeated root) and $x^2 + 1 = 0$ (no real roots at all).


The actual solution:
We do this in two parts:

Part 1. Showing that there exist at least 2 roots.
Let $f(x) = x^2 + 3x - \cos(x)$. Note that $f(100) > 0$ and $f(-100) > 0$. Also note that $f(0) = -1 < 0$. Thus, by the intermediate value theorem, $f$ has at least two roots, one in $(-100, 0)$ and one in $(0, 100)$.
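As a sanity check (not a replacement for the IVT argument), the sign pattern can be confirmed numerically; the values in the comments are approximate.

```python
import math

# Sign pattern used in Part 1: f is positive at -100, negative at 0,
# and positive at 100, so the IVT gives a root in (-100, 0) and one in (0, 100).

def f(x):
    return x**2 + 3*x - math.cos(x)

for x in (-100, 0, 100):
    print(x, f(x))
# -100  ~  9699.14  (> 0)
#    0     -1.0     (< 0)
#  100  ~ 10299.14  (> 0)
```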

Part 2. Showing that there are at most 2 roots.
Suppose that there are three distinct real roots: $a, b, c$. We shall arrive at a contradiction.

WLOG, we may assume that $a < b < c$.
Now, we have $f(a) = f(b)$ and thus, by Rolle's theorem, we have that $f'(\alpha) = 0$ for some $\alpha \in (a, b)$.
We similarly have $f'(\beta) = 0$ for some $\beta \in (b, c)$.

Now, applying Rolle's theorem to $f'$ on $[\alpha, \beta]$, we see that there exists $\gamma \in (\alpha, \beta)$ such that $f''(\gamma) = 0.$ However, we note that:

$$f''(\gamma) = 2 + \cos(\gamma) \ge 2 + (-1) = 1 > 0,$$

a contradiction.
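For what it's worth, the second-derivative computation can be double-checked symbolically; this is only an illustration using SymPy (assuming it is installed), not part of the argument.

```python
import sympy as sp

# f''(x) for f(x) = x**2 + 3*x - cos(x) is 2 + cos(x), which is at least 1
# everywhere, so it can never vanish -- the contradiction used in Part 2.

x = sp.symbols('x', real=True)
f = x**2 + 3*x - sp.cos(x)

fpp = sp.diff(f, x, 2)
print(fpp)                 # cos(x) + 2
print(sp.minimum(fpp, x))  # 1, attained where cos(x) = -1
```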


I leave the justification of the use of these theorems (namely, the intermediate value theorem and Rolle's theorem) to you.
