[Math] Prove an inequality (Using Taylor expansion)

calculus, derivatives, polynomials, real-analysis, taylor expansion

Prove: $\frac{x}{1+x} < \ln(1+x) < x$ for $x > 0$.

I thought a good practice would be to prove it using Taylor Expansion.

Here's my try:
$$\ln(1+x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \cdots$$

The $n=1$ Taylor polynomial is
$$T_1(x) = x$$
and
$$\ln(1+x) = T_1(x) + R_1(x).$$

Let's evaluate $R_1(x)$ using the Lagrange form of the remainder: for some $\xi$ strictly between $0$ and $x$,

$$R_1(x) = \frac{f^{(2)}(\xi)}{2!}\cdot x^2 = \frac{\frac{-1}{(\xi+1)^2}}{2!}\cdot x^2 = \frac{-x^2}{2(\xi+1)^2} < 0.$$

Now, this proves the right-hand side, because $\ln(1+x) = x + R_1(x) < x$ (since $R_1(x)$ is negative).
I'm not sure what to do for the left-hand side, and I'd also appreciate general critique of my work so far.

Thanks!
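One possible way to get the left-hand side with the same Taylor machinery (a sketch, assuming $x > 0$, and expanding $f(t) = \ln t$ about $t = 1+x$ instead of about the origin, then evaluating at $t = 1$): by Taylor's theorem with Lagrange remainder there is an $\eta$ strictly between $1$ and $1+x$ such that

$$0 = \ln 1 = \ln(1+x) + \frac{1}{1+x}\bigl(1-(1+x)\bigr) + \frac{-\frac{1}{\eta^2}}{2!}\bigl(1-(1+x)\bigr)^2 = \ln(1+x) - \frac{x}{1+x} - \frac{x^2}{2\eta^2},$$

so

$$\ln(1+x) = \frac{x}{1+x} + \frac{x^2}{2\eta^2} > \frac{x}{1+x}.$$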

Best Answer

We apply the mean value theorem to the function $t\mapsto \ln t$ on the interval $[1,1+x]$: there is $\zeta\in(1,1+x)$ such that $$\ln(1+x)=\frac x\zeta,$$ and notice that $$\frac1{1+x}<\frac1\zeta<1.$$
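Spelling out the final step (assuming $x>0$, so multiplying by $x$ preserves the direction of the inequalities):

$$x\cdot\frac1{1+x} \;<\; x\cdot\frac1\zeta \;<\; x\cdot 1 \quad\Longrightarrow\quad \frac{x}{1+x} \;<\; \ln(1+x) \;<\; x.$$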
