[Math] Use Taylor's Theorem with $n=2$ to prove that the inequality $1+x<e^x$ holds for all $x\neq 0$

calculus, numerical methods, taylor expansion

Use Taylor's Theorem with $n=2$ to prove that the inequality $1+x<e^x$ is valid for all $x\in \mathbb{R}$ except $x=0$.

Taylor's Theorem:
$$
f(x)=\sum_{k=0}^n{1\over k!}f^{(k)}(c)(x-c)^k+E_n(x),
\quad\text{where}\quad E_n(x)={1\over (n+1)!}f^{(n+1)}(\xi)(x-c)^{n+1}
$$
for some $\xi$ between $c$ and $x$.

I'm not sure how to do this; any solutions or hints are greatly appreciated. I'm not sure whether I'm even supposed to use the error term $E_n(x)$: with $n=2$, $f(x)=e^x$, and $c=0$ we obtain something like $e^x=1+x+{1\over 2}x^2$, and since ${1\over 2}x^2>0$ for all reals except $x=0$ we would see that $1+x<e^x$. I'm just confused about how to approach this problem.
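If I do keep the error term, the theorem above with $n=2$, $f(x)=e^x$, and $c=0$ (every derivative of $e^x$ is $e^x$) would give the exact identity
$$
e^x=1+x+{1\over 2}x^2+{1\over 6}e^{\xi}x^3
\quad\text{for some }\xi\text{ between }0\text{ and }x,
$$
so the truncation $1+x+{1\over 2}x^2$ is only an approximation, and I'm not sure that's the intended route.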

Best Answer

It suffices to take $n=1$ and $c=0$. In this way you get $e^x=1+x+E_1(x)=1+x+\frac{1}{2}e^{\xi}x^2$ for some $\xi$ between $0$ and $x$. Note that $\frac{1}{2}e^{\xi}x^2>0$ for every $\xi$ provided $x\neq 0$, which gives $1+x<e^x$.
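Spelled out, the argument is the following chain (a restatement of the line above, using the Lagrange form of the remainder as stated in the question):
$$
e^x \;=\; 1+x+\underbrace{\tfrac{1}{2}e^{\xi}x^2}_{>\,0\ \text{for }x\neq 0},
\qquad \xi \text{ between } 0 \text{ and } x,
$$
since $e^{\xi}>0$ for every real $\xi$ and $x^2>0$ whenever $x\neq 0$. Hence $e^x>1+x$ for all $x\neq 0$, with equality only at $x=0$.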