[Math] Estimating accuracy of Taylor series approximations with 2 bounds

approximation, calculus, inequality, power-series, sequences-and-series

I have a question from a previous exam:

Use Taylor's Inequality to estimate the accuracy of the approximation $f(x) \approx T_{3}(x)$ when $0.8 \leq x \leq 1.2$.

I computed from an earlier step in the problem that $T_{3}(x) = 1+\frac{2(x-1)}{3}-\frac{1}{9}(x-1)^2+\frac{4}{81}(x-1)^3$.

So how do I use Taylor's Inequality on this? I've only ever seen Taylor's Inequality applied to a single remainder term.

EDIT: The original function is $f(x)=x^{\frac{2}{3}}$ centered at $a=1$.
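For reference, the coefficients of $T_3$ can be checked directly: since powers of $1$ are all $1$, the $n$-th derivative of $x^{2/3}$ at $x=1$ is just the falling product $\frac{2}{3}\left(\frac{2}{3}-1\right)\cdots\left(\frac{2}{3}-n+1\right)$. A minimal sketch in exact rational arithmetic (no external libraries assumed):

```python
from fractions import Fraction

# f(x) = x**(2/3): f^(n)(1) = (2/3)(2/3 - 1)...(2/3 - n + 1),
# since every power of 1 equals 1.
p = Fraction(2, 3)
deriv = Fraction(1)   # f(1) = 1
fact = 1              # n!
coeffs = []
for n in range(4):
    coeffs.append(deriv / fact)   # Taylor coefficient f^(n)(1)/n!
    deriv *= (p - n)              # next derivative's constant factor
    fact *= n + 1

print(coeffs)  # [1, 2/3, -1/9, 4/81]
```

This reproduces the coefficients $1,\ \frac{2}{3},\ -\frac{1}{9},\ \frac{4}{81}$ above.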

Best Answer

Since $0.8=1-0.2$ and $1.2 = 1+0.2$ are symmetric around $x=1$, this is the usual situation. Taylor's Inequality gives the error estimate

$$|R_3(x)| \le \frac{M |x-1|^4}{4!}$$

where $M$ is the maximum of $|f^{(4)}(x)|$ on the interval $[0.8,1.2]$. Here $f^{(4)}(x) = -\frac{56}{81}x^{-10/3}$, and $|f^{(4)}(x)|$ is decreasing on the interval, so the maximum occurs at the left endpoint: $M=|f^{(4)}(0.8)|\approx 1.455$. Therefore the error satisfies $$|R_3(x)| \le \frac{1.455 \cdot 0.2^4}{4!}\approx 9.7\cdot10^{-5}.$$ Plotting $|f(x)-T_3(x)|$ I get an actual maximum error of $\approx 5.3\cdot10^{-5}$, which is indeed below the bound.
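The bound and the actual error can both be checked numerically; here is a minimal sketch (the grid search over $[0.8, 1.2]$ stands in for the plot mentioned above):

```python
import math

def f(x):
    return x ** (2 / 3)

def T3(x):
    # Third-degree Taylor polynomial of x**(2/3) about a = 1
    return 1 + 2*(x - 1)/3 - (x - 1)**2/9 + 4*(x - 1)**3/81

def f4(x):
    # Fourth derivative of x**(2/3): -(56/81) * x**(-10/3)
    return -56/81 * x ** (-10/3)

# |f4| is decreasing on [0.8, 1.2], so its maximum is at the left endpoint.
M = abs(f4(0.8))
bound = M * 0.2**4 / math.factorial(4)

# Actual maximum error on a fine grid over [0.8, 1.2]
xs = [0.8 + 0.4 * i / 10000 for i in range(10001)]
actual = max(abs(f(x) - T3(x)) for x in xs)

print(M)       # ≈ 1.455
print(bound)   # ≈ 9.7e-5
print(actual)  # ≈ 5.3e-5
```

The actual error comes in at roughly half the Taylor bound, as expected for an endpoint estimate of $M$.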