[Math] Trying to use Taylor’s inequality to estimate the accuracy of the approximation on the given interval.

calculus

So I started with these things: $$f(x)=\sqrt{x} \;\;\;\; a=4 \;\;\;\; n=2 \;\;\;\; 4 \le x\le 4.3$$

Then I approximated $f$ by the Taylor polynomial of degree $n$ centered at $a$: $$T_2(x)=2+{1\over4}(x-4)-{1\over64}(x-4)^2$$

Now I'm trying to use Taylor's inequality to estimate the accuracy of the approximation $f(x)\approx T_n(x)$ when $x$ is in the interval $4 \le x\le 4.3$.

The first step would be to get the third derivative, so: $$f^{(3)}(x)={3\over8}x^{-5/2}$$Correct?
Next I need to determine whether it is increasing or decreasing on the interval, so I plotted its graph.

It is obviously decreasing on that interval, so I will calculate $f^{(3)}(4.3)={3\over8}(4.3)^{-5/2}\approx 0.00978$. Is that the correct way to go about this? Am I doing it incorrectly? Thank you.
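As a quick numerical sanity check (a sketch in plain Python, using the formula for $f^{(3)}$ above), evaluating the third derivative at the two endpoints confirms it is decreasing on $[4, 4.3]$:

```python
# Third derivative of f(x) = sqrt(x):  f'''(x) = (3/8) * x^(-5/2)
def f3(x):
    return (3 / 8) * x ** (-5 / 2)

print(f3(4.0))  # 0.01171875 -- larger value, at the left endpoint
print(f3(4.3))  # ~0.00978   -- smaller, so f''' is indeed decreasing here
```

Note that Taylor's inequality asks for the maximum $M$ of $|f^{(3)}|$ on the interval; since $f^{(3)}$ is positive and decreasing here, that maximum occurs at the left endpoint $x=4$, giving $M = 3/256 \approx 0.0117$.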

Best Answer

You need to consider the remainder of the expansion.

If you had built $T_3$, the result would have been $$T_3=2+\frac{x-4}{4}-\frac{1}{64} (x-4)^2+\frac{1}{512} (x-4)^3+O\left((x-4)^4\right)$$ and for $T_2$ the remainder is bounded by $$|R_2|\le\frac{1}{512} (x-4)^3$$ (the coefficient $\frac{1}{512}$ is just $f^{(3)}(4)/3!$). So, for $x=4.3$, $|R_2|\approx 0.00005$.
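This bound is easy to check numerically (a minimal sketch; `T2` is the polynomial from the question):

```python
import math

def T2(x):
    # Degree-2 Taylor polynomial of sqrt(x) centered at a = 4
    return 2 + (x - 4) / 4 - (x - 4) ** 2 / 64

x = 4.3
actual_error = abs(math.sqrt(x) - T2(x))
bound = (x - 4) ** 3 / 512  # remainder bound; 1/512 = f'''(4) / 3!

print(actual_error)  # ~5.04e-05
print(bound)         # ~5.27e-05
```

The true error comes in just under the bound, as expected.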