[Math] Estimate error using Taylor Series

sequences-and-series

I was told to use the Taylor series to estimate the error here, but I am not confident doing it.

Estimate the error in approximating $\cos x$ by $1-(x^2/2)$ on the interval $[-0.5, 0.5]$.

Best Answer

The Lagrange error bound states that the gap between $f(x)$ and its Taylor expansion at $x_0$ of order $k$ is
$$ R_k(x) = \frac{f^{(k+1)}\big(\xi(x)\big)}{(k+1)!} (x-x_0)^{k+1} $$
where $\xi(x)$ is a point lying in the interval $[x_0,x]$ or $[x,x_0]$ (depending on whether $x_0\leq x$ or $x_0\geq x$).

In your case,
$$ 1-\frac{x^2}2 $$
is the 3rd-order approximation of $\cos x$ at $x_0=0$ (it agrees with the 3rd-order Taylor polynomial, since $\cos'''(0)=\sin(0)=0$), so
$$ R_3(x) ~=~ \cos x - \left(1 - \frac{x^2}2\right) ~=~ \frac{\cos{\big(\xi(x)\big)}}{24}x^4 $$

If $x$ ranges over $[-0.5,0.5]$, then so does $\xi(x)$. Therefore
$$ \max_{[-0.5,0.5]}{\big|R_3(x)\big|} ~\leq~ \frac{\max_{[-0.5,0.5]}{\big|\cos(x)\big|}}{24} \cdot \max_{[-0.5,0.5]}{\big|x^4\big|} ~=~ \frac{1}{24}\cdot\frac{1}{2^4} ~=~ \frac{1}{384} ~\approx~ 0.0026\ldots $$
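As a quick sanity check (not part of the derivation above), one can sample the actual error $\big|\cos x - (1 - x^2/2)\big|$ on a fine grid over $[-0.5, 0.5]$ and confirm it stays below the bound $1/384$; the grid size here is an arbitrary choice:

```python
import math

# Lagrange error bound derived above: 1/384 ≈ 0.0026
bound = 1 / 384

# Worst-case |cos(x) - (1 - x^2/2)| on a uniform grid over [-0.5, 0.5]
# (10001 points, including both endpoints, where the error is largest).
max_err = max(
    abs(math.cos(x) - (1 - x**2 / 2))
    for x in (i / 10000 - 0.5 for i in range(10001))
)

print(max_err <= bound)  # → True: the observed error respects the bound
```

The observed maximum (about $0.00258$, attained at $x=\pm 0.5$) sits just under the bound, which shows the Lagrange estimate is quite tight on this interval.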
