Calculus – Confusion About Taylor Approximation Away from Center Point

Tags: calculus, polynomials, sequences-and-series, taylor-expansion

I'm trying to learn Taylor expansions and was watching a tutorial here. In the tutorial, Taylor approximation is introduced by first showing the Maclaurin series, which is basically the Taylor series at $x=0$. The introduction seems intuitive to me:

First, suppose we want to approximate a function $f(x)$ with polynomials at $x=0$, given that the derivatives of $f(x)$ of all orders exist at $x=0$. Then we can start with a very simple approximation at $x=0$:
\begin{equation}
f(x) \approx f(0)
\end{equation}
and then add higher- and higher-order terms,
\begin{equation}
f(x) \approx f(0) + f'(0)x + f''(0)\frac{x^2}{2} + …
\end{equation}
This makes sense, since the RHS matches the derivatives of the LHS at $x=0$ up to the order of the last term included. Here is a picture of the Taylor approximation of $\sin(x)$ at $x=0$ up to order 3: image (or use Wolfram).
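
As a sanity check (my own little script, not from the tutorial), the degree-3 polynomial can be evaluated directly and compared with $\sin(x)$; near $x=0$ it is very close, and it drifts away as $x$ grows:

```python
import math

# Degree-3 Maclaurin polynomial of sin(x), built from the derivative values
# at 0: f(0)=0, f'(0)=1, f''(0)=0, f'''(0)=-1, so T3(x) = x - x**3/6.
def taylor_sin_deg3(x):
    return x - x**3 / 6

for x in [0.1, 0.5, 1.0, 2.0, 3.0]:
    approx = taylor_sin_deg3(x)
    exact = math.sin(x)
    print(f"x={x:4.1f}  T3={approx: .5f}  sin={exact: .5f}  |error|={abs(approx - exact):.5f}")
```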

My question is: we are approximating $f(x)=\sin(x)$ locally at/around $x=0$, but (as you can see from the plot) why, as more and more higher-order terms are added, does the approximation become more and more like $f(x)$ even far away from $x=0$? Intuitively, why is this the case? What I understand is that the approximation is derived only locally around $x=0$. Does this also imply that, with enough higher-order terms, the Taylor expansions at $x=0$ and at $x=a$ where $a\ne0$ are just the same? Furthermore, with enough terms, does the Taylor series approximate $f(x)$ everywhere, and not just near $x=0$ anymore?
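
To show numerically what I mean, here is a small Python sketch (my own check, not from the tutorial): it sums the Maclaurin series of $\sin x$ up to degree $n$ and evaluates the error at $x=6$, well away from the center, for increasing $n$:

```python
import math

def maclaurin_sin(x, n):
    """Partial sum of the Maclaurin series of sin(x) up to degree n."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range((n + 1) // 2))

x = 6.0  # a point well away from the expansion point 0
for n in [3, 7, 11, 15, 19]:
    err = abs(maclaurin_sin(x, n) - math.sin(x))
    print(f"degree {n:2d}: |T_n(6) - sin(6)| = {err:.6f}")
```

The error at $x=6$ keeps shrinking as the degree grows, which is exactly the behavior I'd like to understand.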

Best Answer

The Taylor polynomial does not always tell you the value of $f(x)$ everywhere.

There is a radius of convergence. For example, the Taylor series of $\ln x$ centered at $a > 0$ only converges if $|x-a| < a$, which is kind of obvious: as $x$ gets near zero something has to break.
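To make that concrete, here is the standard expansion, sketched rather than derived in full: writing $x = a\left(1 + \frac{x-a}{a}\right)$ and using the series for $\ln(1+u)$ gives
\begin{equation}
\ln x = \ln a + \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n\,a^n}(x-a)^n,
\end{equation}
and the ratio test shows this converges for $|x-a| < a$ and diverges for $|x-a| > a$: the radius of convergence is the distance from the center $a$ to the bad point $x=0$.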

You could say that if the function is "smooth", i.e. infinitely differentiable, and it actually equals its Taylor series (such functions are called analytic), then all of its derivatives at a single point tell you everything you need to know about the function everywhere inside the radius of convergence.

Anyway, how do you know that the Taylor polynomial of a function will be near that function for all $x$ in the domain?

Taylor showed that for an approximation of any degree, he could estimate the error.

Suppose $T_n(x)$ is the $n^{th}$ degree Taylor polynomial of some function $f(x)$

e.g.

$T_3(x) = x - \frac 16 x^3$ is the third degree Taylor approximation of $\sin x$ centered at 0.

This approximation has some error, and that error can be bounded by a single term of one degree higher:

$|T_k(x) - f(x)| \le M_{k+1}\, |x-a|^{k+1}$

where $M_{k+1} = \frac {1}{(k+1)!} \left|f^{(k+1)}(\xi)\right|$

and $\xi$ is the point between $a$ and $x$ that maximizes the absolute value of the $(k+1)^{\text{th}}$ derivative.
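
For $\sin x$ every derivative is $\pm\sin$ or $\pm\cos$, so $|f^{(k+1)}(\xi)| \le 1$ and the bound reduces to $\frac{|x-a|^{k+1}}{(k+1)!}$. Here is a small Python check of that bound against the actual error, centered at $a=0$ (my own illustration, not part of the original argument):

```python
import math

def maclaurin_sin(x, k):
    """Degree-k Maclaurin partial sum of sin(x)."""
    return sum((-1) ** j * x ** (2 * j + 1) / math.factorial(2 * j + 1)
               for j in range((k + 1) // 2))

x, a = 2.0, 0.0
for k in [1, 3, 5, 7, 9]:
    actual = abs(maclaurin_sin(x, k) - math.sin(x))
    # |f^(k+1)| <= 1 for sin, so M_{k+1} <= 1/(k+1)! and the bound is |x-a|^(k+1)/(k+1)!
    bound = abs(x - a) ** (k + 1) / math.factorial(k + 1)
    print(f"k={k}: actual error {actual:.2e}  <=  bound {bound:.2e}")
```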

If $x$ is inside the radius of convergence, i.e. $|x-a| < r$, then $\lim\limits_{k\to \infty} M_{k+1}\,|x-a|^{k+1} = 0$, so the error goes to $0$ and the Taylor polynomials converge to $f(x)$ there.
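
For $\sin x$ centered at $0$ that radius is infinite: since every derivative is bounded by $1$,
\begin{equation}
|T_k(x) - \sin x| \le \frac{|x|^{k+1}}{(k+1)!} \to 0 \quad \text{as } k \to \infty, \text{ for every fixed } x,
\end{equation}
because the factorial eventually outgrows any fixed power. That is why the plots in the question hug $\sin x$ over a wider and wider range as the degree grows, even though the polynomial only uses information at $x=0$.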