Misunderstanding about Taylor series

taylor expansion

Suppose I have some nice infinitely differentiable function $f$. Let's denote by $T_{n,a}$ the Taylor polynomial of $f$ at $a$ of order $n$:

$$T_{n,a}(x) = a_0+a_1(x-a)+a_2(x-a)^2+\dots+a_n(x-a)^n$$
where $$a_k = \frac{f^{(k)}(a)}{k!}$$
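For concreteness, $T_{n,a}$ can be evaluated numerically once the derivative values at $a$ are known. Here is a small sketch of my own (using $\sin$ at $a=0$ purely as an example, since its derivatives cycle through $0, 1, 0, -1$):

```python
import math

def taylor_poly(derivs, a, x):
    # T_{n,a}(x) = sum_k f^(k)(a)/k! * (x-a)^k, given the list of
    # derivative values [f(a), f'(a), ..., f^(n)(a)].
    return sum(d / math.factorial(k) * (x - a) ** k
               for k, d in enumerate(derivs))

# Example: f = sin, a = 0; derivatives cycle sin, cos, -sin, -cos -> 0, 1, 0, -1
derivs = [[0.0, 1.0, 0.0, -1.0][k % 4] for k in range(12)]
print(taylor_poly(derivs, 0.0, 0.3))  # close to math.sin(0.3)
```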

Am I right that the Taylor polynomial does a good job of approximating the function $f$ only in a neighborhood of $a$?

According to the Weierstrass approximation theorem, any continuous function on a closed interval $[a,b]$ can be approximated as closely as desired (uniformly) by polynomial functions.

Suppose then that $f(x) = b_0+b_1x+b_2x^2+\dots$, where the right-hand side is an infinite sum suggested by the Weierstrass theorem.

By the theorem, the right-hand side approximates $f$ very well on the whole interval $[a,b]$.

If I wanted to find the coefficients $b_k$, I would proceed just as when finding Taylor coefficients, and I would find that $b_k = \frac{f^{(k)}(0)}{k!}$.

But the polynomial with coefficients $\frac{f^{(k)}(0)}{k!}$ is the Taylor polynomial of $f$ at $0$.

So I don't understand: does the Taylor polynomial do a good job of approximating $f$ only in a neighborhood of $0$, or on the whole interval $[a,b]$?
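As a concrete illustration of my worry, here is a quick numerical sketch of my own: the Taylor polynomial of $1/(1+x^2)$ at $0$ works beautifully near $0$ but fails completely at $x=2$.

```python
def f(x):
    return 1.0 / (1.0 + x * x)

def T(x, n):
    # Taylor polynomial of 1/(1+x^2) at a=0, keeping terms up to x^(2n):
    # sum_{k=0}^{n} (-1)^k x^(2k)  (a geometric series in -x^2)
    return sum((-1) ** k * x ** (2 * k) for k in range(n + 1))

# Near a=0 the approximation is excellent...
err_near = abs(f(0.5) - T(0.5, 20))
# ...but at x=2 (outside the radius of convergence, which is 1) it blows up.
err_far = abs(f(2.0) - T(2.0, 20))
print(err_near, err_far)
```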

Best Answer

The approximation given by the Weierstrass theorem is not based on Taylor polynomials. The theorem produces a *sequence* of polynomials $p_n$ converging uniformly to $f$; the coefficients of $p_n$ can change completely from one $n$ to the next, so in general there is no single infinite series $b_0+b_1x+b_2x^2+\dots$ to extract from it. Moreover, even for an infinitely differentiable function, its Taylor polynomial at one point can completely "forget" about the function. The best-known example is due to Cauchy: take $f(x)=e^{-1/x^2}$ for $x\neq 0$ and $f(0)=0$. This function is infinitely differentiable at zero, yet its Taylor series at zero is identically zero, and thus differs from the function at every point except $x=0$.
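A quick numerical look at Cauchy's example (a sketch; since every Taylor polynomial of this $f$ at $0$ is the zero polynomial, the approximation error at any $x \neq 0$ is just $f(x)$ itself):

```python
import math

def f(x):
    # Cauchy's example: smooth everywhere, all derivatives vanish at 0.
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# Every Taylor polynomial of f at 0 is identically zero, so the error of
# ANY order-n Taylor polynomial at x is just f(x) -- nonzero for x != 0.
for x in (0.5, 1.0, 2.0):
    print(x, f(x))
```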

What you need is analyticity. If you want to stick with Taylor polynomials, you have to make sure that the remainder in Taylor's formula converges to zero at every point of your interval as $n\to\infty$.
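For instance, $e^x$ is entire, so its remainder does go to zero on any interval, and the Taylor polynomials at $0$ eventually approximate it well on all of $[a,b]$. A small sketch (my own example, not from the post):

```python
import math

def taylor_exp(x, n):
    # Order-n Taylor polynomial of e^x at 0: sum_{k=0}^{n} x^k / k!
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

# For an entire function like e^x the remainder -> 0 on ANY interval,
# so the uniform error on [-3, 3] shrinks as n grows.
for n in (5, 10, 20):
    err = max(abs(math.exp(x / 10) - taylor_exp(x / 10, n))
              for x in range(-30, 31))
    print(n, err)
```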

You can also use Bernstein polynomials (the ones used in one of the standard proofs of Weierstrass' theorem) to approximate your function.
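A sketch of the Bernstein construction on $[0,1]$ (my own example function, $|x-\tfrac12|$, which is continuous but not differentiable at the kink, so no Taylor expansion there is possible):

```python
from math import comb

def bernstein(f, n, x):
    # n-th Bernstein polynomial of f on [0,1]:
    # B_n(f)(x) = sum_{k=0}^{n} f(k/n) * C(n,k) * x^k * (1-x)^(n-k)
    return sum(f(k / n) * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

g = lambda x: abs(x - 0.5)  # continuous, not differentiable at 0.5
# The uniform error shrinks as n grows (slowly near the kink).
for n in (10, 100, 400):
    err = max(abs(g(x / 100) - bernstein(g, n, x / 100)) for x in range(101))
    print(n, err)
```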
