Real Analysis – Best Polynomial Approximation of a Function Around a Point

Tags: approximation · calculus · real-analysis · taylor-expansion

I think I understand how to use Taylor polynomials to approximate sine. For instance, if
$$
\sin x \approx ax^2+bx+c
$$

and we want the approximation to be particularly accurate when $x$ is close to $0$, then we could adopt the following approach. When $x=0$, $\sin x = 0$, and so $ax^2+bx+c=0$, meaning that $c=0$. Therefore we get
$$
\sin x \approx ax^2+bx
$$

If we want the first derivatives at $x=0$ to match, then $\frac{d}{dx}(ax^2+bx)\big|_{x=0} = b$ should equal $\cos 0 = 1$. Therefore, $b=1$:
$$
\sin x \approx ax^2+x
$$

Finally, if we want the second derivatives at $x=0$ to match, then $\frac{d^2}{dx^2}(ax^2+x) = 2a$ should equal $-\sin 0 = 0$, and so $a=0$. The small-angle approximation for sine is
$$
\sin x \approx x
$$
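As a quick numerical sanity check (a minimal sketch using only the standard library), one can watch the error of $\sin x \approx x$ collapse as $x \to 0$; the alternating-series bound $|\sin x - x| \le |x|^3/6$ makes the rate precise:

```python
import math

def small_angle_error(x):
    """Absolute error of the small-angle approximation sin(x) ~ x."""
    return abs(math.sin(x) - x)

# The error shrinks like |x|^3 / 6, far faster than x itself shrinks:
for x in (0.5, 0.1, 0.01):
    print(f"x = {x}: error = {small_angle_error(x):.2e}")
```

Halving $x$ cuts the error by roughly a factor of eight, which is exactly the cubic behavior the discarded $-x^3/6$ term predicts.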

All of this makes sense to me. What I don't understand is when people try to put this on rigorous footing. I have often heard people say 'this shows that $x$ is the best quadratic approximation of $\sin x$ when $x$ is near $0$'. But what is meant by 'best', and 'near'? If the approximation suddenly became terrible at $x=0.5$, would that count as close enough to $0$ to be a problem? It seems that there are formal definitions for these terms, but I don't know what they are.

Best Answer

Given a function $f$, polynomials $p_1$ and $p_2$, and some point $x_0$, we can define "better" as meaning that there is a neighborhood of $x_0$ in which $p_1$ is the better approximation. That is, if there exists $\epsilon > 0$ such that $(0 < |x-x_0|<\epsilon) \rightarrow (|p_1(x)-f(x)|<|p_2(x)-f(x)|)$, then near $x_0$, $p_1$ is a better approximation to $f$ than $p_2$ is. (We exclude $x = x_0$ itself, since both polynomials may agree with $f$ exactly there, making the strict inequality impossible.)
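This definition can be probed numerically. A small sketch (with the assumed choices $f=\sin$, $x_0=0$, $p_1(x)=x$, and a hypothetical competitor $p_2(x)=0.9x$) checks the inequality at sample points in a punctured neighborhood:

```python
import math

f = math.sin
x0 = 0.0
p1 = lambda x: x         # candidate: the Taylor polynomial
p2 = lambda x: 0.9 * x   # hypothetical competing polynomial

eps = 0.1
# Sample the punctured neighborhood 0 < |x - x0| < eps; we skip x0 itself,
# where both polynomials match f exactly and neither error is smaller.
xs = [x0 + eps * k / 100 for k in range(1, 101)]
xs += [x0 - eps * k / 100 for k in range(1, 101)]
better = all(abs(p1(x) - f(x)) < abs(p2(x) - f(x)) for x in xs)
print(better)
```

Here $|p_1(x)-\sin x|$ behaves like $|x|^3/6$ while $|p_2(x)-\sin x|$ behaves like $0.1|x|$, so $p_1$ wins throughout this neighborhood.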

So, with this definition, Taylor polynomials can be said to be better than any other polynomial of the same degree. That is, if $f$ is analytic at $x_0$ and $T_n$ is the $n$th-order Taylor polynomial of $f$ at $x_0$, then for every polynomial $g \neq T_n$ of degree at most $n$, there exists a punctured neighborhood of $x_0$ on which $T_n$ is a better approximation to $f$ than $g$.
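To illustrate (a sketch with assumed example polynomials, not a proof): take $f=\sin$, $x_0=0$, $T_2(x)=x$, and a competing quadratic $g(x)=x+0.01x^2$. Since $|T_2(x)-\sin x|$ shrinks like $|x|^3/6$ while $|g(x)-\sin x|$ shrinks only like $0.01x^2$, the cubic eventually loses to the quadratic and $T_2$ wins once $|x|$ is small enough:

```python
import math

f = math.sin
T2 = lambda x: x                  # 2nd-order Taylor polynomial of sin at 0
g = lambda x: x + 0.01 * x ** 2   # competing 2nd-order polynomial

# T2's error is ~|x|^3/6 while g's is ~0.01*x^2, so T2 wins for small |x|.
# eps = 0.02 lies comfortably inside the winning neighborhood:
eps = 0.02
xs = [eps * k / 100 for k in range(1, 101)]
xs += [-x for x in xs]
taylor_wins = all(abs(T2(x) - f(x)) < abs(g(x) - f(x)) for x in xs)
print(taylor_wins)
```

Note that the neighborhood matters: at larger $|x|$ (say $x = -0.06$, where $0.01x^2$ and $x^3/6$ nearly cancel in $g$'s error) the competitor can temporarily do better, which is exactly why the definition only demands superiority for $x$ sufficiently close to $x_0$.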