How does the Taylor Series converge at all points for certain functions

polynomials, taylor-expansion

The way my professor defined Taylor polynomials is: the $n^{th}$ degree Taylor polynomial $p(x)$ of $f(x)$ is a polynomial that satisfies $\lim_{x\to 0}{f(x)-p(x) \over x^n} = 0$. This is exactly the little-o condition $f(x)-p(x) = o(x^n)$, which means $f(x)-p(x) \ll x^n$ as $x$ approaches $0$. From this I got the intuition that Taylor polynomials work only for $|x| < 1$, because $x^n$ gets smaller as $n$ gets bigger only when $|x| < 1$. And the textbook seemed to agree with my intuition, since it says "Taylor polynomial near the origin" (probably implying $|x| < 1$).

Since the Taylor series is basically the Taylor polynomial with $n\to\infty$, I intuitively expected that the Taylor series would also converge to the function it represents only on the interval $(-1, 1)$.

For example, in the case of $\frac1{1-x}$, it is well known that the Taylor series converges only for $|x| < 1$.
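As a quick numerical sketch of this (my own check, not from the textbook): the partial sums of the geometric series $\sum x^k$ for $\frac1{1-x}$ settle down inside $(-1,1)$ and blow up outside it.

```python
def geometric_partial_sum(x, n):
    """Sum of x^k for k = 0..n-1, the degree-(n-1) Taylor polynomial of 1/(1-x) at 0."""
    return sum(x**k for k in range(n))

# Inside the interval of convergence: partial sums approach 1/(1 - 0.5) = 2.
print(geometric_partial_sum(0.5, 50))

# Outside it: partial sums grow without bound instead of approaching 1/(1-2) = -1.
print(geometric_partial_sum(2.0, 50))
```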

However, all of a sudden, the textbook says that the Taylor series of $\cos x$ converges for all real $x$. It confused me because previously I thought the Taylor series would only work for $|x|<1$. Now, I know that the Taylor Series is defined like this:
$$ f(x) = Tf(x) \Leftrightarrow \lim_{n\to\infty}R_{n}f(x) = 0 $$

And I know how to bound the Taylor remainder of $\cos x$ using Taylor's Theorem, and I know that the limit of that remainder is $0$ for all real $x$, which makes the Taylor series of $\cos x$ converge to $\cos x$ pointwise. However, I just can't see why my initial intuition is wrong (why the Taylor series converges for all $x$ for certain functions, like $\cos x$, and also $\sin x$, $e^x$, etc.).
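The mechanism at work here can be checked numerically (a sketch of my own, assuming the standard Lagrange bound $|R_n\cos(x)| \le \frac{|x|^{n+1}}{(n+1)!}$): factorial growth in the denominator eventually beats $|x|^n$ for any fixed $x$, even far outside $(-1,1)$.

```python
import math

def cos_taylor(x, terms):
    """Partial sum of the Taylor series of cos at 0: sum of (-1)^k x^(2k)/(2k)!."""
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(terms))

x = 10.0  # well outside (-1, 1)
print(abs(cos_taylor(x, 40) - math.cos(x)))  # tiny: the series still converges here

# The remainder bound itself: x^n / n! -> 0 as n grows, even for x = 10.
print(10.0**80 / math.factorial(80))
```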

Best Answer

Actually, things may go wrong even in $(-1,1)$. For instance, the Taylor series centered at $0$ of $f(x)=\frac1{1-nx}$ converges to $f(x)$ only on $\left(-\frac1n,\frac1n\right)$. And if $$f(x)=\begin{cases}e^{-1/x^2}&\text{ if }x\ne0\\0&\text{ if }x=0,\end{cases}$$ then the Taylor series of $f$ converges to $f(x)$ only at $x=0$.
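To see numerically why this second example is so pathological (a sketch of my own, not part of the original answer): $e^{-1/x^2}$ vanishes faster than every power of $x$ near $0$, so every Taylor coefficient at $0$ is $0$, and the zero polynomial satisfies the little-o definition at every degree, yet $f$ is nonzero away from $0$.

```python
import math

def f(x):
    """The flat function: exp(-1/x^2) for x != 0, and 0 at x = 0."""
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# f(x)/x^n is tiny near 0 for every n, so p(x) = 0 works at every degree...
for n in (1, 5, 20):
    print(f(0.1) / 0.1**n)

# ...yet f is not the zero function, so its Taylor series (identically 0)
# fails to converge to f(x) anywhere except x = 0.
print(f(1.0))  # e^{-1}, while the Taylor series gives 0
```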

On the other hand, yes, Taylor series centered at $0$ are built to converge to $f(x)$ near $0$. But that's no reason to expect that they don't converge to $f(x)$ when $x$ is far from $0$. That would be like expecting that a non-constant power series $a_0+a_1x+a_2x^2+\cdots$ takes larger and larger values as the distance from $x$ to $0$ grows. That happens often, but $1-\frac1{2!}x^2+\frac1{4!}x^4-\cdots=\cos(x)$, which is bounded.