Arrive at this sufficient condition for convergence of Taylor Series

calculus, convergence-divergence, real-analysis, taylor expansion

So I have been studying the convergence of Taylor series from Tom M. Apostol. There is a theorem which states a sufficient condition for convergence of a Taylor series. Quoting the theorem, we have:

Assume $f$ is infinitely differentiable in an open interval $I = (a - r, a + r)$, and assume that there is a positive constant $A$ such that $$|f^{(n)}(x)| \le A^n$$ for $n = 1, 2, 3, \dots$ and every $x$ in $I$. Then the Taylor series generated by $f$ at $a$ converges to $f(x)$ for each $x$ in $I$.

My question is: how does one arrive at $|f^{(n)}(x)| \le A^n$ deductively? I tried looking it up on the internet but haven't found anything that resolves my issue.

Thanks!

Best Answer

The assertion $$ |f^{(n)}(x)| \le A^n $$ is a *hypothesis* in this theorem. You can't "arrive at it deductively" in general. When it happens to be true for a particular function $f$, then you can conclude something about the Taylor series for that function.
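To see why the hypothesis suffices (a standard argument, sketched here for completeness rather than taken from Apostol's text): by Lagrange's form of the remainder, for each $n$ there is some $c$ between $a$ and $x$ with
$$|f(x) - T_n(x)| = \frac{|f^{(n+1)}(c)|}{(n+1)!}\,|x - a|^{n+1} \le \frac{A^{n+1} r^{n+1}}{(n+1)!} \longrightarrow 0 \quad (n \to \infty),$$
since $(Ar)^n / n! \to 0$ for any fixed $Ar$ (the factorial eventually dominates every geometric sequence). Hence the partial sums $T_n(x)$ converge to $f(x)$ on $I$.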

For example, the $n$th derivatives of $\sin$ are $\pm\sin$ and $\pm\cos$, whose absolute values are always at most $1$, so the hypothesis holds with $A = 1$ on every interval. The theorem then implies that the Taylor series for $\sin$ converges everywhere to the value of the function.
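As a quick numerical illustration (my own sketch, not part of the answer): the partial sums of the Maclaurin series for $\sin$ approach $\sin(x)$ even far from the center, exactly as the $|x|^n/n!$ bound predicts.

```python
import math

def taylor_sin(x, n_terms):
    """Partial sum of the Taylor series of sin about a = 0:
    sum_{k=0}^{n_terms-1} (-1)^k x^(2k+1) / (2k+1)!"""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

# Even at x = 10, well away from the center, the error shrinks rapidly,
# because 1^n * |x|^n / n! -> 0 as n -> infinity.
x = 10.0
for n in (5, 10, 20, 30):
    print(n, taylor_sin(x, n), math.sin(x))
```

Note that the early partial sums at $x = 10$ are wildly off (the terms first grow before the factorial takes over), which is a good reminder that the convergence is guaranteed only in the limit.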
