Existence of Taylor series and Analyticity

Tags: analytic-functions, analyticity, real-analysis, smooth-functions, taylor-expansion

I have realised my understanding of Taylor series is not as complete as I would like it to be, so I have formulated some questions for which I am struggling to find answers I can understand:

  1. Is the following statement true: a function is analytic at a point $\iff$ its Taylor series exists at that point?
  2. I know that not all smooth functions are given by their Taylor expansion, but for the examples I have seen I don't really understand why, e.g. $f(x) = e^{-1/x^2}$ for $x > 0$ and $f(x) = 0$ for $x \leq 0$.
  3. On Wikipedia it says, "A function may differ from the sum of its Taylor series, even if its Taylor series is convergent." When would this be the case and why does it not converge to the function?
  4. If at a point the first derivative exists can an infinite amount of derivatives be taken?
  5. If a function is analytic does it mean that it has a Taylor series at every point?

I know that there are quite a few questions here, so help with even one would be greatly appreciated.

Best Answer

  1. No. A function $f(x)$ is analytic at a point $x_0$ if its Taylor series exists at $x_0$ and converges to $f(x)$ in a neighborhood of $x_0$. The Taylor series may exist but not converge, or it may converge but not to $f(x)$.

  2. The basic idea is that $f(x)$ tends to $0$ so quickly as $x \to 0$ that all of its derivatives exist and are equal to $0$ at $x = 0$; more formally we have $\lim_{x \to 0} \frac{f(x)}{x^m} = 0$ for every positive integer $m$, which can be established e.g. by taking logarithms. So the Taylor series at $x = 0$ has all coefficients zero: it converges everywhere, but to the zero function rather than to $f(x)$ in any neighborhood of $0$. On the other hand, for $x > 0$, $f(x)$ is a composition of smooth functions and hence smooth.
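As a quick numerical sanity check (a sketch in Python, with `f` defined piecewise as above), the ratio $f(x)/x^m$ is already astronomically small at $x = 0.05$, even for fairly large $m$:

```python
import math

# Smooth-but-not-analytic function: f(x) = exp(-1/x^2) for x > 0, else 0.
def f(x):
    return math.exp(-1.0 / x**2) if x > 0 else 0.0

# f(x)/x^m -> 0 as x -> 0 for every m, which forces every
# Taylor coefficient of f at 0 to vanish.
x = 0.05
for m in (1, 5, 20):
    print(m, f(x) / x**m)  # all astronomically small
```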

  3. The previous function is an example: its Taylor series at $x = 0$ converges to the zero function everywhere, but the function is nonzero for $x > 0$. Generally, the way we prove that a Taylor series converges to the function (when it does) is to use Taylor's theorem with remainder, which bounds the error of the approximation given by taking the first $n$ terms of the Taylor series. But it can happen that this bound doesn't go to $0$ as $n \to \infty$; in that case Taylor's theorem doesn't tell us that the Taylor series converges to the function.
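For contrast, here is a sketch (in Python) of the standard success case: for $e^x$, the Lagrange remainder bound $e^{|x|}\,|x|^{n+1}/(n+1)!$ does go to $0$, so Taylor's theorem shows the series converges to the function:

```python
import math

# Partial sum of the Taylor series of e^x at 0.
def taylor_exp(x, n):
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# Lagrange remainder bound: |e^x - T_n(x)| <= e^{|x|} |x|^{n+1} / (n+1)!
x = 2.0
for n in (2, 5, 10, 20):
    err = abs(math.exp(x) - taylor_exp(x, n))
    bound = math.exp(abs(x)) * abs(x) ** (n + 1) / math.factorial(n + 1)
    print(n, err, bound)  # the bound, and hence the error, shrinks to 0
```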

  4. Not necessarily. For example, the function $f(x) = x^k |x|$ is differentiable at $x = 0$ exactly $k$ times, but not $k+1$ times.
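To see this concretely for $k = 1$ (so $f(x) = x|x|$ and $f'(x) = 2|x|$), a small Python sketch shows the one-sided difference quotients of $f'$ at $0$ disagreeing, so $f''(0)$ cannot exist:

```python
# For k = 1, f(x) = x*|x| has derivative f'(x) = 2|x| everywhere,
# but f''(0) fails to exist: the one-sided difference quotients of f'
# at 0 tend to +2 and -2.
def fprime(x):
    return 2.0 * abs(x)

h = 1e-6
from_right = (fprime(h) - fprime(0.0)) / h       # tends to +2
from_left = (fprime(-h) - fprime(0.0)) / (-h)    # tends to -2
print(from_right, from_left)
```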

  5. Yes; more precisely it means that at every point the Taylor series exists and converges to $f(x)$ in some neighborhood.

As an additional comment on question 2, if a function $f(x)$ is analytic at $x = 0$ then it must decay at worst like $x^m$ as $x \to 0$, where $m$ is the first index such that $f^{(m)}(0) \neq 0$. So analytic functions can't decay faster than polynomially at any point.
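A quick Python illustration, taking $\sin$ as the analytic example: its first nonzero Taylor coefficient at $0$ is at index $m = 1$, and $\sin(x)/x$ approaches that coefficient ($1$) rather than $0$:

```python
import math

# sin is analytic at 0 with first nonzero Taylor coefficient at index m = 1,
# so sin(x)/x^1 tends to that coefficient (namely 1), not to 0.
for x in (0.1, 0.01, 0.001):
    print(x, math.sin(x) / x)  # approaches 1
```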
