$h$ in Textbook Statement of Taylor's Theorem for One Variable: $f(x_0 + h) = \dots$

Tags: real-analysis, taylor-expansion

When discussing Taylor's theorem for one variable, my textbook says the following:

For a smooth function $f : \mathbb{R} \to \mathbb{R}$ of one variable, Taylor's theorem asserts that

$$f(x_0 + h) = f(x_0) + f'(x_0) \cdot h + \dfrac{f''(x_0)}{2} \cdot h^2 + \dots + \dfrac{f^{(k)}(x_0)}{k!} \cdot h^k + R_k (x_0, h),$$

where

$$R_k (x_0, h) = \int_{x_0}^{x_0 + h} \dfrac{(x_0 + h - \tau)^k}{k!} f^{(k+1)}(\tau) \, d\tau$$

is the remainder. For small $h$, this remainder is small to order $k$ in the sense that

$$\lim_{h \to 0} \dfrac{R_k(x_0, h)}{h^k} = 0.$$

In other words, $R_k(x_0, h)$ is small compared to the already small quantity $h^k$.
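
For example, taking $f(x) = e^x$, $x_0 = 0$, and $k = 1$, the integral formula gives

$$R_1(0, h) = \int_0^h (h - \tau)\, e^\tau \, d\tau = e^h - 1 - h = \dfrac{h^2}{2} + \dfrac{h^3}{6} + \dots,$$

so $R_1(0, h)/h \to 0$ as $h \to 0$, exactly as claimed.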

I've never seen this version of Taylor's theorem, where an $h$ appears in the equation. Judging from the Wikipedia article on Taylor's theorem, this formulation seems to be uncommon.

Reading my textbook's statement of Taylor's theorem, I don't understand what the $h$ is supposed to represent or how it fits intuitively into the theorem. I would greatly appreciate it if someone could take the time to clarify this.

Best Answer

You're probably used to a form that says $$ f(x) = f(x_0) + f'(x_0) (x - x_0) + \frac{f''(x_0)}{2} \cdot (x-x_0)^2 + \ldots $$ which tells you that if you take a number $x$ near $x_0$, you can compute $f(x)$ using data from $f$ and its derivatives at $x_0$, more or less.

The version you've written in your question says "If you move a distance $h$ away from $x_0$, to the point $x = x_0 + h$, you can compute $f(x_0 + h)$ using data from $f$ and its derivatives at $x_0$."

In short: if you replace $h$ by $x - x_0$ (or, alternatively, define $x$ to be $x_0 + h$), you'll find that the form I've written down becomes the form you're used to seeing.
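
Explicitly, substituting $h = x - x_0$ (so that $x_0 + h = x$) into your textbook's form gives

$$f(x) = f(x_0) + f'(x_0)(x - x_0) + \dfrac{f''(x_0)}{2}(x - x_0)^2 + \dots + \dfrac{f^{(k)}(x_0)}{k!}(x - x_0)^k + R_k(x_0, x - x_0),$$

and the remainder becomes

$$R_k(x_0, x - x_0) = \int_{x_0}^{x} \dfrac{(x - \tau)^k}{k!} f^{(k+1)}(\tau) \, d\tau,$$

which is exactly the integral form of the remainder in the version you're used to.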
