[Math] Derivative of Lagrange interpolating polynomial

interpolation, lagrange-interpolation, numerical-methods

I'm using the textbook "Numerical Analysis" by Richard L. Burden, 9th edition, and I'm having a problem with a particular derivation.

The Lagrange interpolating polynomial, together with its error term, is given by
$$f(x) = \sum_{k=0}^{n}f(x_k)L_k(x) + \frac{(x-x_0)\cdots (x-x_n)}{(n+1)!}f^{(n+1)}(\epsilon(x))$$

where the first term is our interpolating polynomial, which approximates $f(x)$ using the Lagrange basis polynomials, and the second term is the error. $\epsilon$ is some complicated function. The book goes further and uses this expression to derive numerical differentiation formulas, so they take the derivative

$$f'(x) = \sum_{k=0}^{n}f(x_k)L_k'(x) + D_x\!\left[\frac{(x-x_0)\cdots(x-x_n)}{(n+1)!}\right]f^{(n+1)}(\epsilon(x)) + \frac{(x-x_0)\cdots(x-x_n)}{(n+1)!}\,D_x\!\left[f^{(n+1)}(\epsilon(x))\right]$$

But if we choose $x$ to be one of the nodes $x_j$, then the factor multiplying $D_x[f^{(n+1)}(\epsilon(x))]$ is $0$. We then get

$$f'(x_j) = \sum_{k=0}^{n}f(x_k)L'_k(x_j) + \frac{f^{(n+1)}(\epsilon(x_j))}{(n+1)!} \prod_{k=0, k \neq j}^{n}(x_j-x_k)$$

which the book calls the $(n+1)$-point formula.

But this doesn't seem correct. It looks like they did not actually take the derivative: looking at the term $D_x\!\left[\frac{(x-x_0)\cdots(x-x_n)}{(n+1)!}\right]$, it's as if they just ignored the derivative.

Best Answer

In short, this is the product rule. If you are comfortable with the notation for the product rule for $N$ factors (see e.g. Wikipedia), denoted by "$(\text{PR})$" below, you may calculate:

$$\begin{align*}D_x[(x-x_0)\ldots(x-x_n)]\rvert_{x=x_j}&=D_x[\prod_{i=0}^n (x-x_i)]\rvert_{x=x_j}\\ &\overset{\text{(PR)}}{=}\sum_{i=0}^n\Bigg( \underbrace{D_x[x-x_i]\vphantom{\prod_{k=0,k\neq i}^n}}_{=1}\prod_{k=0,k\neq i}^n (x-x_k)\Bigg)\Bigg\rvert_{x=x_j}\\ &=\sum_{i=0}^n\underbrace{\prod_{k=0,k\neq i}^n \underbrace{(x_j-x_k)}_{=0\text{ for }k=j}}_{=0 \text{ for }i\neq j}\\ &=\prod_{k=0,k\neq j}^n (x_j-x_k) \end{align*}$$
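If you want to sanity-check this identity on a concrete case, here is a minimal sketch using sympy (the nodes are arbitrary sample values with $n=3$, not from the book) that compares both sides of the calculation above:

```python
import sympy as sp

x = sp.symbols('x')

# Arbitrary (made-up) distinct nodes x_0, ..., x_n with n = 3, just for illustration.
nodes = [sp.Rational(0), sp.Rational(1, 2), sp.Rational(2), sp.Rational(3)]

# w(x) = (x - x_0)(x - x_1)...(x - x_n)
w = sp.Mul(*[(x - xi) for xi in nodes])
w_prime = sp.diff(w, x)

for j, xj in enumerate(nodes):
    # Left-hand side: D_x[(x - x_0)...(x - x_n)] evaluated at x = x_j
    lhs = w_prime.subs(x, xj)
    # Right-hand side: product over k != j of (x_j - x_k)
    rhs = sp.Mul(*[(xj - xk) for k, xk in enumerate(nodes) if k != j])
    assert sp.simplify(lhs - rhs) == 0
    print(f"j = {j}: both sides equal {rhs}")
```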


A step-by-step approach may look like this:

I guess what happens to the left and right terms in your second displayed equation is clear, right? So we know that

$$\begin{align*}f'(x_j)&=\sum_{k=0}^n f(x_k)L_k'(x_j)+\left.\left(D_x[\tfrac{(x-x_0)\ldots(x-x_n)}{(n+1)!}]\right)\right\rvert_{x=x_j}f^{(n+1)}(\epsilon(x_j))+\underbrace{\tfrac{(x_j-x_0)\ldots(x_j-x_n)}{(n+1)!}}_{=0}\left.\left(D_x[f^{(n+1)}(\epsilon(x))]\right)\right\rvert_{x=x_j}\\ &=\sum_{k=0}^n f(x_k)L_k'(x_j)+\left.\left(D_x[\tfrac{(x-x_0)\ldots(x-x_n)}{(n+1)!}]\right)\right\rvert_{x=x_j}f^{(n+1)}(\epsilon(x_j))\end{align*}$$

and your question is about the remaining term without the $L_k$'s. We notice that $\left.\left(D_x[\tfrac{(x-x_0)\ldots(x-x_n)}{(n+1)!}]\right)\right\rvert_{x=x_j}=\tfrac{1}{(n+1)!}\left.\left(D_x[(x-x_0)\ldots(x-x_n)]\right)\right\rvert_{x=x_j}$ and calculate the derivative using the product rule:

$$\begin{align*}&\left.\left(D_x[(x-x_0)\ldots(x-x_n)]\right)\right\rvert_{x=x_j}\\ \\ =&\left.\bigg(\underbrace{D_x[x-x_0]}_{=1}(x-x_1)(x-x_2)\ldots(x-x_n)\bigg)\right\vert_{x=x_j}+\\ &\left.\bigg((x-x_0)\underbrace{D_x[x-x_1]}_{=1}(x-x_2)\ldots(x-x_n)\bigg)\right\vert_{x=x_j}+\\ &\qquad\ldots\qquad+\\ &\left.\bigg((x-x_0)\ldots(x-x_{n-2})(x-x_{n-1})\underbrace{D_x[x-x_n]}_{=1}\bigg)\right\vert_{x=x_j}\\ \\ =&\bigg(1\cdot(x_j-x_1)(x_j-x_2)\ldots(x_j-x_n)\bigg)+\\ &\bigg((x_j-x_0)\cdot 1\cdot(x_j-x_2)\ldots(x_j-x_n)\bigg)+\\ &\qquad\ldots\qquad+\\ &\bigg((x_j-x_0)\ldots(x_j-x_{n-2})(x_j-x_{n-1})\cdot 1\bigg)\end{align*}$$
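For concreteness (a small illustrative case I'm adding, not from the book), take $n=2$ and evaluate at $j=1$: the three summands above become

$$(x_1-x_1)(x_1-x_2)+(x_1-x_0)(x_1-x_2)+(x_1-x_0)(x_1-x_1)=(x_1-x_0)(x_1-x_2),$$

so only the middle summand, the one whose differentiated factor was $(x-x_1)$, survives.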

If we fix a $j\in \{0,\ldots,n\}$, we observe that all of these products contain the factor $(x_j-x_j)$ somewhere (and thus vanish), except one: the $j$-th product (i.e. the $j$-th summand in the sum above) survives because it has a $1$ where all the other ones have the factor $(x_j-x_j)$. Thus

$$\begin{align*}&\left.\left(D_x[(x-x_0)\ldots(x-x_n)]\right)\right\rvert_{x=x_j}\\ =&\bigg((x_j-x_0)\ldots(x_j-x_{j-1})\cdot 1\cdot(x_j-x_{j+1})\ldots(x_j-x_n)\bigg)\\ =&\prod_{k=0,k\neq j}^n (x_j-x_k)\end{align*}$$

and consequently

$$\left.\left(D_x[\tfrac{(x-x_0)\ldots(x-x_n)}{(n+1)!}]\right)\right\rvert_{x=x_j}=\frac{1}{(n+1)!}\prod_{k=0,k\neq j}^n (x_j-x_k)$$

which proves

$$f'(x_j) = \sum_{k=0}^{n}f(x_k)L'_k(x_j) + \frac{f^{(n+1)}(\epsilon(x_j))}{(n+1)!} \prod_{k=0, k \neq j}^{n}(x_j-x_k).$$
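
To tie this back to numerical differentiation, here is a minimal sketch (my own check, assuming three equally spaced nodes $x_0$, $x_0+h$, $x_0+2h$; it is not code from Burden) that builds the $L_k$ symbolically with sympy, computes the weights $L_k'(x_j)$ at the middle node, and recovers the familiar coefficients $-\tfrac{1}{2h}$, $0$, $\tfrac{1}{2h}$:

```python
import sympy as sp

x, h, x0 = sp.symbols('x h x0', positive=True)

# Assumed setup: three equally spaced nodes (n = 2) with spacing h.
nodes = [x0, x0 + h, x0 + 2*h]

def lagrange_basis(k):
    """L_k(x) = product over i != k of (x - x_i) / (x_k - x_i)."""
    num = sp.Mul(*[(x - nodes[i]) for i in range(3) if i != k])
    den = sp.Mul(*[(nodes[k] - nodes[i]) for i in range(3) if i != k])
    return num / den

# Weights L_k'(x_j) of the (n+1)-point formula at the middle node, j = 1.
j = 1
weights = [sp.simplify(sp.diff(lagrange_basis(k), x).subs(x, nodes[j]))
           for k in range(3)]
print(weights)  # expected: [-1/(2*h), 0, 1/(2*h)]
```

Combined with the error factor $\frac{f^{(3)}(\epsilon(x_1))}{3!}\prod_{k=0,k\neq 1}^{2}(x_1-x_k)=-\frac{h^2}{6}f^{(3)}(\epsilon(x_1))$, this is exactly the three-point midpoint formula $f'(x_1)=\frac{f(x_2)-f(x_0)}{2h}-\frac{h^2}{6}f^{(3)}(\epsilon(x_1))$.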