Newton polynomial divided differences

numerical methods

When trying to interpolate data points $(t_i,y_i)$, one can use Newton's form of the interpolant, written in the basis of $P_{n-1}$ consisting of the polynomials

$$ 1,\quad t-t_1,\quad (t-t_1)(t-t_2),\quad \ldots,\quad \prod_{j=1}^{n-1} (t-t_j) $$

So, in the Newton basis, the interpolant takes the form

$$ \phi_{n-1}(t) = \alpha_1 + \alpha_2(t-t_1) + … + \alpha_n (t-t_1)(t-t_2)…(t-t_{n-1})$$

Now, one can impose $\phi_{n-1}(t_i) = y_i$ to find the $\alpha_i$, which amounts to solving a lower triangular linear system.
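For concreteness, a minimal NumPy sketch of setting up and solving this triangular system (the nodes and values below are made up for illustration):

```python
import numpy as np

# Hypothetical sample data (0-indexed here, 1-indexed in the text above).
t = np.array([0.0, 1.0, 2.0, 4.0])
y = np.array([1.0, 3.0, 2.0, 5.0])
n = len(t)

# A[i, k] = k-th Newton basis polynomial at t_i, i.e. prod_{j < k} (t_i - t_j).
# The empty product gives 1, and any entry with k > i contains a zero factor,
# so A is lower triangular.
A = np.array([[np.prod(t[i] - t[:k]) for k in range(n)] for i in range(n)])

alpha = np.linalg.solve(A, y)   # forward substitution would also do
print(alpha)
```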

Now, introducing the notion of divided differences,

$$ f[t_k] = y_k, \qquad f[t_1,t_2,\ldots,t_k] = \frac{ f[t_2,t_3,\ldots,t_k] - f[t_1,t_2,\ldots,t_{k-1}] }{t_k-t_1} $$
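(As a side note, this recursion translates directly into code; a naive sketch, assuming Python lists or NumPy arrays of distinct nodes:)

```python
def divided_difference(t, y):
    """f[t_1, ..., t_k] computed by the recursion above (exponential time;
    in practice one fills a triangular table instead)."""
    if len(t) == 1:
        return y[0]                                   # f[t_k] = y_k
    upper = divided_difference(t[1:], y[1:])          # f[t_2, ..., t_k]
    lower = divided_difference(t[:-1], y[:-1])        # f[t_1, ..., t_{k-1}]
    return (upper - lower) / (t[-1] - t[0])
```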

we observe that

$$ \phi_{n-1}(t_1) = \alpha_1 = y_1 = f[t_1] $$

Also,

$$ \phi_{n-1}(t_2) = \alpha_1 + \alpha_2 (t_2 - t_1) = y_2 $$

Since $\alpha_1 = y_1$, we observe that $\alpha_2 = \dfrac{ y_2 - y_1 }{t_2 - t_1} = \dfrac{ f[t_2]-f[t_1]}{t_2-t_1} = f[t_1,t_2]$.

So we can claim that, in general, $\alpha_k = f[t_1,\ldots,t_k]$. I'm trying to prove this by induction, but I don't see how to use the induction hypothesis to handle $\alpha_{k+1}$. We can assume the result holds up to $k$, so that the interpolant $\phi_k(t)$ can be written as

$$ \phi_k(t) = f[t_1] + f[t_1,t_2](t-t_1) + \cdots + f[t_1,\ldots,t_k] (t-t_1)\cdots(t-t_{k-1}) + \alpha_{k+1} (t-t_1)\cdots(t-t_{k}) $$

But how can I proceed from there?
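(For what it's worth, the claim is easy to check numerically before proving it; a small NumPy sketch with made-up data, comparing the triangular-system solution against the divided-difference table:)

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0, 4.0])   # hypothetical nodes
y = np.array([1.0, 3.0, 2.0, 5.0])   # hypothetical values
n = len(t)

# alpha from the lower triangular system.
A = np.array([[np.prod(t[i] - t[:k]) for k in range(n)] for i in range(n)])
alpha = np.linalg.solve(A, y)

# f[t_1], f[t_1,t_2], ..., f[t_1,...,t_n] from the divided-difference table.
dd = y.astype(float)
coeffs = [dd[0]]
for k in range(1, n):
    dd = (dd[1:] - dd[:-1]) / (t[k:] - t[:-k])
    coeffs.append(dd[0])

print(np.allclose(alpha, coeffs))    # expected: True
```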

Best Answer

The subject is better grasped using a matrix representation, and it is preferable to index from $0$, as this gives a neater handling of the sums and products.

Given a sequence $\left\{ t_{\,0}, t_{\,1}, \cdots, t_{\,h}, \cdots \right\}$ of pairwise distinct points ($t_{\,k} \ne t_{\,j}$ for $k \ne j$), let's denote by $\pi _{\,n} (x,{\bf t})$ the $n$-th degree polynomial $$ \pi _{\,n} (x,{\bf t}) = \prod\limits_{0\, \le \,j\, \le \,n-1} {\left( {x - t_{\,j} } \right)} $$ so that, with the empty-product convention, we have $$ \pi _{\,0} (x,{\bf t}) = 1\quad \pi _{\,1} (x,{\bf t}) = \left( {x - t_{\,0} } \right)\quad \cdots $$

We want to construct a polynomial of degree $h$ such that at the $h+1$ points $(t_0, \cdots,t_h)$ it takes the values $(f_0, \cdots,f_h)$, that is $$ p_{\,h} \left( {x,{\bf t},{\bf f}} \right)\;:\;\;p_{\,h} \left( {t_{\,k} ,{\bf t},{\bf f}} \right) = f_{\,k} \quad \left| {\,0 \le k \le h} \right. $$ While we could express it in terms of Lagrange polynomials, we want instead
to express it as a linear combination of the $\pi _{\,k} (x,{\bf t})$ polynomials $$ \eqalign{ & p_{\,h} \left( {x,{\bf t},{\bf f}} \right) = \left( {\matrix{ {\pi _{\,0} (x,{\bf t})} & {\pi _{\,1} (x,{\bf t})} & \cdots & {\pi _{\,h} (x,{\bf t})} \cr } } \right)\left( {\matrix{ {\beta _{\,0} } \cr {\beta _{\,1} } \cr \vdots \cr {\beta _{\,h} } \cr } } \right) = \cr & = \overline {{\bf \pi }_{\,h} } (x,{\bf t})\;{\bf \beta }_{\,h} \cr} $$ note that the vector ${\bf \pi }$ includes polynomials of varying degree.
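As an aside, a short Python sketch of this representation, assuming the coefficient vector $\beta$ has already been obtained (e.g. from (1) below); the function names here are mine:

```python
import numpy as np

def newton_basis_row(x, t):
    """Row vector (pi_0(x,t), ..., pi_h(x,t)), with pi_k(x,t) = prod_{j<k}(x - t_j)."""
    return np.array([np.prod(x - t[:k]) for k in range(len(t))])

def p_h(x, t, beta):
    """p_h(x,t,f) written as the dot product pi_bar_h(x,t) . beta_h."""
    return newton_basis_row(x, t) @ beta
```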

Since the above holds for every value of $x$, it holds in particular for each $x=t_k$, and we get the lower triangular (LT) matrix equation you already know $$ \bbox[lightyellow] { \eqalign{ & \left( {\matrix{ {f_{\,0} } \cr {f_{\,1} } \cr \vdots \cr {f_{\,h} } \cr } } \right) = \left( {\matrix{ 1 & 0 & \cdots & 0 \cr 1 & {\left( {t_1 - t_0 } \right)} & \cdots & 0 \cr \vdots & \vdots & \ddots & \vdots \cr 1 & {\prod\limits_{0\, \le \,j\, \le \,0} {\left( {t_h - t_{\,j} } \right)} } & \cdots & {\prod\limits_{0\, \le \,j\, \le \,h - 1} {\left( {t_h - t_{\,j} } \right)} } \cr } } \right)\left( {\matrix{ {\beta _{\,0} } \cr {\beta _{\,1} } \cr \vdots \cr {\beta _{\,h} } \cr } } \right) = \cr & = {\bf f} = {\bf P}({\bf t})\;{\bf \beta } \cr} }\tag{1}$$
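(To make the lower triangular structure concrete, a sketch of assembling $\mathbf P(\mathbf t)$ and solving (1) by forward substitution; the node and value vectors are hypothetical:)

```python
import numpy as np

t = np.array([0.0, 1.0, 2.5, 4.0])   # hypothetical nodes t_0, ..., t_h
f = np.array([1.0, 3.0, 2.0, 5.0])   # hypothetical values f_0, ..., f_h
h = len(t) - 1

# P(t)[i, k] = pi_k(t_i, t) = prod_{0 <= j <= k-1} (t_i - t_j); zero above the diagonal.
P = np.array([[np.prod(t[i] - t[:k]) for k in range(h + 1)] for i in range(h + 1)])

# Forward substitution, exploiting the LT structure; P[i, i] != 0 since the nodes are distinct.
beta = np.zeros(h + 1)
for i in range(h + 1):
    beta[i] = (f[i] - P[i, :i] @ beta[:i]) / P[i, i]
print(beta)
```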

Consider now that the Divided Difference of a sequence $y_k$ wrt $x_k$ $$ \left\{ \matrix{ \left[ {y_{\,q} } \right] = y_{\,q} \hfill \cr \left[ {y_{\,q} ,\, \ldots ,\,y_{\,q + n} } \right] = {{\left[ {y_{\,q + 1} ,\, \ldots ,\,y_{\,q + n} } \right] - \left[ {y_{\,q} ,\, \ldots ,\,y_{\,q + n - 1} } \right]} \over {x_{\,q + n} - x_{\,q} }} \hfill \cr} \right. $$ can be expressed as a linear combination of the $y_k$ values as follows $$ \bbox[lightyellow] { \left[ {y_{\,q} ,\, \ldots ,\,y_{\,q + n} } \right] = \sum\limits_{0\, \le \,m\, \le \,n} {{{y_{m + q} } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m + q} - x_{\,k + q} } \right)} }}} }\tag{2}$$ because in fact, taking $q=0$ wlog, it is $$ \eqalign{ & \left[ {y_{\,1} ,\, \ldots ,\,y_{\,n + 1} } \right] - \left[ {y_{\,0} ,\, \ldots ,\,y_{\,n} } \right] = \cr & = \sum\limits_{0\, \le \,m\, \le \,n} {{{y_{m + 1} } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m + 1} - x_{\,k + 1} } \right)} }}} - \sum\limits_{0\, \le \,m\, \le \,n} {{{y_m } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \sum\limits_{1\, \le \,m + 1\, \le \,n + 1} {{{y_{m + 1} } \over {\prod\limits_{1\, \le \,k + 1\, \ne \,m + 1\, \le \,n + 1} {\left( {x_{\,m + 1} - x_{\,k + 1} } \right)} }}} - \sum\limits_{0\, \le \,m\, \le \,n} {{{y_m } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \sum\limits_{1\, \le \,m\, \le \,n + 1} {{{y_m } \over {\prod\limits_{1\, \le \,k\, \ne \,m\, \le \,n + 1} {\left( {x_{\,m} - x_{\,k} } \right)} }}} - \sum\limits_{0\, \le \,m\, \le \,n} {{{y_m } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \sum\limits_{1\, \le \,m\, \le \,n} {{{y_m } \over {\prod\limits_{1\, \le \,k\, \ne \,m\, \le \,n + 1} {\left( {x_{\,m} - x_{\,k} } \right)} }}} + {{y_{n + 1} } \over {\prod\limits_{1\, \le \,k\, \le \,n} {\left( {x_{\,n + 1} - x_{\,k} } \right)} }} - {{y_0 } \over {\prod\limits_{1\, \le \,k\, \le \,n} {\left( {x_{\,0} - x_{\,k} } \right)} }} - \sum\limits_{1\, \le \,m\, \le \,n} {{{y_m } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = - {{y_0 } \over {\prod\limits_{1\, \le \,k\, \le \,n} {\left( {x_{\,0} - x_{\,k} } \right)} }} + \sum\limits_{1\, \le \,m\, \le \,n} {y_m \left( {{1 \over {\prod\limits_{1\, \le \,k\, \ne \,m\, \le \,n + 1} {\left( {x_{\,m} - x_{\,k} } \right)} }} - {1 \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} \right)} + {{y_{n + 1} } \over {\prod\limits_{1\, \le \,k\, \le \,n} {\left( {x_{\,n + 1} - x_{\,k} } \right)} }} = \cr & = - {{\left( {x_{\,0} - x_{\,n + 1} } \right)y_0 } \over {\prod\limits_{1\, \le \,k\, \le \,n + 1} {\left( {x_{\,0} - x_{\,k} } \right)} }} + \sum\limits_{1\, \le \,m\, \le \,n} {y_m \left( {{{\left( {x_{\,m} - x_{\,0} } \right)} \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n + 1} {\left( {x_{\,m} - x_{\,k} } \right)} }} - {{\left( {x_{\,m} - x_{\,n + 1} } \right)} \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n + 1} {\left( {x_{\,m} - x_{\,k} } \right)} }}} \right)} + {{\left( {x_{\,n + 1} - x_{\,0} } \right)y_{n + 1} } \over {\prod\limits_{0\, \le \,k\, \le \,n} {\left( {x_{\,n + 1} - x_{\,k} } \right)} }} = \cr & = \left( {x_{\,n + 1} - x_{\,0} } \right)\sum\limits_{0\, \le \,m\, \le \,n + 1} {{{y_m } \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n + 1} {\left( 
{x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \left( {x_{\,n + 1} - x_{\,0} } \right)\left[ {y_{\,0} ,\, \ldots ,\,y_{\,n + 1} } \right] \cr} $$
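(As a quick sanity check of formula (2) against the defining recursion, a NumPy sketch with made-up nodes:)

```python
import numpy as np

def dd_recursive(x, y):
    """[y_0, ..., y_n] by the defining recursion."""
    if len(x) == 1:
        return y[0]
    return (dd_recursive(x[1:], y[1:]) - dd_recursive(x[:-1], y[:-1])) / (x[-1] - x[0])

def dd_explicit(x, y):
    """Formula (2): sum_m y_m / prod_{k != m} (x_m - x_k)."""
    n = len(x)
    return sum(y[m] / np.prod([x[m] - x[k] for k in range(n) if k != m])
               for m in range(n))

x = np.array([0.0, 1.0, 2.5, 4.0])   # hypothetical distinct nodes
y = np.array([1.0, 3.0, 2.0, 5.0])
print(np.isclose(dd_recursive(x, y), dd_explicit(x, y)))   # expected: True
```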

Therefore, in matrix notation we will have $$ \bbox[lightyellow] { \eqalign{ & \left( {\matrix{ {\left[ {f_{\,0} } \right]} \cr {\left[ {f_{\,0} ,f_{\,1} } \right]} \cr \vdots \cr {\left[ {f_{\,0} , \ldots ,f_{\,h} } \right]} \cr } } \right) = \left( {\matrix{ 1 & 0 & \cdots & 0 \cr {{1 \over {\left( {t_{\,0} - t_{\,1} } \right)}}} & {{1 \over {\left( {t_{\,1} - t_{\,0} } \right)}}} & \cdots & 0 \cr \vdots & \vdots & \ddots & \vdots \cr {{1 \over {\prod\limits_{0\, \le \,k\, \ne \,0\, \le \,h} {\left( {t_{\,0} - t_{\,k} } \right)} }}} & {{1 \over {\prod\limits_{0\, \le \,k\, \ne \,1\, \le \,h} {\left( {t_{\,1} - t_{\,k} } \right)} }}} & \cdots & {{1 \over {\prod\limits_{0\, \le \,k\, \ne \,h\, \le \,h} {\left( {t_{\,h} - t_{\,k} } \right)} }}} \cr } } \right) \left( {\matrix{ {f_{\,0} } \cr {f_{\,1} } \cr \vdots \cr {f_{\,h} } \cr } } \right) = \cr & = {\bf D}_{\,{\bf div}} ({\bf t})\;{\bf f} \cr} }\tag{3}$$
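Likewise, a sketch of assembling $\mathbf D_{\bf div}(\mathbf t)$ entrywise from (2) and applying it to $\mathbf f$ (hypothetical data again):

```python
import numpy as np

t = np.array([0.0, 1.0, 2.5, 4.0])   # hypothetical distinct nodes
f = np.array([1.0, 3.0, 2.0, 5.0])
h = len(t) - 1

# Row n, column m (m <= n): 1 / prod_{0 <= k <= n, k != m} (t_m - t_k); zero otherwise.
D = np.zeros((h + 1, h + 1))
for n in range(h + 1):
    for m in range(n + 1):
        D[n, m] = 1.0 / np.prod([t[m] - t[k] for k in range(n + 1) if k != m])

print(D @ f)   # [f_0], [f_0, f_1], ..., [f_0, ..., f_h]
```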

The crucial point is that $$ \bbox[lightyellow] { {\bf D}_{\,{\bf div}} ({\bf t}) = {\bf P}({\bf t})^{\, - \,{\bf 1}} }\tag{4}$$ which, in connection with (1), proves the claim.
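(Before the formal argument, (4) is easy to verify numerically; a sketch reusing the constructions above, with NumPy and hypothetical nodes:)

```python
import numpy as np

t = np.array([0.0, 1.0, 2.5, 4.0])   # hypothetical distinct nodes
h = len(t) - 1

# P(t) as in (1) and D_div(t) as in (3).
P = np.array([[np.prod(t[i] - t[:k]) for k in range(h + 1)] for i in range(h + 1)])
D = np.zeros((h + 1, h + 1))
for n in range(h + 1):
    for m in range(n + 1):
        D[n, m] = 1.0 / np.prod([t[m] - t[k] for k in range(n + 1) if k != m])

print(np.allclose(D @ P, np.eye(h + 1)))   # expected: True, i.e. D_div(t) = P(t)^(-1)
```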

In fact $$ \eqalign{ & \left[ {y_{\,0} ,\, \ldots ,\,y_{\,n} } \right]\quad \left| {\;y = f(x) = \prod\limits_{0\, \le \,j\, \le \,g - 1} {\left( {x - x_{\,j} } \right)} } \right.\quad = \cr & = \sum\limits_{0\, \le \,m\, \le \,n} {{{f\left( {x_m } \right)} \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \left[ {g \le n} \right]\sum\limits_{g\, \le \,m\, \le \,n} {{{f\left( {x_m } \right)} \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \left[ {g \le n} \right]\sum\limits_{g\, \le \,m\, \le \,n} {{{\prod\limits_{0\, \le \,j\, \le \,g - 1} {\left( {x_{\,m} - x_{\,j} } \right)} } \over {\prod\limits_{0\, \le \,k\,\, \le \,g - 1} {\left( {x_{\,m} - x_{\,k} } \right)} \prod\limits_{g\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \left[ {g \le n} \right]\sum\limits_{g\, \le \,m\, \le \,n} {{1 \over {\prod\limits_{g\, \le \,k\, \ne \,m\, \le \,n} {\left( {x_{\,m} - x_{\,k} } \right)} }}} = \cr & = \left[ {g \le n} \right]\sum\limits_{0\, \le \,m\, \le \,n - g} {{1 \over {\prod\limits_{0\, \le \,k\, \ne \,m\, \le \,n - g} {\left( {x_{\,m + g} - x_{\,k + g} } \right)} }}} = \cr & = \left[ {g \le n} \right]\left[ {1_{\,g} ,\, \ldots ,\,1_{\,g + \left( {n - g} \right)} } \right] = \left[ {g = n} \right] \cr} $$ where the second step follows from the fact that $f(x_m)=0$ for $m<g$, and where $[g \le n]$, $[g = n]$ denote Iverson brackets. Since the $g$-th column of ${\bf P}({\bf t})$ collects the values of $\pi_{\,g}(x,{\bf t})$ at $t_0,\ldots,t_h$, this says precisely that ${\bf D}_{\,{\bf div}}({\bf t})\,{\bf P}({\bf t})$ is the identity.
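A small numerical illustration of this identity (NumPy, hypothetical distinct nodes):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.5, 4.0, 7.0])   # hypothetical distinct nodes, n = 4
n = len(x) - 1

def dd(nodes, values):
    """Divided difference [y_0, ..., y_n] via the defining recursion."""
    if len(nodes) == 1:
        return values[0]
    return (dd(nodes[1:], values[1:]) - dd(nodes[:-1], values[:-1])) / (nodes[-1] - nodes[0])

for g in range(n + 2):
    y = np.array([np.prod(xm - x[:g]) for xm in x])     # y = f(x) = prod_{j < g} (x - x_j)
    val = dd(x, y)
    print(g, np.isclose(val, 1.0 if g == n else 0.0))   # expected: True for every g
```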
