$f(x)=1/(1+x^2)$: why do the Lagrange interpolating polynomials not always converge?

interpolation, numerical methods

Let $f(x) = \frac{1}{1+x^2}$. The error of interpolation with the Lagrange polynomial $P_n$ through $n+1$ points is given by
$$
e(x)=f(x)-P_n(x)=\frac{f^{(n+1)}(\eta_x)}{(n+1)!}\prod_{i=0}^n (x-x_i)
$$

Carl Runge showed that for this $f$, the Lagrange polynomials $P_n$ on equally spaced nodes do not converge uniformly. How can I show this?
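
For concreteness, here is a small numerical sketch of the phenomenon (an illustration only, not part of the argument below): it interpolates $f$ at $2n+1$ equispaced nodes on $[-5,5]$ with scipy's `BarycentricInterpolator` and prints the sup-norm error on a fine grid. The helper name `runge` and the choice of $n$ values are mine.

```python
# Interpolate f(x) = 1/(1+x^2) at 2n+1 equispaced nodes on [-5, 5]
# and watch the maximum error on the interval grow with n.
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    return 1.0 / (1.0 + x**2)

a = 5.0
xs = np.linspace(-a, a, 2001)     # fine grid on which the error is measured

for n in (4, 8, 16, 32):
    nodes = a * np.arange(-n, n + 1) / n              # 2n+1 equispaced nodes
    P = BarycentricInterpolator(nodes, runge(nodes))  # Lagrange interpolant P_n
    print(n, np.max(np.abs(P(xs) - runge(xs))))       # sup-norm error grows with n
```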

Best Answer

Here is an elementary proof that Lagrange interpolation at equispaced nodes on $[-a,a]$ does not converge at $b$ for $a=5$ and $b=4$ (and many other values). We assume an odd number of nodes, $2n+1$, so the nodes are at $ak/n$ for $-n \leq k \leq n$; an even number of nodes is similar but notationally a little uglier. Let $p(x)$ be the Lagrange interpolating polynomial, and let $q(x) = \prod_{k=-n}^n (x-ak/n)$ be the monic polynomial vanishing at the nodes.

Since $p$ agrees with $\frac{1}{1+x^2}$ at the nodes, the polynomial $p(x)(x^2+1)-1$ vanishes at the nodes, and we thus have $$p(x) (x^2+1)-1 = q(x) r(x) \quad (\ast)$$ for some polynomial $r(x)$. The polynomial $p$ has degree $\leq 2n$ and $q$ has degree $2n+1$, so $\deg r \leq 1$. Also, note that $p(x)$ is an even polynomial (obtained by interpolating an even function at symmetrically placed points) and $q(x)$ is an odd polynomial (it has a root at $0$, and all other roots are mirror symmetric), so $r$ is odd. Thus, $r(x) = c x$ for some constant $c$.
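
As a sanity check on $(\ast)$ and the parity argument, here is my own exact verification sketch with sympy for the small case $a=5$, $n=3$ (the variable names are not from the answer):

```python
import sympy as sp

x = sp.symbols('x')
a, n = 5, 3
nodes = [sp.Rational(a * k, n) for k in range(-n, n + 1)]   # nodes a*k/n

f = 1 / (1 + x**2)
p = sp.interpolate([(t, f.subs(x, t)) for t in nodes], x)   # Lagrange interpolant p
q = sp.expand(sp.Mul(*[x - t for t in nodes]))              # monic node polynomial q

r, rem = sp.div(sp.expand(p * (x**2 + 1) - 1), q, x)
print(rem)           # 0, so (*) holds exactly
print(sp.expand(r))  # a single odd term c*x, as the parity argument predicts
```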

We can compute $c$ by plugging in $x=i$. (If I were teaching this to students who fear complex numbers, I might first write $p(x) = u(x^2)$, $q(x) = x v(x^2)$, set $y=x^2$ and plug in $y=-1$. But here we will be fearless!) $$p(i) \cdot 0 -1 = (c i) \prod_{k=-n}^n (i-ak/n) = (ci) \cdot i \cdot \prod_{k=1}^n (i^2 - a^2 k^2/n^2).$$ Since $i^2 - a^2k^2/n^2 = -(1+a^2k^2/n^2)$ and $(ci) \cdot i = -c$, this gives $$c =\frac{(-1)^n}{\prod_{k=1}^n (1+a^2 k^2/n^2)}.$$
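
The closed form for $c$ can be checked symbolically in the same small case ($a=5$, $n=3$; again my own sympy sketch, not part of the original argument):

```python
import sympy as sp

x = sp.symbols('x')
a, n = 5, 3
nodes = [sp.Rational(a * k, n) for k in range(-n, n + 1)]

p = sp.interpolate([(t, 1 / (1 + t**2)) for t in nodes], x)
q = sp.expand(sp.Mul(*[x - t for t in nodes]))
quotient, _ = sp.div(sp.expand(p * (x**2 + 1) - 1), q, x)

c_division = quotient.coeff(x, 1)                 # c read off from (*)
c_formula = sp.Integer(-1)**n / sp.Mul(*[1 + sp.Rational(a**2 * k**2, n**2)
                                         for k in range(1, n + 1)])
print(sp.simplify(c_division - c_formula))        # 0: the formula for c checks out
```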

Rearranging equation $(\ast)$ and writing $q(x) = x \prod_{k=1}^n (x^2-a^2 k^2/n^2)$ gives an exact formula for the error: $$p(x) - \frac{1}{1+x^2} = \frac{c x q(x)}{1+x^2} = \frac{(-1)^n x^2 \prod_{k=1}^n (x^2-a^2 k^2/n^2)}{(1+x^2) \prod_{k=1}^n (1+a^2 k^2/n^2)}.$$
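
Here is a quick floating-point spot check of this exact error formula, assuming numpy and scipy ($a=5$, $n=8$, and the sample points are my own choice); agreement is only up to rounding error:

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

a, n = 5.0, 8
k = np.arange(1, n + 1)
nodes = a * np.arange(-n, n + 1) / n
f = lambda x: 1.0 / (1.0 + x**2)
P = BarycentricInterpolator(nodes, f(nodes))      # Lagrange interpolant p

for x in (1.0, 3.3, 4.0):
    lhs = P(x) - f(x)                             # actual interpolation error
    rhs = ((-1)**n * x**2 * np.prod(x**2 - a**2 * k**2 / n**2)
           / ((1 + x**2) * np.prod(1 + a**2 * k**2 / n**2)))
    print(x, lhs, rhs)                            # the last two columns agree
```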

Taking logarithms of the absolute value of this error, and discarding the terms $\log x^2 - \log(1+x^2)$, which stay bounded for fixed $x \neq 0$, we see that the error goes to zero if and only if $$\lim_{n \to \infty} \left( \sum_{k=1}^n \log \left| x^2 - a^2 k^2/n^2 \right| - \sum_{k=1}^n \log ( 1+a^2 k^2/n^2 ) \right) = - \infty.$$
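
One can watch this dichotomy numerically; the sketch below (numpy, with $n$ values that are not multiples of $5$ so that $x$ never lands on a node) evaluates the bracketed quantity at $x=2$, where it heads to $-\infty$, and at $x=4$, where it heads to $+\infty$:

```python
import numpy as np

a = 5.0
for x in (2.0, 4.0):
    for n in (64, 256, 1024):       # n not a multiple of 5, so x is never a node
        k = np.arange(1, n + 1)
        t2 = a**2 * k**2 / n**2
        s = np.sum(np.log(np.abs(x**2 - t2))) - np.sum(np.log(1 + t2))
        print(x, n, s)              # decreases for x = 2, increases for x = 4
```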

The sums look like Riemann sums with spacing $a/n$. Since the sums run only over $k \geq 1$ while both integrands below are even functions of $t$, the quantity in the limit is approximately $$\frac{n}{2a} \cdot \left( \int_{t=-a}^a \log| x^2 - t^2| \, dt - \int_{t=-a}^a \log( 1+ t^2) \, dt \right).$$ More precisely, we can justify turning sums into integrals as long as $x$ isn't too close to one of the nodes: for example, if we take a sequence of points $x_n$ approaching $x \in [-a,a]$, with $x_n$ always halfway between two nodes, this is valid.
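
The Riemann-sum approximation can itself be checked numerically; the sketch below (numpy plus scipy's `quad`, with the logarithmic singularities passed via `points`) compares the difference of sums at $x=4$ with $\frac{n}{2a}$ times the difference of integrals:

```python
import numpy as np
from scipy.integrate import quad

a, x = 5.0, 4.0
I1, _ = quad(lambda t: np.log(abs(x**2 - t**2)), -a, a, points=[-x, x], limit=200)
I2, _ = quad(lambda t: np.log(1 + t**2), -a, a)

for n in (64, 256, 1024):                    # n not a multiple of 5: x stays off the nodes
    k = np.arange(1, n + 1)
    t2 = a**2 * k**2 / n**2
    s = np.sum(np.log(np.abs(x**2 - t2))) - np.sum(np.log(1 + t2))
    print(n, s / (n / (2 * a) * (I1 - I2)))  # ratio approaches 1 as n grows
```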

So the limit will be $\pm \infty$ according to the sign of $$\int_{t=-a}^a \log |x^2 - t^2| \, dt - \int_{t=-a}^a \log (1+t^2) \, dt.$$ The integrals can be done in closed form, but I'll close by just plotting the graph for $a=5$:

[Plot of the integral difference as a function of $x$ for $a = 5$.]

As you can see, the curve crosses the axis around $3.63$, which is exactly where the Lagrange polynomials stop converging.
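
For completeness, here is a short scipy sketch (my own addition, with the hypothetical helper name `criterion`) that locates this crossing by combining `quad` with `brentq`:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

a = 5.0

def criterion(x):
    # Difference of the two integrals above, as a function of x.
    I1, _ = quad(lambda t: np.log(abs(x**2 - t**2)), -a, a, points=[-x, x], limit=200)
    I2, _ = quad(lambda t: np.log(1 + t**2), -a, a)
    return I1 - I2

print(brentq(criterion, 3.0, 4.5))   # root of the criterion, bracketed in [3, 4.5]
```

The root comes out at about $3.63$, consistent with the crossing of the axis described above.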