Chebyshev Polynomials – Intuitive Explanation of Extremal Property

Tags: oc.optimization-and-control, polynomials

Consider the following optimization problem:

Problem: find a monic polynomial $p(x)$ of degree $n$ which minimizes $\max_{x \in [-1,1]} |p(x)|$.

The solution is given by Chebyshev polynomials:

Theorem: Let $T_n(x) = \cos(n \cos^{-1} x)$. Then $(1/2^{n-1}) T_n$ is a monic polynomial of degree $n$ which achieves the above minimum.
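As a quick sanity check (a small numerical sketch, assuming numpy and an arbitrarily chosen $n=5$, not part of the original argument), one can evaluate $T_n$ through the cosine formula and compare the sup norm of the monic polynomial $(1/2^{n-1})T_n$ with that of the naive monic choice $x^n$:

```python
import numpy as np

n = 5                                  # arbitrary small degree for the check
x = np.linspace(-1, 1, 10001)          # dense grid on [-1, 1]

# T_n evaluated via the defining formula T_n(x) = cos(n * arccos(x)).
Tn = np.cos(n * np.arccos(x))

# Sup norm of the monic candidate (1/2^{n-1}) T_n: approximately 2^{1-n}.
print(np.max(np.abs(Tn)) / 2**(n - 1))    # ~ 0.0625 for n = 5

# Compare with the naive monic choice x^n, whose sup norm on [-1, 1] is 1.
print(np.max(np.abs(x**n)))               # = 1.0
```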

The proof of this fact is short and surprisingly free of messy calculations. From the definition of $T_n$ and the identity $\cos n\theta + \cos(n-2)\theta = 2\cos\theta\,\cos(n-1)\theta$ you derive the recurrence $T_n(x) = 2x\,T_{n-1}(x) - T_{n-2}(x)$, which shows that the $T_n$ are indeed polynomials. Then you argue that $(1/2^{n-1}) T_n$ is monic and attains its extrema $\pm 1/2^{n-1}$ at least $n+1$ times in $[-1,1]$, from which the theorem follows easily. If you'd like a nice exposition of this argument that skips no steps, this one is short and clear.
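For concreteness, here is a small numerical sketch of the two claims in that proof (numpy assumed, $n=5$ arbitrary): the recurrence produces genuine polynomials, $(1/2^{n-1})T_n$ is monic, and it equioscillates between $\pm 1/2^{n-1}$ at the $n+1$ points $\cos(k\pi/n)$.

```python
import numpy as np
from numpy.polynomial import polynomial as P

n = 5  # arbitrary degree

# Build T_0, ..., T_n from T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x)
# (coefficient arrays, lowest degree first), so each T_k is a polynomial.
T = [np.array([1.0]), np.array([0.0, 1.0])]
for k in range(2, n + 1):
    T.append(P.polysub(2 * P.polymulx(T[k - 1]), T[k - 2]))

monic = T[n] / 2**(n - 1)
print(monic[-1])          # leading coefficient 1.0: (1/2^{n-1}) T_n is monic

# Equioscillation: values alternate between +-1/2^{n-1} at the
# n + 1 points cos(k*pi/n), k = 0, ..., n.
pts = np.cos(np.arange(n + 1) * np.pi / n)
print(P.polyval(pts, monic))   # +-0.0625, alternating, for n = 5
```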

However, I don't get much enlightenment from this proof: it feels pulled out of a
hat. For example, it gives me no clue about which other polynomial optimization problems
have similar solutions.

My question: Is there a natural and motivated sequence of steps which, starting from the above optimization problem, leads to Chebyshev polynomials?

Update: I changed the title to better reflect the question I am asking.

Best Answer

Well, let's try to avoid the hat.

Consider the dual (and obviously equivalent) problem: find the polynomial $p:[-1,1]\rightarrow [-1,1]$ of degree $n$ with the greatest possible leading coefficient. We have some information on the values of $p$ and need something about its coefficients, so let's try Lagrange interpolation. Take some $n+1$ points $t_1 < t_2 < \dots < t_{n+1}$ in $[-1,1]$ and write down (for $u(x)=(x-t_1)\dots(x-t_{n+1})$) the formula $$ p(x)=\sum_i p(t_i) \frac{u(x)/(x-t_i)}{u'(t_i)}. $$ Now look at the coefficient of $x^n$. It equals $$ \sum_i \frac{p(t_i)}{u'(t_i)}. $$ We know that $|p(t_i)|\leq 1$, so the leading coefficient does not exceed $\sum_i 1/|u'(t_i)|$.

OK, when does equality occur? The answer: $p$ should take the value $(-1)^{n-i+1}$ at $t_i$, which is the sign of $u'(t_i)$. That is, we have to find a polynomial of degree $n$ with $n+1$ extremal values $\pm 1$ on $[-1,1]$. This may hold only if $t_1=-1$, $t_{n+1}=1$, and $t_2, \dots, t_n$ are roots of $p'$. So $1-p^2(x)$ should be divisible by $(1-x^2)p'(x)$.

At this point the trigonometric substitution $x=\cos t$, $p=\cos f$ is very natural, as we know that $1-f^2$ is divisible by $f'$ for $f=\cos$. So we invent Chebyshev polynomials.
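A small numerical sketch of the equality case (numpy assumed, $n=6$ arbitrary): at the extremal points $t_i=\cos(i\pi/n)$ of $T_n$, the bound $\sum_i 1/|u'(t_i)|$ indeed equals $2^{n-1}$, the leading coefficient of $T_n$.

```python
import numpy as np

n = 6  # arbitrary degree
# The equality analysis forces t_i to be the extremal points of T_n:
# t_i = cos(i*pi/n), i = 0, ..., n (listed in decreasing order here).
t = np.cos(np.arange(n + 1) * np.pi / n)

# u'(t_i) = prod_{j != i} (t_i - t_j) for u(x) = prod_j (x - t_j).
du = np.array([np.prod(t[i] - np.delete(t, i)) for i in range(n + 1)])

# The bound sum 1/|u'(t_i)| on the leading coefficient ...
print(np.sum(1.0 / np.abs(du)))   # ~ 32.0 = 2^{n-1} for n = 6
# ... is attained by T_n: T_n(t_i) = (-1)^i matches the sign of u'(t_i).
```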

Also, the Lagrange formula shows that Chebyshev polynomials are extremal in many other problems with the restriction $|p(x)|\leq 1$ on $[-1,1]$. For example, the value at any fixed point $x_0>1$ is also maximized by the Chebyshev polynomial; this is proved in exactly the same way.
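For instance (again a numpy sketch, with the arbitrary test point $x_0=1.5$): the Lagrange bound on $|p(x_0)|$ at the same nodes is attained by $T_n(x_0)$.

```python
import numpy as np

n, x0 = 6, 1.5   # arbitrary degree and test point x0 > 1
t = np.cos(np.arange(n + 1) * np.pi / n)   # extremal points of T_n
du = np.array([np.prod(t[i] - np.delete(t, i)) for i in range(n + 1)])
u_x0 = np.prod(x0 - t)                     # u(x0) > 0 since x0 > all t_i

# Lagrange bound on |p(x0)| over all p of degree n with |p| <= 1 on [-1, 1]:
bound = np.sum(np.abs(u_x0 / ((x0 - t) * du)))

# Value attained by the Chebyshev polynomial (T_n(x) = cosh(n*arccosh(x))
# for x > 1):
print(bound, np.cosh(n * np.arccosh(x0)))  # both ~ 161.0
```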
