Why Bernstein Polynomials of $\sqrt{x}$ Are Increasing

approximation-theory, ca.classical-analysis-and-odes, convexity, fa.functional-analysis, polynomials

Bernstein polynomials nicely preserve several global properties of the function to be approximated: if, e.g., $f:[0,1]\to\mathbb R$ is non-negative, or monotone, or convex, or if it has, say, a non-negative $17$-th derivative on $[0,1]$, it is easy to see that the same holds true for the polynomials $B_nf$. In particular, since every $B_n$ fixes all affine functions, $f\le ax+b$ implies $B_nf(x)\le ax+b$, whence it follows immediately that $B_nf\le f$ for concave $f$. On the other hand, comparing $B_nf$ and $B_{n+1}f$ turns out to be harder, due to the different choice of nodes where $f$ is evaluated. Consider for instance the Bernstein polynomials of the function $\sqrt{x}$,
$$p_n(x):=\sum_{k=0}^n{n\choose k}\sqrt{\frac kn}\,x^k(1-x)^{n-k}.$$

Question: Is this sequence of polynomials increasing? More generally, when is $B_nf$ increasing in $n$?
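Before the partial approaches below, here is a quick numerical sanity check (a sketch only; the helper name `bernstein_sqrt`, the grid, and the range of $n$ are my own choices, not part of the question):

```python
# Evaluate p_n(x) = sum_k C(n,k) sqrt(k/n) x^k (1-x)^(n-k) and check
# p_{n+1} >= p_n on a grid for small n.  A sketch, not a proof.
from math import comb, sqrt

def bernstein_sqrt(n, x):
    return sum(comb(n, k) * sqrt(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

xs = [i / 200 for i in range(201)]                 # grid on [0, 1]
for n in range(1, 30):
    gap = min(bernstein_sqrt(n + 1, x) - bernstein_sqrt(n, x) for x in xs)
    assert gap >= -1e-12, (n, gap)                 # p_{n+1} >= p_n up to rounding
print("p_{n+1} >= p_n on the grid for n = 1, ..., 29")
```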

Some tentative approaches and remarks.

1. To compare $p_{n+1}$ with $p_n$ we may write the binomial coefficients in the expression for $p_{n+1}(x)$ as ${n+1\choose k}={n\choose k}+{n\choose k-1}$; splitting the sum correspondingly into two sums, and shifting the index in the latter, we finally get
$$p_{n+1}(x)-p_n(x)=\sum_{k=0}^n{n\choose k}\bigg[x\sqrt{\frac{k+1}{n+1}}+(1-x)\sqrt{\frac k{n+1}}-\sqrt{\frac kn}\,\bigg]x^k(1-x)^{n-k},$$
which at least has non-negative terms approximately for $\frac kn<x$, which is still not decisive. (A numerical check of this identity is sketched after this list.)

2. Monotonicity of the sequence $B_nf$ is somewhat reminiscent of that of the real exponential sequence $\big(1+\frac xn\big)^n$. Precisely, let $\delta_n:f\mapsto \frac{f(\cdot+\frac1n)-f(\cdot)}{\frac1n}$ denote the discrete difference operator, and $e_0:f\mapsto f(0)$ the evaluation at $0$. Then the Bernstein operator $f\mapsto B_nf$ can be written as $B_n=e_0\displaystyle \Big({\bf 1} + \frac{x\delta_n}n\Big)^n$ (which, at least for analytic functions, converges to the Taylor series $e^{xD}$ at $0$); a numerical check of this operator form is sketched after this list. Unfortunately, the analogy seems to stop here.

3. The picture below shows the graphs of $\sqrt x$ and of the first ten $p_n(x)$; a script reproducing it is sketched after this list. (The convergence is somewhat slow; indeed it is $O(n^{-1/4})$, as per Kac's general estimate $O(n^{-\frac \alpha2})$ for $\alpha$-Hölder functions.) The picture leaves some doubts about the endpoints; yet there should be no surprise, since $p_n(0)=0$, $p_n'(0)=\sqrt{n}\uparrow+\infty$, $p_n(1)=1$, $p_n'(1)=\frac1{1+\sqrt{1-\frac1n}}\downarrow\frac12$.
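A quick numerical check of the identity in remark 1 (a sketch; the helper names are mine, and `bernstein_sqrt` is repeated so the snippet runs on its own):

```python
# Compare p_{n+1}(x) - p_n(x) with the bracketed sum from remark 1
# at a few sample points.  Sketch only.
from math import comb, sqrt

def bernstein_sqrt(n, x):
    return sum(comb(n, k) * sqrt(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def difference_via_identity(n, x):
    # Right-hand side of the identity in remark 1.
    return sum(comb(n, k)
               * (x * sqrt((k + 1) / (n + 1)) + (1 - x) * sqrt(k / (n + 1))
                  - sqrt(k / n))
               * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

for n in (1, 2, 5, 10, 20):
    for x in (0.0, 0.1, 0.3, 0.5, 0.9, 1.0):
        lhs = bernstein_sqrt(n + 1, x) - bernstein_sqrt(n, x)
        rhs = difference_via_identity(n, x)
        assert abs(lhs - rhs) < 1e-12, (n, x, lhs, rhs)
print("identity verified at the sample points")
```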
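Next, a sketch of the operator form from remark 2, checked against the direct Bernstein sum (the function names are mine; one application of ${\bf 1}+\frac{x\delta_n}{n}$ sends $g(t)$ to $(1-x)\,g(t)+x\,g(t+\frac1n)$):

```python
# Check B_n f(x) = e_0 (1 + x*delta_n/n)^n f numerically for f = sqrt.
from math import comb, sqrt

def bernstein_direct(f, n, x):
    return sum(comb(n, k) * f(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def bernstein_operator(f, n, x):
    # Apply (1 + x*delta_n/n) to f a total of n times, then evaluate at 0.
    g = f
    for _ in range(n):
        g = (lambda h: (lambda t: (1 - x) * h(t) + x * h(t + 1 / n)))(g)
    return g(0.0)

f = lambda t: sqrt(t)
for n in (1, 3, 7, 12):
    for x in (0.0, 0.25, 0.5, 0.8, 1.0):
        assert abs(bernstein_direct(f, n, x) - bernstein_operator(f, n, x)) < 1e-12
print("operator form agrees with the direct sum at the sample points")
```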
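Finally, a sketch reproducing the picture mentioned in remark 3, together with the endpoint slopes (the plotting choices are mine; it assumes numpy and matplotlib are available):

```python
# Plot sqrt(x) and its first ten Bernstein polynomials, and print the
# endpoint slopes p_n'(0) = n*(f(1/n)-f(0)) and p_n'(1) = n*(f(1)-f(1-1/n)).
import numpy as np
import matplotlib.pyplot as plt
from math import comb, sqrt

def bernstein_sqrt(n, x):
    return sum(comb(n, k) * sqrt(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

xs = np.linspace(0.0, 1.0, 400)
plt.plot(xs, np.sqrt(xs), "k", lw=2, label=r"$\sqrt{x}$")
for n in range(1, 11):
    plt.plot(xs, [bernstein_sqrt(n, x) for x in xs], lw=0.8)
plt.legend()
plt.title(r"$\sqrt{x}$ and its first ten Bernstein polynomials")

for n in (2, 10, 100):
    print(n, n * sqrt(1 / n), n * (1 - sqrt(1 - 1 / n)))   # sqrt(n) and 1/(1+sqrt(1-1/n))
plt.show()
```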


Best Answer

As noted by Paata Ivanishvili, if $f$ is concave on $[0,1]$, then the Bernstein polynomials $B_n(f,p)$ are increasing in $n$. Here is a probabilistic proof:

Let $I_j$, $j \ge 1$, be independent random variables taking the value $1$ with probability $p$ and $0$ with probability $1-p$. Then $X_n:=\sum_{j=1}^n I_j$ has a binomial Bin$(n,p)$ distribution, and the Bernstein polynomial can be written as $B_n(f,p)=E[f(X_n/n)]$. Now for every $j \in \{1,\dots,n+1\}$, the random variable $Y_j=Y_j(n):=X_{n+1}-I_j$ also has a Bin$(n,p)$ distribution, and since $\sum_{j=1}^{n+1} Y_j=(n+1)X_{n+1}-X_{n+1}=nX_{n+1}$, $$ X_{n+1} = \sum_{j=1}^{n+1} \frac{Y_j}{n} \, .$$ For concave $f$, Jensen's inequality gives $$ f \left(\frac{\sum_{j=1}^{n+1} (Y_j/n)}{n+1} \right) \ge \frac{\sum_{j=1}^{n+1} f(Y_j/n)}{n+1} \, ,$$ whence $$B_{n+1}(f,p)=E\, f \left(\frac{X_{n+1}}{n+1}\right)=E\, f \left(\frac{\sum_{j=1}^{n+1} (Y_j/n)}{n+1} \right) \ge E \left(\frac{\sum_{j=1}^{n+1} f(Y_j/n)}{n+1} \right) =B_n(f,p)\, .$$ In particular, $\sqrt{x}$ is concave on $[0,1]$, so the sequence $p_n$ in the question is increasing.
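A small simulation of the coupling (a sketch; the sample size, seed, and the choice $f=\sqrt{\cdot}$ are illustrative only): for each realization of $I_1,\dots,I_{n+1}$ it checks the pathwise Jensen inequality $f\big(\frac{X_{n+1}}{n+1}\big)\ge\frac1{n+1}\sum_j f(Y_j/n)$, i.e. the inequality before expectations are taken.

```python
# Simulate the coupling X_{n+1} = sum_j Y_j / n with Y_j = X_{n+1} - I_j
# and verify the pathwise Jensen inequality for a concave f.
import random
from math import sqrt

f = sqrt                        # any concave f on [0, 1] works here
n, p, trials = 10, 0.3, 10_000
random.seed(0)

for _ in range(trials):
    I = [1 if random.random() < p else 0 for _ in range(n + 1)]
    X = sum(I)                              # X_{n+1} ~ Bin(n+1, p)
    Y = [X - I[j] for j in range(n + 1)]    # each Y_j ~ Bin(n, p)
    lhs = f(X / (n + 1))
    rhs = sum(f(Yj / n) for Yj in Y) / (n + 1)
    assert lhs >= rhs - 1e-12
print("pathwise Jensen inequality held in every trial")
```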
