[Math] If $g$ is continuous and $g(1)=0$, then $x^ng(x)$ converges uniformly on $[0,1]$

continuity, convergence-divergence, limits, real-analysis, sequences-and-series

Suppose $g:[0,1]\to\mathbb R$ is a continuous function satisfying $g(1)=0$. Prove that the functions $f_n(x)=x^ng(x)$ converge uniformly on $[0,1]$. Hence, or by using the Mean Value Theorem, prove that if $\int_0^1x^ng(x)\,dx=0$ for every $n\in\mathbb N$, then $g$ is identically $0$.

First of all, I don't think the convergence is uniform. The pointwise limit is $0$. Now consider $\sup_{x\in[0,1]}x^n|g(x)|=c_n^n|g(c_n)|$ for some $c_n$ depending on $n$; the supremum is attained because $x^n|g(x)|$ is continuous on the compact interval $[0,1]$. But as $n$ varies, $c_n$ varies with it, so I don't see how to conclude anything.
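
As a sanity check (not a proof), take $g(x)=1-x$: then $f_n(x)=x^n(1-x)$ has $f_n'(x)=x^{n-1}\bigl(n-(n+1)x\bigr)$, so the maximum is attained at $c_n=\frac{n}{n+1}\to1$, yet $$\sup_{x\in[0,1]}f_n(x)=\Bigl(\frac{n}{n+1}\Bigr)^n\frac1{n+1}\le\frac1{n+1}\to0,$$ so at least in this example the convergence is uniform even though $c_n$ drifts towards $1$.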

Secondly, I don't understand how the first part implies the second part of the question. I obtained the second part via Bernstein polynomials (the Weierstrass Approximation Theorem), using the fact that these polynomials converge uniformly to $g$. I tried the Mean Value Theorem but could not proceed further.
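
A compressed version of that Weierstrass argument (assuming the hypothesis also holds for $n=0$; if $\mathbb N$ starts at $1$, apply the same reasoning to $xg(x)$): by linearity $\int_0^1 p(x)g(x)\,dx=0$ for every polynomial $p$, so taking polynomials $p_k\to g$ uniformly on $[0,1]$, $$\int_0^1 g(x)^2\,dx=\lim_{k\to\infty}\int_0^1 p_k(x)g(x)\,dx=0,$$ and since $g^2$ is continuous and non-negative, this forces $g\equiv0$.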

But this leads to a new question:

Suppose $f$ is a continuous non-zero function on a closed bounded interval $[a,b]$. Then is it true that $f$ cannot have infinitely many roots in $[a,b]$?

If I know the answer to this question, I can answer the second part of the original question (taking $f=g$). The reasoning is that for every $n$, the Mean Value Theorem for integrals gives some $c_n\in[0,1]$ such that $g(c_n)\int_0^1x^n\,dx=0$, hence $g(c_n)=0$ for all $n$. I want to show this is a contradiction if $g$ is not identically zero.
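
Spelling out that Mean Value Theorem step (the version for integrals needs a weight of constant sign, which $x^n\ge0$ is on $[0,1]$):
$$0=\int_0^1 x^n g(x)\,dx=g(c_n)\int_0^1 x^n\,dx=\frac{g(c_n)}{n+1}\quad\text{for some }c_n\in[0,1],$$
and since $\frac1{n+1}\ne0$, indeed $g(c_n)=0$ for every $n$.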

Best Answer

For the first part: let $\epsilon > 0$. As $g$ is continuous and $g(1)=0$, there is $0 < \delta < 1$ such that $$x > 1-\delta \implies |g(x)| < \frac\epsilon2.$$

As $g$ is bounded by some $M$, for every $x\in[0,1]$, $$ |x^n g(x)| \le \sup_{t \le 1-\delta} |t^n g(t)| + \sup_{t > 1-\delta} |t^n g(t)| \le M(1-\delta)^n + \frac\epsilon2 \le \epsilon $$ once $n \ge \frac{\log(\epsilon/(2M))}{\log (1-\delta)}$, so that $M(1-\delta)^n \le \frac\epsilon2$. Hence $\sup_{x\in[0,1]}|f_n(x)|\le\epsilon$ for all large $n$, i.e. $f_n\to0$ uniformly on $[0,1]$.
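
Where that threshold comes from (assuming $2M>\epsilon$; otherwise any $n$ works):
$$ M(1-\delta)^n\le\frac\epsilon2 \iff n\log(1-\delta)\le\log\frac{\epsilon}{2M} \iff n\ge\frac{\log(\epsilon/(2M))}{\log(1-\delta)}, $$
the inequality flipping in the last step because $\log(1-\delta)<0$.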
