Functional Analysis – Does the Derivative of a Continuous Function Go to Zero if the Function Converges?

Tags: derivatives, functional-analysis, limits

Physicist here. I am puzzled by a question: given a continuous function $g :\mathbb{R} \rightarrow \mathbb{R}$ that goes to zero at infinity, I am interested in the behavior of its derivative $\beta = g'$. Specifically, does it go to zero too? Writing it out on paper, it looks like:
$$ \beta(+\infty)=\lim_{x\rightarrow \infty}\lim_{h\rightarrow 0} \frac{g(x+h)-g(x)}{h}$$
And if I can interchange the two limits, I get what I expect: $\beta(+\infty)=0$. So I am curious about the hypotheses behind this interchange. Is the requirement of continuity enough?
Thanks.

Best Answer

No, a counterexample is $$ \begin{eqnarray} g(x) &=& \frac{\sin(x^2)}{x} \text{,} \\ g'(x) &=& 2\cos(x^2) - \frac{\sin(x^2)}{x^2} \end{eqnarray} $$ (the second line follows from the quotient rule). $g$ obviously goes to zero as $x \to \infty$, but $g'(x)$ doesn't: the $2\cos(x^2)$ term keeps oscillating between $-2$ and $2$. In general, to swap two limits you need the inner limit to converge uniformly with respect to the outer variable. In the case of this counterexample, you don't have that.
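One standard way to make that precise is the Moore–Osgood theorem. A sketch of how it specializes here, assuming $g$ is differentiable: since $g(x) \to 0$, for each fixed $h \neq 0$ we automatically have $$ \lim_{x\rightarrow\infty} \frac{g(x+h)-g(x)}{h} = 0 \text{,} $$ so if in addition the difference quotient converged to $g'(x)$ uniformly in $x$ as $h \to 0$, both iterated limits would exist and agree, forcing $g'(x) \to 0$. The counterexample above fails exactly that uniformity.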

(Technically, my counterexample is continuous only on $\mathbb{R}\setminus\{0\}$, but since you're interested only in $g$'s asymptotic behaviour, that doesn't really matter. You can obviously make it continuous on the whole real line by adjusting it on some interval around $0$, which won't change the asymptotic behaviour at all.)
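A quick numerical check makes the contrast visible. Here is a minimal sketch in Python/NumPy (the window $[100, 100.5]$ is an arbitrary choice of a region far from the origin):

```python
import numpy as np

# Sample g(x) = sin(x^2)/x and its derivative on a short window far from 0.
# x^2 sweeps through ~100 radians here, so cos(x^2) completes many full
# oscillations: g' repeatedly comes close to +/-2 while |g| stays near 1/x.
x = np.linspace(100.0, 100.5, 20000)

g = np.sin(x**2) / x
g_prime = 2 * np.cos(x**2) - np.sin(x**2) / x**2

print(f"max |g(x)|  on the window: {np.abs(g).max():.4f}")       # about 0.01
print(f"max |g'(x)| on the window: {np.abs(g_prime).max():.4f}")  # about 2.00
```

However far out you slide the window, $g'$ keeps attaining values near $\pm 2$, which is exactly the failure of $g'(x) \to 0$.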
