[Math] Prove that this function is differentiable

derivatives, lipschitz-functions, real-analysis

I came across this problem while I was studying for a preliminary exam and now I've devoted quite some time to it and can't figure it out. Any help would be greatly appreciated!

Let $f : \mathbb R \to \mathbb R$ be a Lipschitz function such that for all $x \in \mathbb R$, $$\lim_{n\to\infty} n \left[f\left( x + \tfrac 1 n \right) - f(x)\right] = 0.$$ Prove that $f$ is differentiable.

EDIT: As has been pointed out in the comments, simply making the substitution $h=1/n$ is not sufficient. If we know a priori that $f$ is differentiable at $x \in \mathbb R$, then certainly the above limit gives the derivative at $x$. However, to prove that $f$ is differentiable at $x$, we need to prove the stronger assertion: that the limit $$\lim_{n\to \infty} \frac{f(x + h_n) - f(x)}{h_n}$$ exists, with the same value, for every sequence $\{h_n\}$ of nonzero reals with $h_n \to 0$ (not just $h_n = 1/n$).
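To see concretely why convergence along the single sequence $h_n = 1/n$ is weaker than differentiability, here is a small numerical illustration (my own example, not from the original post): $f(x) = |x|$ is Lipschitz, and at $x = 0$ the difference quotients along $h_n = 1/n$ converge (to $1$), yet $f$ is not differentiable at $0$, since along $h_n = -1/n$ the quotients converge to $-1$. (This $f$ does not satisfy the problem's hypothesis, since its limit along $1/n$ is $1$ rather than $0$; it only illustrates the one-sequence issue.)

```python
# f(x) = |x|: Lipschitz with constant 1, not differentiable at 0.
def f(x):
    return abs(x)

# Difference quotients at x = 0 along h = 1/n and h = -1/n.
quotients_right = [(f(1 / n) - f(0)) / (1 / n) for n in (10, 100, 1000)]
quotients_left = [(f(-1 / n) - f(0)) / (-1 / n) for n in (10, 100, 1000)]

print(quotients_right)  # every entry is 1.0
print(quotients_left)   # every entry is -1.0
```

So a limit along one particular null sequence can exist while the two-sided limit defining $f'(0)$ does not.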

Addressing a comment: the brackets in the problem statement are not meant to indicate the "floor" function; they are just meant to be parentheses. Sorry for the confusion.

Best Answer

Since $f$ is Lipschitz on $[a,b],$ $f$ is absolutely continuous on $[a,b].$ Thus $f'(x)$ exists a.e. in $[a,b],$ $f'\in L^1[a,b],$ and

$$f(x) = f(a) + \int_a^xf'(t)\,dt, \,\,x\in [a,b].$$

At any point $x$ where $f'(x)$ exists, the difference quotients converge to $f'(x)$ along every null sequence, in particular along $h_n = 1/n$; so the assumption in this problem gives $f'(x) = 0$ wherever $f'(x)$ exists, i.e. a.e. in $[a,b].$ Hence the integrand above vanishes a.e., the integral is $0$ for all $x\in [a,b],$ and $f$ is constant on $[a,b].$ Since $[a,b]$ is arbitrary, $f$ is constant on $\mathbb R$; in particular $f$ is differentiable everywhere, with $f' \equiv 0,$ as desired.
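As a hedged numerical sanity check of this conclusion (my own illustration, not part of the answer): a non-constant Lipschitz function such as $\sin x$ must violate the hypothesis somewhere, since $n[\sin(x + 1/n) - \sin x] \to \cos x$, which is nonzero for most $x$; a constant function satisfies the hypothesis trivially.

```python
import math

def scaled_increment(f, x, n):
    # n * [f(x + 1/n) - f(x)], the quantity in the problem's hypothesis
    return n * (f(x + 1 / n) - f(x))

x = 0.0
# For f = sin, the scaled increments approach cos(0) = 1, not 0,
# so sin fails the hypothesis at x = 0 -- consistent with the theorem.
vals = [scaled_increment(math.sin, x, n) for n in (10, 1000, 100000)]
print(vals[-1])  # close to 1.0

# For a constant function the scaled increments are identically 0.
const_vals = [scaled_increment(lambda t: 3.0, x, n) for n in (10, 1000, 100000)]
print(const_vals)  # all exactly 0.0
```

This matches the proof: the only Lipschitz functions satisfying the hypothesis at every point are the constants, and those are (trivially) differentiable.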
