[Math] Singular functions

analysis

I will start with a definition.

A monotone function $f$ on $[a,b]$ is called singular if $f'=0$ almost everywhere.
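As a concrete example (my own illustration, not from the original post): the Cantor function is the classic nondecreasing singular function. It climbs from $0$ to $1$ on $[0,1]$, yet is locally constant on each removed middle third, and those middle thirds have full measure. A minimal numerical sketch, with `cantor` an illustrative name:

```python
def cantor(x, depth=50):
    """Approximate the Cantor (devil's staircase) function on [0, 1].

    Ternary digits of x are read off one at a time; every middle third
    maps to a constant value, which is exactly where f' = 0.
    """
    result, step = 0.0, 0.5
    for _ in range(depth):
        if x < 1/3:               # left third: value stays in lower half
            x = 3 * x
        elif x < 2/3:             # middle third: f is constant here
            return result + step
        else:                     # right third: value lands in upper half
            result += step
            x = 3 * x - 2
        step /= 2
    return result                 # x stayed in the Cantor set to this depth

# f rises from 0 to (nearly) 1, yet f' = 0 on (1/3, 2/3), (1/9, 2/9), ...
assert cantor(0.4) == cantor(0.6) == 0.5   # flat on the middle third
```

The function is nondecreasing and continuous, but all of its growth happens on the Cantor set, a set of measure zero, so $f'=0$ almost everywhere.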

Let $f$ be a nondecreasing function on $[a,b]$ with the following property: given $\epsilon, \delta > 0$, there exists a finite collection $\{[y_k,x_k]\}$ of nonoverlapping intervals such that $$\sum_k |x_k-y_k|\lt \delta \quad\text{and}\quad \sum_k\left(f(x_k)-f(y_k)\right)\gt f(b)-f(a)-\epsilon.$$
I would like to show that $f$ is singular.
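For intuition (my own numerical check, not part of the question): the Cantor function satisfies this hypothesis. Taking $\{[y_k,x_k]\}$ to be the $2^n$ closed intervals remaining at stage $n$ of the middle-thirds construction, the total length $(2/3)^n$ can be made smaller than any $\delta$, while the increments of $f$ across them still sum to essentially $f(1)-f(0)=1$. A sketch, where `cantor` and `cantor_stage` are illustrative helpers:

```python
def cantor(x, depth=50):
    """Approximate the Cantor function on [0, 1] (nondecreasing, singular)."""
    result, step = 0.0, 0.5
    for _ in range(depth):
        if x < 1/3:
            x = 3 * x
        elif x < 2/3:
            return result + step
        else:
            result += step
            x = 3 * x - 2
        step /= 2
    return result

def cantor_stage(n):
    """The 2**n closed intervals [y_k, x_k] left after n middle-third removals."""
    intervals = [(0.0, 1.0)]
    for _ in range(n):
        intervals = [piece for a, b in intervals
                     for piece in ((a, a + (b - a) / 3), (b - (b - a) / 3, b))]
    return intervals

intervals = cantor_stage(8)
total_length = sum(x - y for y, x in intervals)                # (2/3)**8, small
total_rise = sum(cantor(x) - cantor(y) for y, x in intervals)  # close to 1
```

Increasing the stage `n` drives `total_length` below any prescribed $\delta$ while `total_rise` stays near $1$, which is the shape of the hypothesis above.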

Attempt:

From a previous exercise, I showed that a monotone function $f$ can be written as the sum of an absolutely continuous function $g$ and a singular function $h$. Thus $$ f = g + h, \quad\text{where}\quad g(x)=\int_a^x f' .$$
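As a sanity check on this decomposition (my own illustration, not from the exercise): for $f(x) = x + C(x)$ on $[0,1]$, with $C$ the Cantor function, the absolutely continuous part is $g(x)=\int_0^x f' = x$ and the singular part is $h = C$. This $f$ does not satisfy the hypothesis of the problem; it only illustrates the splitting. A rough numerical check that $f' = 1$ almost everywhere (so $g(x)=x$), with `cantor` an illustrative helper:

```python
def cantor(x, depth=50):
    """Approximate the Cantor function C on [0, 1] (the singular part h)."""
    result, step = 0.0, 0.5
    for _ in range(depth):
        if x < 1/3:
            x = 3 * x
        elif x < 2/3:
            return result + step
        else:
            result += step
            x = 3 * x - 2
        step /= 2
    return result

def f(x):
    """f = g + h with g(x) = x absolutely continuous and h = C singular."""
    return x + cantor(x)

# Away from the (measure-zero) Cantor set, C is locally constant, so a
# centered difference recovers f' = g' = 1:
eps = 1e-6
slope = (f(0.5 + eps) - f(0.5 - eps)) / (2 * eps)
assert abs(slope - 1.0) < 1e-3
```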
My goal is to show that $g=0$, so that $f=h$ is singular. Let $I=\bigcup_k (y_k,x_k).$ Then
$$ \int_I f' = \sum_k \int_{(y_k,x_k)} f' ,$$
and since $m(I)=\sum_k |x_k-y_k|\lt\delta$ and $f'$ is integrable, absolute continuity of the integral lets me choose $\delta$ small enough that $\int_I f'\lt \epsilon$. On the complement, $[a,b]\setminus I$ is a finite union of intervals, and on each such interval $[c,d]$ monotonicity gives $\int_c^d f' \leq f(d)-f(c)$; summing these,
$$\int_{[a,b]\setminus I} f' \leq f(b)-f(a)-\sum_k\left(f(x_k)-f(y_k)\right)\lt \epsilon. $$ But $$\begin{align*}
0\leq \int_a^b f' & = \int_I f'+\int_{[a,b]\setminus I} f'\\
& \lt \epsilon + \epsilon\\
& = 2\epsilon.
\end{align*}$$
Since $\epsilon$ is arbitrary, $$ \int_a^b f'=0 .$$ Because $f'\geq 0$ almost everywhere, it follows that $f'=0$ almost everywhere, and hence $g=0$.

Is what I've done right? Is there another way of approaching the problem?
Thanks.

Best Answer

Given $\varepsilon>0$, there is a $\delta>0$ such that $\int_I f'<\varepsilon$ whenever $m(I)<\delta$, by absolute continuity of the integral. Thus you can choose $I=\bigcup_k(y_k,x_k)$ such that $\int_I f'<\varepsilon$ and $\sum_k \left(f(x_k)-f(y_k)\right)>f(b)-f(a)-\varepsilon$.

This implies that

$\begin{align*} &f(b)-f(a)-(h(b)-h(a))\\ &\leq f(b)-f(a) - \sum_k\left(h(x_k)-h(y_k)\right)\\ &<\sum_k\left(g(x_k)-g(y_k)\right)+\varepsilon\\ &=\int_I f' +\varepsilon\\ &<2\varepsilon. \end{align*}$

Since $\varepsilon$ was arbitrary, this implies that $f(b)-f(a)=h(b)-h(a)$, so $g(b)-g(a)=0$. Since $g(a)=0$ and $g$ is nondecreasing (because $f'\geq 0$ almost everywhere), this in turn implies that $g=0$ everywhere, so $f=h$ is singular. (Note how monotonicity is used here several times.)
