$|f'(x)| \le g(x)$ implies $|f(b) - f(a)| \le \int_a^b g(x) \, dx$, without assuming $f'$ to be integrable.

integration, real-analysis, reference-request

In a recent answer I needed an argument of the following kind:

Let $f, g: [a, b] \to \Bbb R$ be functions with the following
properties:

  • $f$ is differentiable,
  • $g$ is continuous,
  • $|f'(x)| \le g(x)$ for all $x \in [a, b]$.

Then $|f(b) - f(a)| \le \int_a^b g(x) \, dx$.

This seems pretty easy: we have
$$ \tag{*}
|f(b) - f(a)| = \left| \int_a^b f'(x) \, dx \right|
\le \int_a^b |f'(x)| \, dx \le \int_a^b g(x) \, dx \, .
$$

There is just one problem: we have used the fundamental theorem of calculus, which requires $f'$ to be Riemann integrable (or $f$ to be absolutely continuous with $f'$ Lebesgue integrable); compare Necessity of a hypothesis in the fundamental theorem of calculus. This is satisfied, for example, if $f'$ is continuous, but a derivative need not be Riemann integrable (Volterra's function is a classical example of a differentiable function with a bounded derivative that is not Riemann integrable).
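As a quick sanity check of the chain $(*)$ in the easy case where $f'$ is continuous, here is a small numerical sketch; the particular choices $f(x) = \sin(x^2)$ and $g(x) = 2x$ on $[0, 2]$ are illustrative assumptions, not part of the question.

```python
import numpy as np

# Hypothetical smooth example: f(x) = sin(x^2), so f'(x) = 2x cos(x^2),
# and g(x) = 2x is a continuous bound with |f'(x)| <= g(x) on [0, 2].
a, b, n = 0.0, 2.0, 100_000
x = np.linspace(a, b, n + 1)
mid = (x[:-1] + x[1:]) / 2                   # midpoints for midpoint Riemann sums
dx = (b - a) / n

f = lambda t: np.sin(t**2)
fp = lambda t: 2 * t * np.cos(t**2)          # f' (continuous, so the FTC applies)
g = lambda t: 2 * t

lhs = abs(f(b) - f(a))                       # |f(b) - f(a)|
int_abs_fp = np.sum(np.abs(fp(mid))) * dx    # ~ integral of |f'| over [a, b]
int_g = np.sum(g(mid)) * dx                  # ~ integral of g over [a, b]

# Expect lhs <= int_abs_fp <= int_g, mirroring the chain (*).
print(f"{lhs:.4f} <= {int_abs_fp:.4f} <= {int_g:.4f}")
```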

But the above statement holds without additional assumptions on $f'$: Let $n$ be a positive integer and $x_k = a + \frac kn (b-a)$, $0 \le k \le n$, be a partition of the interval $[a, b]$. We apply the mean-value theorem to each subinterval $[x_k, x_{k+1}]$:
$$
f(x_{k+1}) - f(x_k) = (x_{k+1} - x_k) f'(c_{n, k})
$$

for some $c_{n, k} \in [x_k, x_{k+1}]$. It follows that
$$
|f(b) - f(a)| \le \sum_{k=0}^{n-1} |f(x_{k+1}) - f(x_k)| = \sum_{k=0}^{n-1} (x_{k+1} - x_k) |f'(c_{n, k})| \le \sum_{k=0}^{n-1} (x_{k+1} - x_k) g(c_{n, k}) \, .
$$

Since $g$ is continuous, it is Riemann integrable. The expression on the right is a Riemann sum for $\int_a^b g(x) \, dx$ with mesh size $(b-a)/n$. It follows that
$$
\lim_{n \to \infty } \sum_{k=0}^{n-1} (x_{k+1} - x_k) g(c_{n, k}) = \int_a^b g(x) \, dx
$$

and therefore $|f(b) – f(a)| \le \int_a^b g(x) \, dx$.
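For what it is worth, the partition argument can be illustrated numerically. The choices $f(x) = x^2 \sin(1/x)$ (with $f(0) = 0$) and $g(x) = 2x + 1$ on $[0, 1]$ are only illustrative assumptions; since this particular $g$ is increasing, $g(c_{n,k}) \le g(x_{k+1})$, so the right-endpoint Riemann sum of $g$ is a computable upper bound for the sum $\sum_k (x_{k+1} - x_k)\, g(c_{n,k})$ appearing in the proof.

```python
import numpy as np

# Hypothetical example: f(x) = x^2 sin(1/x) with f(0) = 0, whose derivative
# f'(x) = 2x sin(1/x) - cos(1/x) is discontinuous at 0 but satisfies
# |f'(x)| <= 2x + 1 =: g(x) on [0, 1].
def f(x):
    x = np.asarray(x, dtype=float)
    safe = np.where(x > 0, x, 1.0)               # avoid division by zero at x = 0
    return np.where(x > 0, safe**2 * np.sin(1.0 / safe), 0.0)

def g(x):
    return 2.0 * np.asarray(x, dtype=float) + 1.0

a, b = 0.0, 1.0
exact_int_g = 2.0                                # integral of 2x + 1 over [0, 1]

for n in (10, 100, 1000, 10_000):
    xk = np.linspace(a, b, n + 1)                # partition x_k = a + k(b-a)/n
    dx = (b - a) / n
    telescoping = np.sum(np.abs(np.diff(f(xk))))  # sum of |f(x_{k+1}) - f(x_k)|
    # Because this particular g is increasing, g(c_{n,k}) <= g(x_{k+1}),
    # so the right-endpoint Riemann sum bounds the sum from the proof.
    riemann_g = np.sum(g(xk[1:])) * dx
    print(n, abs(f(b) - f(a)), telescoping, riemann_g, exact_int_g)
```

In the printed output, $|f(b) - f(a)|$ stays below the telescoping sum, which stays below the right-endpoint Riemann sum, which in turn approaches $\int_a^b g(x) \, dx = 2$ as $n$ grows.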

My question: Is there a simpler proof of the statement?

The statement seems pretty basic, so I wonder whether there is a simpler proof that avoids technical details such as partitions and Riemann sums. Perhaps it is a consequence of some other theorem about integration which I failed to find?

Best Answer

For $\epsilon \in \{1, -1\}$, define

$$ k_{\epsilon}(x) = \int_{a}^{x} g(t) \, \mathrm{d}t - \epsilon(f(x) - f(a)). $$

Then

$$ k_{\epsilon}(a) = 0 \qquad \text{and} \qquad k_{\epsilon}'(x) = g(x) - \epsilon f'(x) \geq 0 , $$

and so, since a function with nonnegative derivative is nondecreasing (by the mean value theorem, which needs no integrability of $k_{\epsilon}'$), we have $k_{\epsilon}(x) \geq 0$ for all $x \in [a, b]$ and $\epsilon \in \{1, -1\}$. Taking $x = b$ gives $\pm (f(b) - f(a)) \le \int_a^b g(t) \, \mathrm{d}t$, which is the desired claim.
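A small numerical sketch of this argument, using the illustrative choices $f(x) = \sin(x^2)$ and $g(x) = 2x$ on $[0, 2]$ (assumptions for the demo, as is the grid-based approximation of $x \mapsto \int_a^x g$): both $k_{\pm 1}$ start at $0$ and stay nonnegative, and $k_{\pm 1}(b) \ge 0$ is exactly the claimed bound.

```python
import numpy as np

# Illustrative assumptions: f(x) = sin(x^2), g(x) = 2x on [a, b] = [0, 2],
# so that |f'(x)| = |2x cos(x^2)| <= g(x).
a, b, n = 0.0, 2.0, 100_000
x = np.linspace(a, b, n + 1)
dx = (b - a) / n

f = lambda t: np.sin(t**2)
g = lambda t: 2.0 * t

# Cumulative midpoint approximation of x -> integral of g from a to x,
# evaluated at the grid points, with the value 0 prepended at x = a.
G = np.concatenate(([0.0], np.cumsum(g((x[:-1] + x[1:]) / 2) * dx)))

for eps in (1, -1):
    k_eps = G - eps * (f(x) - f(a))          # k_eps(x) = int_a^x g - eps (f(x) - f(a))
    print(eps, k_eps.min(), k_eps[-1])       # minimum over the grid, and k_eps(b)

# k_eps(b) >= 0 for eps = +1 and eps = -1 says exactly |f(b) - f(a)| <= int_a^b g.
```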