Measure Theory – Why Does the Monotone Convergence Theorem Not Apply to Riemann Integrals?

Tags: examples-counterexamples, measure-theory, riemann-integration

I just learned the following version of the monotone convergence theorem in my measure theory class:

For every monotonically increasing sequence of measurable functions $f_n$ from a measure space $X$ to $[0, \infty]$,
$$
\text{if}\quad
\lim_{n\to \infty}f_n = f,
\quad\text{then}\quad
\lim_{n\to \infty}\int f_n \, \mathrm{d}\mu = \int f \,\mathrm{d}\mu
.
$$
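To make the statement concrete, here is a numerical sketch of my own (not part of the original question): take $f_n(x) = \min(n, 1/\sqrt{x})$ on $(0,1]$, which increases pointwise to $f(x) = 1/\sqrt{x}$. A direct computation gives $\int_0^1 f_n = 2 - 1/n$, which increases to $\int_0^1 f = 2$, exactly as the theorem predicts.

```python
# Midpoint-rule approximation of the integral of f_n(x) = min(n, x^{-1/2})
# over [0, 1]. Illustrative sketch only: the exact value is 2 - 1/n,
# which increases to the integral of the limit f(x) = x^{-1/2}, namely 2.

def integral_fn(n, steps=100_000):
    h = 1.0 / steps
    total = 0.0
    for k in range(steps):
        x = (k + 0.5) * h               # midpoint of the k-th subinterval
        total += min(n, x ** -0.5) * h  # f_n is bounded by n, so the sum converges
    return total

for n in (1, 2, 4, 8):
    print(n, round(integral_fn(n), 4))  # approaches 2 - 1/n
```

Each $f_n$ is bounded and continuous except at one point, so it is both Riemann and Lebesgue integrable; the limit $f$ is unbounded, hence not Riemann integrable on $[0,1]$, yet the monotone convergence theorem still applies on the Lebesgue side.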

I tried to find out why this theorem applies only to the Lebesgue integral, but I couldn't find a counterexample for Riemann integrals, so I would appreciate your help.

(I guess that $f$ might fail to be Riemann integrable in some cases, but I want a concrete example.)

Best Answer

Riemann integrable functions (on a compact interval) are also Lebesgue integrable, and the two integrals coincide. So whenever the limit function happens to be Riemann integrable, the conclusion of the theorem holds for Riemann integrals as well.

However, the pointwise increasing limit of a sequence of Riemann integrable functions need not be Riemann integrable. Let $(r_n)$ be an enumeration of the rationals in $[0,1]$, and define $f_n$ as follows:

$$f_n(x) = \begin{cases} 1 & \text{if } x \in \{ r_0, r_1, \dots, r_{n-1} \} \\ 0 & \text{otherwise} \end{cases}$$

Each $f_n$ is Riemann integrable, since it differs from the zero function at only finitely many points, and $\int_0^1 f_n = 0$. But the pointwise limit is the Dirichlet function $\mathbf{1}_{\mathbb{Q} \cap [0,1]}$, which is discontinuous at every point of $[0,1]$ and hence not Riemann integrable.
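The failure can be checked directly with Darboux sums. Every subinterval of any partition of $[0,1]$ contains both a rational and an irrational, so every upper sum of the limit function equals $1$ while every lower sum equals $0$. Here is a small sketch of my own using exact arithmetic with `fractions.Fraction` (the helper `rational_in` is mine, not standard library):

```python
from fractions import Fraction

def rational_in(a: Fraction, b: Fraction) -> Fraction:
    """Return a rational strictly inside (a, b), trying denominators q = 1, 2, 3, ..."""
    q = 1
    while True:
        p = a.numerator * q // a.denominator + 1  # smallest p with p/q > a
        if Fraction(p, q) < b:
            return Fraction(p, q)
        q += 1

def darboux_sums(partition):
    """Upper and lower Darboux sums of the Dirichlet function 1_{Q ∩ [0,1]}
    over a partition given as an increasing list of Fraction breakpoints."""
    upper = lower = Fraction(0)
    for a, b in zip(partition, partition[1:]):
        # every open subinterval contains a rational ...
        assert a < rational_in(a, b) < b
        # ... and also an irrational, so sup f = 1 and inf f = 0 on [a, b]
        upper += (b - a) * 1
        lower += (b - a) * 0
    return upper, lower

pts = [Fraction(k, 7) for k in range(8)]  # an arbitrary partition of [0, 1]
print(darboux_sums(pts))                  # upper sum 1, lower sum 0
```

Since the upper and lower sums stay at $1$ and $0$ no matter how fine the partition is, the Riemann integral of the limit does not exist, while its Lebesgue integral is simply $\mu(\mathbb{Q} \cap [0,1]) = 0$, the limit of $\int f_n = 0$.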
