Measure Theory – Approximating Bounded Measurable Functions with Continuous Functions

analysis, measure-theory

This is not homework. I was reading a paper where the authors showed a result for all continuous functions and then just proceeded to write "the usual limiting argument gives the result for all bounded functions" – so I am asking myself what this "usual limiting argument" might be. I do not know whether they mean uniform or pointwise convergence. As I see it, pointwise convergence should suffice 😀

Thus I am wondering whether there is a theorem containing, or leading to, the following statement:

Let $K\subset\mathbb{R}^2$ be compact. Any bounded measurable function $f:K\to\mathbb{R}$ can be approximated (pointwise, say) by a sequence of continuous functions $(g_m)$ on $K$.

Nate Eldredge suggested that I post some excerpt from the original to provide more context for the problem. Here I go:

The goal is to prove the existence of a weak limit for a tight sequence of probability measures on $\mathcal{C}^0([0,1]^2,\mathbb{R})$ associated with reflecting Brownian motions on the compact set $[0,1]^2$, which is a Lipschitz domain. Thus we already know that some weak limit must exist, and it remains to show that any two limit points agree. Weak convergence is generally defined via bounded measurable functions. Now let $P'$ and $P''$ be two subsequential limit points. The authors show that for every $f \in \mathcal{C}^0([0,1]^2,\mathbb{R})$ the following holds (here $X_s$ is the canonical process):

$E'f(X_s)=E''f(X_s)$

And now comes the actual source of my question:
"The usual limiting argument gives the result for bounded $f$ and hence the one-dimensional distributions agree." (The second part I understand; only the "usual limiting argument" part is somewhat confusing.)

Any help is much appreciated, and thanks in advance 😀

Best Answer

To respond to your comment on Byron's answer:

The functional monotone class theorem is a very useful result and well worth knowing. However, you can also get this result with arguments that may be more familiar. To recap, we want to show:

Suppose $\mu', \mu''$ are two probability measures on $\mathbb{R}$, and we have $\int f\,d\mu' = \int f\,d\mu''$ for all bounded continuous $f$. Then $\mu' = \mu''$.

One could proceed as follows:

Exercise. For any open interval $(a,b)$, there is a sequence of nonnegative bounded continuous functions $f_n$ such that $f_n \uparrow 1_{(a,b)}$ pointwise.

(For example, trapezoidal-shaped functions work: $f_n(x) = \max\bigl(0, \min(1,\, n(x-a),\, n(b-x))\bigr)$ vanishes outside $(a,b)$, ramps up linearly over a strip of width $1/n$ at each end, and increases to $1_{(a,b)}$ pointwise.)

If $f_n$ is such a sequence, we have $\int f_n \,d\mu' = \int f_n \,d\mu''$ for each $n$. By monotone convergence, the left side converges to $\int 1_{(a,b)}\,d\mu' = \mu'((a,b))$ and the right side converges to $\mu''((a,b))$. So $\mu'((a,b)) = \mu''((a,b))$, and this holds for any interval $(a,b)$.
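For concreteness, here is a small Python sketch of one such trapezoidal sequence (the interval $(a,b) = (0.5, 1.5)$ and the choice of Lebesgue measure on $[0,2]$ as $\mu$ are illustrative assumptions, not from the paper). It checks that $f_n(x)$ is nondecreasing in $n$ at sample points, and that $\int f_n\,d\mu$ tends to $\mu((a,b)) = b - a$:

```python
def f_n(x, a, b, n):
    # Trapezoid: 0 outside (a, b), linear ramps of width 1/n, plateau at 1.
    # max(0, ...) kills the negative ramp values outside (a, b);
    # min(1, ...) caps the plateau in the middle.
    return max(0.0, min(1.0, n * (x - a), n * (b - x)))

a, b = 0.5, 1.5

# Monotone pointwise convergence to the indicator 1_{(a,b)}:
for x in (0.0, 0.5, 0.7, 1.0, 1.5, 2.0):
    vals = [f_n(x, a, b, n) for n in range(1, 100)]
    assert all(u <= v for u, v in zip(vals, vals[1:]))  # nondecreasing in n

print(f_n(1.0, a, b, 1000))  # 1.0 at an interior point
print(f_n(0.5, a, b, 1000))  # 0.0 at the endpoint, matching the OPEN interval

# Riemann-sum stand-in for the integral, with mu = Lebesgue measure on [0, 2]:
def integral(n, steps=200_000):
    h = 2.0 / steps
    return h * sum(f_n(i * h, a, b, n) for i in range(steps))

for n in (1, 10, 100):
    print(n, round(integral(n), 4))  # tends to mu((a, b)) = b - a = 1.0
```

For $n \ge 2/(b-a)$ the trapezoid has area exactly $(b-a) - 1/n$, which makes the convergence of the integrals visible numerically.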

Now you can use Dynkin's $\pi$-$\lambda$ lemma, once you show:

Exercise. The collection $$\mathcal{L} := \{B \in \mathcal{B}_\mathbb{R} : \mu'(B) = \mu''(B)\}$$ is a $\lambda$-system. (Here $\mathcal{B}_{\mathbb{R}}$ is the Borel $\sigma$-algebra on $\mathbb{R}$.)
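In case it helps, the three $\lambda$-system properties follow quickly (a sketch, using that $\mu'$ and $\mu''$ are probability measures). First, $\mathbb{R} \in \mathcal{L}$ since
$$\mu'(\mathbb{R}) = 1 = \mu''(\mathbb{R}).$$
Second, if $A \subset B$ with $A, B \in \mathcal{L}$, then
$$\mu'(B\setminus A) = \mu'(B) - \mu'(A) = \mu''(B) - \mu''(A) = \mu''(B\setminus A),$$
so $B \setminus A \in \mathcal{L}$. Third, if $A_n \uparrow A$ with $A_n \in \mathcal{L}$, continuity from below gives
$$\mu'(A) = \lim_n \mu'(A_n) = \lim_n \mu''(A_n) = \mu''(A),$$
so $A \in \mathcal{L}$.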

We just showed that the open intervals are contained in $\mathcal{L}$. But the open intervals are a $\pi$-system which generates $\mathcal{B}_{\mathbb{R}}$. So by Dynkin's lemma, $\mathcal{B}_\mathbb{R} \subset \mathcal{L}$, which is to say $\mu' = \mu''$.