How do we need to apply the martingale convergence theorem here

lebesgue-integral, martingales, measure-theory, probability-theory, probability-limit-theorems

Let

  • $(E,\mathcal E,\mu)$ be a measure space
  • $E_0\in\mathcal E$ with $\mu(E_0)\in(0,\infty)$
  • $n\in\mathbb N$
  • $B_1,\ldots,B_n\in\left.\mathcal E\right|_{E_0}:=\left\{B\cap E_0:B\in\mathcal E\right\}$ be disjoint with $$\biguplus_{i=1}^nB_i=E_0\tag1$$
  • $f:E\to[0,\infty)$ be $\mathcal E$-measurable with $$c:=\int f\:{\rm d}\mu\in(0,\infty)$$ and $$\nu:=\frac1cf\mu$$
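
Here $f\mu$ is meant in the usual density sense, i.e. $$\nu(A)=\frac1c\int_Af\:{\rm d}\mu\;\;\;\text{for all }A\in\mathcal E,$$ so that $\nu$ is a probability measure on $(E,\mathcal E)$.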

I would like to show that $$\frac1n\sum_{i=1}^n\frac1{\mu(B_i)}\int_{B_i}f\:{\rm d}\mu\xrightarrow{n\to\infty}\frac1{\mu(E_0)}\int_{E_0}f\:{\rm d}\mu.\tag2$$

As kimchi lover noted, there might be a probabilistic proof available utilizing the martingale convergence theorem. Obviously, $$\operatorname E_\nu\left[f\mid B_i\right]=\frac1{\mu(B_i)}\int_{B_i}f\:{\rm d}\mu\;\;\;\text{for all }1\le i\le n.\tag3$$ Now, I guess we consider the filtration $\mathcal F_1,\ldots,\mathcal F_n$, where $\mathcal F_i$ is the $\sigma$-algebra on $E$ generated by $\{B_1,\ldots,B_i\}$. We should obtain $$\operatorname E_\nu\left[f\mid\mathcal F_j\right]=\operatorname E_\nu\left[f\mid B_i\right]\;\;\;\text{almost surely on }B_i\tag4$$ for all $1\le i\le j\le n$.
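
Indeed, for a $\sigma$-algebra generated by finitely many disjoint sets whose atoms all have positive $\nu$-measure, the conditional expectation is constant on each atom: writing $R_j:=E\setminus(B_1\uplus\cdots\uplus B_j)$ for the remaining atom, $$\operatorname E_\nu\left[f\mid\mathcal F_j\right]=\sum_{i=1}^j\operatorname E_\nu\left[f\mid B_i\right]\mathbf 1_{B_i}+\operatorname E_\nu\left[f\mid R_j\right]\mathbf 1_{R_j}\;\;\;\nu\text{-almost surely},$$ which yields $(4)$ on $B_i$ for $i\le j$.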

However, I'm still struggling to see how we can conclude.

Best Answer

This is not true in general.

Notation: for each $n$, we take a partition $\{B_n^j\}_{j=1}^m$ of $E_0$ into $m$ cells (with $m\to\infty$ as $n\to\infty$; in your setup $m=n$), where $B_n^j$ is the $j$th cell of the $n$th partition.

Let $E_0=[0,1]$ and let $\mu$ be the Lebesgue measure on $\mathbb{R}$. Let $f$ be the indicator function of $[1/2,1]$ (or a smooth approximation thereof). At stage $n$, let $B_n^m=[1/2,1]$ and let $B_n^1$ through $B_n^{m-1}$ be some partition of $[0,1/2)$ in which each cell has positive measure. Then $\mu(E_0)^{-1}\int_{E_0}f\,{\rm d}\mu=1/2$, but $$\frac1m\sum_{j=1}^m\frac1{\mu(B_n^j)}\int_{B_n^j}f\,{\rm d}\mu=\frac1m\cdot\frac1{\mu(B_n^m)}\int_{B_n^m}f\,{\rm d}\mu=\frac1m\longrightarrow0.$$
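
If it helps, here is a quick numerical sanity check of this counterexample (a throwaway script of my own, using $m=n$ cells with equal cells on $[0,1/2)$ for concreteness; the names are just for illustration):

```python
import numpy as np

# Quick numerical check of the counterexample (illustration only).
# E_0 = [0, 1] with Lebesgue measure, f = indicator of [1/2, 1].
# At stage n there are m = n cells: n - 1 equal cells covering [0, 1/2)
# and one final cell B_n^m = [1/2, 1].

def f(x):
    return np.where(x >= 0.5, 1.0, 0.0)

def average_of_cell_averages(n, samples=10_000):
    # Cell boundaries: n - 1 equal cells on [0, 1/2], then [1/2, 1].
    edges = np.concatenate([np.linspace(0.0, 0.5, n), [1.0]])
    cell_averages = []
    for a, b in zip(edges[:-1], edges[1:]):
        x = np.linspace(a, b, samples, endpoint=False)  # uniform sample of the cell
        cell_averages.append(f(x).mean())               # ≈ (1/μ(B)) ∫_B f dμ
    return float(np.mean(cell_averages))

for n in [2, 10, 100, 1000]:
    print(n, average_of_cell_averages(n))
# Prints values ≈ 1/n, nowhere near μ(E_0)^{-1} ∫_{E_0} f dμ = 1/2.
```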

Addendum: of course, the reason why you might expect your statement to hold is our intuition from the case where we divide $E_0$ into cells of equal size. In that case the result holds at every stage (not just in the limit), since $\mu(B_n^j)=\frac{\mu(E_0)}{m}$ and a convenient cancellation occurs, as spelled out below.
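
To spell the cancellation out: if $\mu(B_n^j)=\frac{\mu(E_0)}m$ for every $j$, then $$\frac1m\sum_{j=1}^m\frac1{\mu(B_n^j)}\int_{B_n^j}f\,{\rm d}\mu=\sum_{j=1}^m\frac1{\mu(E_0)}\int_{B_n^j}f\,{\rm d}\mu=\frac1{\mu(E_0)}\int_{E_0}f\,{\rm d}\mu,$$ because the cells partition $E_0$.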

However! There is a very similar problem, essentially the dual of your stated problem, where you can use martingale techniques. If you have a filtration of partitions (say, countable at each stage) that get coarser and coarser, so that in the limit the partition is just $\{\emptyset,E_0\}$, then the conditional expectations with respect to this filtration form a reverse martingale, and one can apply Lévy's Downward Theorem (14.4 in Williams' Probability with Martingales) to show that this sequence of conditional expectations converges pointwise almost surely to the average over $E_0$ (remember, each conditional expectation is understood as a random variable).
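
For reference, here is the statement being invoked, as I would paraphrase it: if $\mathcal G_1\supseteq\mathcal G_2\supseteq\cdots$ are sub-$\sigma$-algebras on a probability space and $X\in L^1$, then $$\operatorname E\left[X\mid\mathcal G_k\right]\xrightarrow{k\to\infty}\operatorname E\left[X\,\middle|\,\bigcap_{k\ge1}\mathcal G_k\right]\;\;\;\text{almost surely and in }L^1.$$ If the limiting $\sigma$-algebra $\bigcap_k\mathcal G_k$ is trivial (generated by $\{\emptyset,E_0\}$ up to null sets), the right-hand side is the constant $\frac1{\mu(E_0)}\int_{E_0}X\:{\rm d}\mu$ under the normalized measure $\frac{\mu(\,\cdot\,\cap E_0)}{\mu(E_0)}$.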

This is not precisely the analogue of your question, in which you take an unweighted average of the conditional expectations over the cells, but it is close enough that I thought it worth mentioning.
