Stochastic Processes – Trajectory Regularity of Conditional Expectation with Additional Randomness

martingales, stochastic-processes

Consider a probability space that supports a standard Brownian motion $W=(W_t)$ and a random variable $Z$ that is independent of $W$. Denote by $\mathbb F^W=(\mathcal F^W_t)_t$ the natural filtration generated by $W$ (assumed to satisfy the usual conditions). Let $\xi$ be an $\mathcal F^W_T$-measurable random variable and let $g:\mathbb R^2\to\mathbb R$ be bounded and measurable. Define the process $X=(X_t)_{t\in[0,T]}$ by taking the conditional expectation

$$X_t := \mathbb E[g(\xi, Z) \,|\, \mathcal F^W_t],\quad \forall t\in [0,T].$$

Is $t\mapsto X_t$, up to taking a modification, continuous almost surely?

Best Answer

Yes. Since $W$ and $Z$ are independent, we may write

$$X_t = \int_{\mathbb R} \mathbb E[g(\xi, z)| \mathcal F^W_t] \, d\mu_Z(z),$$

as can be seen by, say, taking the regular conditional probability with respect to $Z$, or by working directly on the product space $(C[0, T] \times \mathbb R, \mathcal B_{C[0, T]} \otimes \mathcal B_{\mathbb R}, \mu_W \times \mu_Z).$
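For instance, here is a sketch of the product-space argument, assuming a jointly measurable version of $(z, \omega) \mapsto \mathbb E[g(\xi, z)\,|\,\mathcal F^W_t](\omega)$ has been chosen: for every $A \in \mathcal F^W_t$, the independence of $Z$ and $\mathcal F^W_T$, the tower property, and Fubini's theorem give

$$\mathbb E\big[g(\xi, Z)\mathbf 1_A\big] = \int_{\mathbb R} \mathbb E\big[g(\xi, z)\mathbf 1_A\big]\, d\mu_Z(z) = \int_{\mathbb R} \mathbb E\Big[\mathbb E[g(\xi, z)\,|\,\mathcal F^W_t]\,\mathbf 1_A\Big]\, d\mu_Z(z) = \mathbb E\left[\mathbf 1_A \int_{\mathbb R} \mathbb E[g(\xi, z)\,|\,\mathcal F^W_t]\, d\mu_Z(z)\right],$$

and since the right-hand integrand is $\mathcal F^W_t$-measurable (being a $\mu_Z$-average of $\mathcal F^W_t$-measurable random variables), the displayed formula for $X_t$ follows from the defining property of conditional expectation.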

We recognize that for every $z$, $Y^z_t := \mathbb E[g(\xi, z)\,|\, \mathcal F_t^W]$ is a closable martingale with respect to the Brownian filtration (it is closed by $g(\xi, z)$). Since every martingale with respect to a Brownian filtration admits a continuous modification, $Y^z$ is in fact continuous for every $z$. By the boundedness of $g$, $Y^z$ is also uniformly bounded.
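As an illustrative special case: if $\xi = W_T$ and $g(x, z) = \mathbf 1_{\{x \le z\}}$, then for $t < T$

$$Y^z_t = \mathbb P\big(W_T \le z \,\big|\, \mathcal F^W_t\big) = \Phi\!\left(\frac{z - W_t}{\sqrt{T - t}}\right),$$

where $\Phi$ is the standard normal distribution function; each $Y^z$ is then visibly continuous on $[0, T)$ (and, since $\mathbb P(W_T = z) = 0$, also at $t = T$ almost surely), and is bounded by $1$.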

Now the rest of the proof is analysis: we claim that $X_t$, being an average of continuous, uniformly bounded functions, is itself continuous almost surely.

To see this, let

$$\phi(z, \delta, \omega) := \sup_{s, t \in [0,T];\; |s - t| < \delta} |Y_t^z (\omega) - Y_s^z (\omega)|$$

be a uniform modulus of continuity for $Y^z$, and let $M > 0$ be a uniform bound for $|g|$ (and hence for $|Y^z|$). Since a continuous function on the compact interval $[0, T]$ is uniformly continuous, we have for $\mu_Z \times \mathbb P$-a.e. $(z, \omega)$ that $\lim_{\delta\to 0^+} \phi(z, \delta, \omega) = 0$.

In other words, writing $E_{\varepsilon, \delta, \omega} := \{z \mid \phi(z, \delta, \omega) \leq \varepsilon\}$, for every $\varepsilon > 0$ and $\mathbb P$-a.e. $\omega$ we have $\mu_Z(E_{\varepsilon, \delta, \omega}) \to 1$ as $\delta \to 0^+$.
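Indeed, by Fubini's theorem, for $\mathbb P$-a.e. $\omega$ the convergence $\phi(z, \delta, \omega) \to 0$ holds for $\mu_Z$-a.e. $z$, so dominated convergence (the integrand below is bounded by $1$) yields, for every fixed $\varepsilon > 0$,

$$\mu_Z\big(E^c_{\varepsilon, \delta, \omega}\big) = \int_{\mathbb R} \mathbf 1_{\{\phi(z, \delta, \omega) > \varepsilon\}}\, d\mu_Z(z) \;\longrightarrow\; 0 \qquad \text{as } \delta \to 0^+.$$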

If for each $\varepsilon > 0$, we write $N_{\varepsilon}$ for the $\mathbb P$-null set of exceptions to the above statement, we may set $N := \cup_{n \in \mathbb Z_+} N_{1/n}$ to find that for all $\omega$ in the $\mathbb P$-full measure set $\Omega \setminus N$, we have $\mu_Z(E_{\varepsilon, \delta, \omega}) \to 1$ as $\delta \to 0^+$, for every $\varepsilon > 0$.

Now let $\varepsilon > 0$ be arbitrary, and fix $\omega \in \Omega \setminus N$. Pick $\delta > 0$ such that $\mu_Z (E_{\varepsilon/2, \delta, \omega}) > 1 - \frac{\varepsilon}{4M}$ (note that $|Y^z_t - Y^z_s| \leq 2M$).

We then compute, for all $s, t$ with $|s - t| < \delta$,

$$\begin{aligned}
|X_t (\omega) - X_s (\omega)| &= \left|\int_{\mathbb R} \big(Y^z_t (\omega) - Y^z_s (\omega)\big)\, d\mu_Z(z)\right| \\
&\leq \int_{\mathbb R}\left|Y^z_t (\omega) - Y^z_s (\omega)\right| d\mu_Z(z) \\
&= \int_{E_{\varepsilon/2, \delta, \omega}} \left|Y^z_t (\omega) - Y^z_s (\omega)\right| d\mu_Z (z) + \int_{E^c_{\varepsilon/2, \delta, \omega}} \left|Y^z_t (\omega) - Y^z_s (\omega)\right| d\mu_Z(z) \\
&< \frac{\varepsilon}{2} + 2M \cdot \frac{\varepsilon}{4M} = \varepsilon,
\end{aligned}$$

and since $\varepsilon > 0$ was arbitrary, $t \mapsto X_t(\omega)$ is continuous on $[0, T]$ for every $\omega \in \Omega \setminus N$, which concludes the proof.
