[Math] Martingale and Stopping Time with Finite Expectation

brownian-motion, martingales, probability-theory, stochastic-processes, stopping-times

Let $(M_t, \mathcal{F}_t)_{0 \leq t < \infty}$ be a martingale. For a bounded stopping time $T$, Doob's Optional Sampling Theorem gives $\mathbb E(M_T)=\mathbb E(M_0)$. Now let $T$ be a stopping time with finite expectation, i.e. $\mathbb E(T)<+\infty$. Can we deduce, using $T\wedge n$ and perhaps Lebesgue's Dominated Convergence Theorem, that $\mathbb E(M_T)=\mathbb E(M_0)$? If not, is there a sufficient condition under which this holds?

This question arose in studying Brownian motion. For a standard Brownian motion with $B_0=0$ and the stopping time $\tau:=\inf\{t>0: B_t=a \text{ or } B_t=-b\}$ (with $a,b>0$), we wish to show $\mathbb E(\tau)=ab$. One solution asserts $\mathbb E(B_\tau^2-\tau)=0$ by Doob's Optional Sampling Theorem. I wish to justify this claim with the above.
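For context, here is a sketch of how that claim yields $\mathbb E(\tau)=ab$, assuming the standard setup where $B_0=0$ and $\tau$ is the exit time of $(-b,a)$:

```latex
% Assuming B_0 = 0 and \tau = \inf\{t > 0 : B_t \in \{a, -b\}\}, a, b > 0.
% Since B_\tau \in \{a, -b\} a.s., the gambler's-ruin probabilities give
%   P(B_\tau = a) = b/(a+b),  P(B_\tau = -b) = a/(a+b).
% Granting E(B_\tau^2 - \tau) = 0, we get
\mathbb E(\tau) = \mathbb E(B_\tau^2)
  = a^2 \cdot \frac{b}{a+b} + b^2 \cdot \frac{a}{a+b}
  = \frac{ab(a+b)}{a+b} = ab.
```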

Best Answer

To see that $E[T] < \infty$ isn't sufficient in general, a discrete-time counterexample is the "doubling martingale". Let $\xi_i$ be iid Rademacher (i.e. taking the values $\pm 1$ with probability $1/2$ each) and let $M_n = \sum_{i=1}^n 2^i \xi_i$, with $M_0 = 0$. (Imagine betting on fair coin flips, doubling your wager every round.) It's easy to see that $M_n$ is a martingale. Let $T = \inf\{i : \xi_i = 1\}$ be the first time heads is flipped. Then $T$ is a stopping time, and $M_T = \sum_{i=1}^{T-1} (-1) \cdot 2^i + 2^T = 2$: as soon as you flip a heads, you win back everything you have lost, plus 2 dollars. Hence $E[M_T] = 2 \neq 0 = E[M_0]$, so the optional sampling theorem fails for $T$. Yet $T$ has a geometric distribution with success probability $1/2$, and one easily computes $E[T]=2 < \infty$.
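As a quick numerical check (a minimal sketch, with the round-$i$ wager taken as $2^i$ to match the definition of $M_n$ above), one can simulate the doubling martingale and observe that $M_T$ equals exactly $2$ in every realization, while the sample mean of $T$ is close to $2$:

```python
import random

def doubling_game(rng):
    """Play the doubling martingale until the first +1 (heads).

    Round i wagers 2**i, matching M_n = sum_{i<=n} 2**i * xi_i,
    so the payoff on stopping is always exactly 2.
    Returns (T, M_T).
    """
    wealth = 0
    i = 0
    while True:
        i += 1
        flip = rng.choice([-1, 1])  # Rademacher variable xi_i
        wealth += 2**i * flip
        if flip == 1:
            return i, wealth

rng = random.Random(0)
samples = [doubling_game(rng) for _ in range(10_000)]
avg_T = sum(t for t, _ in samples) / len(samples)

# Every realization gives M_T = 2, although E[M_0] = 0; E[T] = 2.
assert all(m == 2 for _, m in samples)
print(avg_T)
```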

I haven't checked, but I think one could use similar constructions to show that no "light tails" condition on $T$ could suffice, short of requiring $T$ to actually be bounded.
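For completeness, a sufficient condition does exist if one constrains $M$ as well as $T$; one standard discrete-time form of the optional stopping theorem is:

```latex
% If (i) \mathbb E[T] < \infty, and (ii) there is a constant c with
%     \mathbb E\big[\,|M_{n+1} - M_n|\;\big|\;\mathcal F_n\big] \le c
%     on \{T > n\} for all n,
% then \mathbb E[M_T] = \mathbb E[M_0].
% The doubling martingale satisfies (i) but not (ii),
% since its increments grow like 2^n.
```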