Let $(Y_n)_{n \in \mathbb{N}}$ be a sequence of independent random variables such that
$$\mathbb{P}(Y_n = 1) = \mathbb{P}(Y_n=-1) = \frac{1}{2n} \qquad \mathbb{P}(Y_n=0) = 1- \frac{1}{n}.$$
If we define
$$X_n := \begin{cases} Y_n, & X_{n-1} = 0, \\ n X_{n-1} |Y_n|, & X_{n-1} \neq 0 \end{cases} \qquad X_0 := 0$$
then the process $(X_n)_{n \in \mathbb{N}_0}$ is a martingale with respect to $\mathcal{F}_n := \sigma(Y_k; k \leq n)$. Indeed:
$$\begin{align*} \mathbb{E}(X_n \mid \mathcal{F}_{n-1}) &= 1_{\{X_{n-1}=0\}} \underbrace{\mathbb{E}(Y_n \mid \mathcal{F}_{n-1})}_{=\mathbb{E}(Y_n)=0} + n 1_{\{X_{n-1} \neq 0\}} X_{n-1} \underbrace{\mathbb{E}(|Y_n| \mid \mathcal{F}_{n-1})}_{=\mathbb{E}(|Y_n|) = 1/n} \\ &= 0 \cdot 1_{\{X_{n-1}=0\}} + 1_{\{X_{n-1} \neq 0\}} X_{n-1} = X_{n-1}. \end{align*}$$
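As a sanity check, the consequence $\mathbb{E}(X_n) = \mathbb{E}(X_0) = 0$ of the martingale property can be verified by exact enumeration of the $3^n$ outcomes of $(Y_1, \dots, Y_n)$; a minimal Python sketch (the function name is mine):

```python
from fractions import Fraction
from itertools import product

def exact_mean(n_steps):
    """Compute E[X_n] exactly by summing over all 3^n outcomes of (Y_1, ..., Y_n)."""
    total = Fraction(0)
    for ys in product((-1, 0, 1), repeat=n_steps):
        prob = Fraction(1)
        X = 0
        for n, y in enumerate(ys, start=1):
            # P(Y_n = ±1) = 1/(2n), P(Y_n = 0) = 1 - 1/n
            prob *= Fraction(1, 2 * n) if y != 0 else 1 - Fraction(1, n)
            # recursion defining X_n: Y_n if X_{n-1} = 0, else n * X_{n-1} * |Y_n|
            X = y if X == 0 else n * X * abs(y)
        total += prob * X
    return total

print(all(exact_mean(n) == 0 for n in range(1, 7)))  # True
```

Exact rational arithmetic (`Fraction`) avoids any floating-point ambiguity, so the means come out as exactly $0$.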
For any fixed $a \in \{-1,0,1\}$ we have
$$\begin{align*} \sum_{n \geq 1} \mathbb{P}(Y_{2n}=0, Y_{2n+1}=a) &= \sum_{n \geq 1} \mathbb{P}(Y_{2n}=0) \mathbb{P}(Y_{2n+1}=a) \\ &\geq \sum_{n \geq 1} \left(1-\frac{1}{2n} \right) \frac{1}{2(2n+1)} = \infty, \end{align*}$$
and therefore the second Borel–Cantelli lemma (the events are independent) shows that for almost all $\omega$ there are infinitely many $n \in \mathbb{N}$ with $Y_{2n}(\omega)=0$ and $Y_{2n+1}(\omega)=a$. By the very definition, this implies that $X_{2n}(\omega)=0$ and $$X_{2n+1}(\omega)=Y_{2n+1}(\omega)=a$$ for any such $n \in \mathbb{N}$. Consequently, we have shown that $$\mathbb{P}(X_k = a \, \, \text{infinitely often})=1$$ for any $a \in \{-1,0,1\}$.

It remains to prove that $$\sup_{n \in \mathbb{N}} |X_n(\omega)| < \infty \quad \text{a.s.}$$ To this end, we note that $$\sum_{n \geq 1} \mathbb{P}(Y_n \neq 0, Y_{n+1} \neq 0) = \sum_{n \geq 1} \mathbb{P}(Y_n \neq 0) \mathbb{P}(Y_{n+1} \neq 0) \leq \sum_{n \geq 1} \frac{1}{n^2} < \infty.$$ Applying the (first) Borel–Cantelli lemma, we find that for almost all $\omega$ we can choose $N=N(\omega)$ such that $$Y_{n}(\omega) \neq 0 \implies Y_{n+1}(\omega)=0 \quad \text{for all $n \geq N$.}$$ As $$X_n(\omega) \neq 0 \implies Y_n(\omega) \neq 0 \quad \text{and} \quad Y_{n+1}(\omega) = 0 \implies X_{n+1}(\omega)=0,$$ this means that $$X_n(\omega) \neq 0 \implies X_{n+1}(\omega)=0 \quad \text{for all $n \geq N$.}$$ By the definition of $X_n$, this implies that $|X_n(\omega)| \leq |Y_n(\omega)| \leq 1$ for all $n > N$. Thus, $$\sup_{n \in \mathbb{N}} |X_n(\omega)| \leq \sup_{n \leq N} |X_n(\omega)| + 1<\infty.$$
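The behaviour of the two series is easy to check numerically: the divergent lower bound grows like $\tfrac14 \log M$, while $\sum_{n} \mathbb{P}(Y_n \neq 0)\,\mathbb{P}(Y_{n+1} \neq 0) = \sum_n \frac{1}{n(n+1)}$ telescopes to $1$. A quick sketch (helper names are mine):

```python
def divergent_partial(M):
    # partial sum of the lower bound sum_{n<=M} (1 - 1/(2n)) * 1/(2(2n+1)),
    # which diverges like (1/4) * log M
    return sum((1 - 1 / (2 * n)) / (2 * (2 * n + 1)) for n in range(1, M + 1))

def convergent_partial(M):
    # sum_{n<=M} P(Y_n != 0) P(Y_{n+1} != 0) = sum_{n<=M} 1/(n(n+1)) = 1 - 1/(M+1)
    return sum(1 / (n * (n + 1)) for n in range(1, M + 1))

for M in (10**3, 10**4, 10**5):
    print(M, divergent_partial(M), convergent_partial(M))
```

The first column of partial sums keeps growing (by roughly $\tfrac14 \log 10 \approx 0.58$ per decade), the second stays below $1$.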
You need to verify the following properties:
- $X_S$ (resp. $X_T$) is measurable w.r.t. $\mathcal{F}_S$ (resp. $\mathcal{F}_T$),
- $X_S$ and $X_T$ are integrable,
- $\mathbb{E}(X_T \mid \mathcal{F}_S) = X_S$.
Hints: Since $S \leq T$ are bounded stopping times, there exists some $N \in \mathbb{N}$ such that $S \leq T \leq N$.
- To prove measurability, you need to verify that $$\{X_S \in B\} \cap \{S \leq n\} \in \mathcal{F}_n$$ for all $n \in \mathbb{N}_0$ and Borel sets $B$. To this end, write $$\{X_S \in B\} \cap \{S \leq n\} = \bigcup_{k=0}^n \{X_k \in B\} \cap \{S=k\}.$$ (Clearly, an analogous statement holds for $T$.)
- To prove integrability, use that $$X_T = \sum_{k=0}^N X_k 1_{\{T=k\}}$$ and the fact that $\mathbb{E}(|X_k|)<\infty$ for each $k$.
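Spelling out the integrability estimate from this representation (the same bound works for $X_S$): $$\mathbb{E}(|X_T|) = \sum_{k=0}^N \mathbb{E}\big(|X_k| 1_{\{T=k\}}\big) \leq \sum_{k=0}^N \mathbb{E}(|X_k|) < \infty.$$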
It remains to prove the third property (i.e. to compute the conditional expectation). We will use the following statement:
Lemma: Let $(M_n)_{n \in \mathbb{N}_0}$ be a martingale. If $T$ is a bounded stopping time, then $\mathbb{E}(M_T) = \mathbb{E}(M_0)$.
- Use the Doob decomposition of the submartingale $(X_n)_{n \in \mathbb{N}}$ and the above lemma to show that $\mathbb{E}(X_T) \geq \mathbb{E}(X_S)$. (If $(X_n)_{n \in \mathbb{N}}$ is a martingale, then $-X$ is a submartingale as well, so the same argument yields the reverse inequality and hence $\mathbb{E}(X_T) = \mathbb{E}(X_S)$.)
- Fix $F \in \mathcal{F}_S \subseteq \mathcal{F}_T$. Show that $\varrho := S 1_F + T 1_{F^c}$ defines a stopping time satisfying $\varrho \leq T$. Apply the previous step to the stopping times $\varrho$, $T$ to obtain that $$\mathbb{E}(X_\varrho) = \mathbb{E}(X_T).$$ Rearrange the terms on both sides to conclude that $$\mathbb{E}(X_S 1_F) = \mathbb{E}(X_T 1_F).$$ Since $F \in \mathcal{F}_S$ is arbitrary and $X_S$ is $\mathcal{F}_S$-measurable, this proves $$\mathbb{E}(X_T \mid \mathcal{F}_S) = X_S.$$
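To see the mechanism of this last step on a concrete example, the identities $\mathbb{E}(X_\varrho)=\mathbb{E}(X_T)$ and $\mathbb{E}(X_S 1_F)=\mathbb{E}(X_T 1_F)$ can be checked by exact path enumeration for a simple random walk $(W_n)$, with the bounded stopping times $S = \tau \wedge 3$ and $T = \tau \wedge 6$ ($\tau$ the first hitting time of $\{\pm 2\}$) and $F = \{W_1 = 1\} \in \mathcal{F}_S$; these particular choices are mine, for illustration only:

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 2 ** 6)                      # each ±1 path of length 6 is equally likely
E_WS_F = E_WT_F = E_Wrho = E_WT = Fraction(0)

for steps in product((-1, 1), repeat=6):
    W = [0]
    for s in steps:
        W.append(W[-1] + s)
    # tau = first hitting time of {-2, +2}; 7 means "not within 6 steps"
    tau = next((n for n in range(1, 7) if abs(W[n]) >= 2), 7)
    S, T = min(tau, 3), min(tau, 6)          # bounded stopping times with S <= T
    F = W[1] == 1                            # event in F_S (note S >= 2 always)
    rho = S if F else T                      # rho = S * 1_F + T * 1_{F^c}
    E_WS_F += p * W[S] * F
    E_WT_F += p * W[T] * F
    E_Wrho += p * W[rho]
    E_WT += p * W[T]

print(E_WS_F == E_WT_F, E_Wrho == E_WT == 0)  # True True
```

Exact rational arithmetic makes the identities hold exactly, not just up to floating-point error.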
Best Answer
It's the second statement. Normally, when you use the word "bounded" in reference to a family of objects (here, the sequence of random variables that make up the martingale), you mean that there is a uniform bound that applies to all of them.
In particular, this assumption is used in the proof to justify the application of the bounded convergence theorem to (19). If only your first statement held, there would be no way to justify that step.
Indeed, Corollary 66 becomes false under your first interpretation. For a counterexample, let $(X_n)$ be simple random walk started at $0$, which is a martingale. Each individual $X_n$ is bounded; indeed, $|X_n| \le n$. Let $S=0$, and let $T$ be the first time the random walk reaches $-1$ (which is almost surely finite, a standard fact). Then $(X_S, X_T)$ is certainly not a submartingale, because $X_S = 0$ and $X_T = -1$.
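The escape of mass can be made explicit: with the truncation $T \wedge N$ in place of $T$, optional stopping for bounded stopping times gives $\mathbb{E}(X_{T \wedge N}) = 0$ for every $N$, even though $X_T = -1$ a.s.; rearranging, $\mathbb{E}(X_N 1_{\{T > N\}}) = \mathbb{P}(T \leq N) \to 1$, so this term never becomes negligible, which is exactly where a uniform bound (and hence bounded convergence) would be needed. An exact check by path enumeration (function name mine):

```python
from fractions import Fraction
from itertools import product

def truncated_walk(N):
    """Exact E[X_{T ∧ N}] and P(T <= N) for SRW from 0, T = first hit of -1."""
    p = Fraction(1, 2 ** N)
    mean = hit = Fraction(0)
    for steps in product((-1, 1), repeat=N):
        W, stopped = 0, False
        for s in steps:
            W += s
            if W == -1:          # T occurred: the walk is frozen at -1
                stopped = True
                break
        mean += p * (-1 if stopped else W)
        if stopped:
            hit += p
    return mean, hit

for N in (4, 8, 12):
    mean, hit = truncated_walk(N)
    print(N, mean, float(hit))   # mean is exactly 0 for every N; hit creeps toward 1
```

The truncated expectation is exactly $0$ at every level $N$, while $\mathbb{P}(T \leq N)$ increases toward $1$ (only like $1 - c/\sqrt{N}$, reflecting that $T$ is finite but far from bounded).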