[Math] Martingale and bounded stopping time

martingales, probability theory

A theorem on submartingales and bounded stopping times says:

Theorem 5.4.1. If $X_n$ is a submartingale and $N$ is a stopping time with $\mathbb P(N \le k) = 1$, then $\mathbb E X_0 \le \mathbb E X_N \le \mathbb E X_k$.
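
For intuition, here is a minimal Monte Carlo sketch of the theorem's conclusion (not part of the original post). It takes the submartingale $X_m = |S_m|$ for a simple $\pm 1$ random walk and the bounded stopping time $N = k \wedge \inf\{m : |S_m| \ge 2\}$; the threshold $2$ and horizon $k = 10$ are arbitrary choices for illustration.

```python
# Monte Carlo sketch (illustrative, not from the original post):
# X_m = |S_m| is a submartingale for a simple +/-1 random walk, and
# N = min(first m with |S_m| >= 2, k) is a stopping time with P(N <= k) = 1.
# Theorem 5.4.1 predicts E X_0 <= E X_N <= E X_k.
import numpy as np

rng = np.random.default_rng(0)
k, trials = 10, 200_000

steps = rng.choice([-1, 1], size=(trials, k))
S = np.cumsum(steps, axis=1)          # S_1, ..., S_k along each row
X = np.abs(S)                         # the submartingale X_m = |S_m|

hit = X >= 2                          # stopping condition
# 0-based index of X_N: the first hit if there is one, else time k
idx = np.where(hit.any(axis=1), hit.argmax(axis=1), k - 1)
X_N = X[np.arange(trials), idx]

print("E X_0 =", 0.0)                 # X_0 = |S_0| = 0 deterministically
print("E X_N ~", X_N.mean())          # should lie between the other two
print("E X_k ~", X[:, -1].mean())
```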

An exercise for this theorem is

Example 5.4.1. Random walks. Let $S_n = \xi_1 + \cdots + \xi_n$, where the $\xi_m$ are independent with $\mathbb E \xi_m = 0$ and $\sigma_m^2 = \mathbb E \xi_m^2 < \infty$. Suppose that $|\xi_m| \le K$ and let $s_n^2 = \sum_{m \le n} \sigma_m^2$. Note that $S_n^2 - s_n^2$ is a martingale. Use this fact and Theorem 5.4.1 to conclude
$$\mathbb P \left(\max_{1 \le m \le n} |S_m| \le x \right) \le (x + K)^2 / \mathbb E(S_n^2).$$

Let $A = \{\max_{1 \le m \le n} |S_m| \le x\}$, let $X_n = S_n^2 - s_n^2$, and let $N = \inf\{m : |S_m| \ge x \text{ or } m = n+1\}$, so $N$ is a bounded stopping time. Since $X_n$ is a martingale, both $X_n$ and $-X_n$ are submartingales, so the theorem above gives
$$
0 = \mathbb E X_1 = \mathbb E X_N = \mathbb E X_{n+1}.
$$
Since $X_{n+1} = X_N$ on $A$ (on that event the stopping condition is never triggered, so $N = n+1$), we have $\mathbb E (X_{n+1} 1_A) = \mathbb E (X_N 1_A)$. Therefore, as long as $\mathbb E (X_N 1_A) \ge 0$ (which I don't know how to prove), we have $\mathbb E (X_{n+1} 1_A) \ge 0$. It follows that
$$
\mathbb E(s_n^2 1_A) \le \mathbb E(s_{n+1}^2 1_A) \le \mathbb E(S_{n+1}^2 1_A) \le (x + K)^2.
$$
But I am having trouble showing that $\mathbb E(X_N 1_A) \ge 0$. Am I headed in the right direction?

Best Answer

Since you're curious whether you're going in the right direction, let me address your approach first before giving my solution (which takes a different route).

I actually liked what you were trying to do and attempted to rescue it for a while. Then, I realized it might be a little too weak. I don't have a counterexample at hand but I'll share my intuition. You are trying to prove $\mathbb{E}\left\{ (S_{n+1}^2 - s_{n+1}^2) \mathbb{1}_A \right\} \geq 0$, or intuitively that conditional on the partial sums of $S_n$ being small (i.e. event $A$ occurring), your realized variance $S_{n+1}^2$ is in expectation at least as large as the expected (unconditional) variance $s_{n+1}^2$. I just can't imagine why small partial sums would give you large variance. The situation is usually reversed: having large partial sums allows you to say the variance is large (Chebyshev, Kolmogorov, Doob inequalities all work in this context). I don't think there is enough data in the problem to prove a converse inequality.
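
If you want to probe this numerically, here is a small Monte Carlo sketch (my addition, not part of the original answer) that estimates $\mathbb{E}(X_N \mathbb{1}_A)$ in the simplest setting of fair $\pm 1$ steps, so that $K = 1$ and $s_m^2 = m$; the threshold $x$ is taken non-integer so the boundary case $|S_m| = x$ cannot occur. Varying $n$ lets you check the sign of the estimate yourself.

```python
# Monte Carlo probe of E(X_N 1_A) for fair +/-1 steps (illustrative only):
# X_m = S_m^2 - m, N = min(inf{m : |S_m| >= x}, n + 1),
# A = {max over 1 <= m <= n of |S_m| <= x}.
import numpy as np

rng = np.random.default_rng(1)
n, x, trials = 30, 2.5, 200_000      # x non-integer: |S_m| = x is impossible

steps = rng.choice([-1, 1], size=(trials, n + 1))
S = np.cumsum(steps, axis=1)         # S_1, ..., S_{n+1} along each row
X = S**2 - np.arange(1, n + 2)       # martingale X_m = S_m^2 - s_m^2, s_m^2 = m

hit = np.abs(S) >= x
# 0-based index of X_N: the first hit if there is one, else time n + 1
idx = np.where(hit.any(axis=1), hit.argmax(axis=1), n)
X_N = X[np.arange(trials), idx]

A = np.abs(S[:, :n]).max(axis=1) <= x    # A only looks at times 1..n
print("P(A)       ~", A.mean())
print("E(X_N 1_A) ~", (X_N * A).mean())
```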

On to the solution.

First of all, consider the slightly more natural stopping time $\tau := n \wedge \inf \{ m \geq 1 : |S_m| > x \}$. This differs from your $N$ in a couple of ways that make it easier to use. Furthermore, it is clearly a bounded stopping time, and $S_\tau^2 \leq (x+K)^2$: if $\tau$ is the first time the walk exceeds $x$, then $|S_{\tau-1}| \leq x$ and so $|S_\tau| \leq |S_{\tau-1}| + |\xi_\tau| \leq x + K$; otherwise $\tau = n$ and $|S_n| \leq x$.

In view of $S_n^2 - s_n^2$ being a martingale and $\tau$ a bounded stopping time, the optional stopping theorem (or the variant thereof you refer to as Theorem 5.4.1) gives $\mathbb{E} S_\tau^2 = \mathbb{E} s_\tau^2$. In view of $S_\tau^2 \leq (x+K)^2$, we conclude that

$$ (x+K)^2 \geq \mathbb{E} S_\tau^2 = \mathbb{E} s_\tau^2 = \mathbb{E} \left\{ \mathbb{1}_A s_\tau^2 \right\} + \mathbb{E} \left\{ \mathbb{1}_{A^c} s_\tau^2 \right\} \geq \mathbb{E} \left\{ \mathbb{1}_A s_\tau^2 \right\}$$

where $A = \cap_{1 \leq m \leq n} \{ |S_m| \leq x \}$, as in your approach. Notice that when $A$ occurs we have $\tau = n$ and therefore $s_\tau^2 = s_n^2$; the latter is the constant $\mathbb{E} S_n^2$, since the $\xi_m$ are independent with mean zero. So

$$ (x+K)^2 \geq \mathbb{E} \left\{ \mathbb{1}_A s_\tau^2 \right\} = \mathbb{E} \left\{ \mathbb{1}_A s_n^2 \right\} = \mathbb{E} \mathbb{1}_A \cdot s_n^2 = \mathbb{P}(A) \mathbb{E} S_n^2 $$

which is the required result.
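
As a quick sanity check (again my addition, not the answerer's), the final bound can be tested numerically in the fair-coin setting, where $K = 1$ and $\mathbb{E} S_n^2 = n$:

```python
# Monte Carlo check of P(max_{1<=m<=n} |S_m| <= x) <= (x + K)^2 / E S_n^2
# for fair +/-1 steps, where K = 1 and E S_n^2 = n (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
n, x, K, trials = 25, 3.5, 1, 200_000

steps = rng.choice([-1, 1], size=(trials, n))
S = np.cumsum(steps, axis=1)
lhs = (np.abs(S).max(axis=1) <= x).mean()   # empirical P(A)
rhs = (x + K) ** 2 / n                      # the bound

print(f"P(A) ~ {lhs:.4f}   bound = {rhs:.4f}")
```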
