Why is the supremum of a sequence of running maxima of cadlag processes also cadlag?

analysis, measure-theory, real-analysis, stochastic-calculus, stochastic-processes

I have a question while reading the proof of Theorem 6 from the following post: https://almostsuremath.com/2009/12/24/local-martingales/

So here we assume that $X^n$ is a sequence of local submartingales, i.e. processes that are locally cadlag adapted submartingales. Also, we know from the assumption that (passing to a subsequence) $X^n$ converges to $X$ uniformly on compacts. So we define $$M_t := \sup_n \sup_{s \le t} |X_s^n|.$$ My questions are: why is this process cadlag, and why do its jumps satisfy $|\Delta M| \le \sup_n |\Delta X^n|$?

I know that the running maximum $\sup_{s \le t} |X_s|$ of a cadlag process $X$ is right-continuous, since it is increasing and can only jump where $X$ has a jump, which can be approximated from the right. However, I cannot see why the supremum over $n$ of these processes will also be right-continuous and have left-hand limits.

Moreover, what do the jumps of this process $M$ look like, so that they are bounded above by the supremum of the jumps $|\Delta X^n|$?

I would greatly appreciate some help with these details.

Theorem 6 Let $\{X^n\}_{n=1,2,\ldots}$ be a sequence of local martingales (resp. local submartingales, local supermartingales) converging ucp to a limit $X$. If $$\sup_n \sup_{s \leq t} |\Delta X_s^n|$$ is locally integrable then $X$ is a local martingale (resp. local submartingale, local supermartingale).

Proof: It is enough to prove the submartingale case, as the martingale and supermartingale cases follow from applying this to $-X$.

First, as it is a ucp limit of cadlag adapted processes, $X$ will be cadlag and adapted. Passing to a subsequence if necessary, we may suppose that $X^n$ converges to $X$ uniformly on compacts. Then, $$M_t = \sup_n \sup_{s \leq t}|X_s^n|$$ is cadlag, adapted, and increasing. It has jumps $|\Delta M| \leq \sup_n |\Delta X^n|$ which, by the condition of the theorem, is locally integrable. Therefore, $M$ is locally integrable. Let $\tau_k$ be a localizing sequence, so that $1_{\{\tau_k > 0\}}M^{\tau_k}$ is integrable. Then, $1_{\{\tau_k > 0\}}(X^n)^{\tau_k}$ are local submartingales bounded by $1_{\{\tau_k > 0\}}M^{\tau_k}$ and, in particular, are of class (DL). So, they are proper submartingales converging to $1_{\{\tau_k > 0\}}X^{\tau_k}$ and, applying bounded convergence to this limit, $1_{\{\tau_k > 0\}}X^{\tau_k}$ is a submartingale. Therefore, $\tau_k$ is a localizing sequence for $X$, showing that it is a local submartingale. $\square$

Best Answer

We have $$ M_t = \sup_{n \geq 1} \sup_{s \leq t} |X_s^n|. $$ We consider the processes $$ M^n_t =\sup_{m \leq n} \sup_{s \leq t} |X^m_s| $$ for $n \geq 1$, and claim that the $M^n_t$ are increasing adapted processes.

Indeed, for each $1 \leq m \leq n$, the processes $Y^m_t = \sup_{s \leq t} |X^m_s|$ are increasing (obviously) and adapted. Once you observe this, $M^n_t = \max_{m \leq n} Y^m_t$ is a finite maximum of adapted processes, hence increasing and adapted.

Now, for any sequence of real numbers $a_m$, we have $\sup_{m \leq n} a_m \to \sup_{m \geq 1} a_m$ as $n \to \infty$ (with the limit possibly $+\infty$). Therefore, $M_t^n \to M_t$ pointwise, which shows that $M_t$ is adapted as well.
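This convergence of the partial suprema to the full supremum can be checked numerically; a minimal sketch with arbitrary sample values standing in for the $\sup_{s \leq t} |X^m_s|$:

```python
# Toy check (hypothetical values): partial suprema max_{m <= n} a_m increase
# to the overall supremum sup_m a_m, mirroring M^n_t -> M_t.
a = [0.3, 1.7, 0.9, 1.2, 2.5, 0.1]  # stand-ins for sup_{s<=t} |X^m_s|, m = 1..6

partial_sups = []
running = float("-inf")
for x in a:
    running = max(running, x)       # M^n = max_{m <= n} Y^m
    partial_sups.append(running)

assert partial_sups[-1] == max(a)            # M^n reaches the sup for large n
assert partial_sups == sorted(partial_sups)  # the partial suprema are nondecreasing
print(partial_sups)  # [0.3, 1.7, 1.7, 1.7, 2.5, 2.5]
```

Since each `partial_sups[n]` is a maximum of finitely many measurable functions, the pointwise limit is measurable, which is the adaptedness argument above.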

Claim: Let $f:\mathbb R^+ \to \mathbb R$ be a right continuous function. Then $g(t) = \sup_{s \leq t} f(s)$ is a cadlag function.

Proof: Note that $g(t)$ is an increasing function, and therefore admits left and right limits at every point. It is therefore sufficient to prove that $g(t)$ is right-continuous.

Let $t_n \downarrow t$; we must prove that $g(t_n) \to g(t)$. Suppose not. Since $g$ is increasing, $g(t_n)$ decreases to a limit at least $g(t)$, so there is some $\epsilon>0$ with $g(t_n) > g(t)+\epsilon$ for all $n$. By the definition of the supremum, there exist $s_n \leq t_n$ such that $$f(s_n) > g(t)+\epsilon \geq f(t) +\epsilon$$ for all $n$.

We claim that $s_n > t$. Indeed, if $s_n \leq t$ then $f(s_n) \leq g(s_n) \leq g(t)$, which contradicts the choice $f(s_n)>g(t)+\epsilon$ made above. Consequently, $t < s_n \leq t_n$ with $t_n \downarrow t$, so $s_n \to t$ by the squeeze theorem, and therefore $f(s_n)\to f(t)$ by right-continuity. But we also know that $f(s_n)-f(t) > \epsilon$ for all $n$. This is a contradiction; consequently, $g$ is right continuous. $\blacksquare$
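As a numerical sanity check of the claim, here is a minimal sketch with a hypothetical step function (the simplest cadlag path), approximating the running maximum on a fine grid:

```python
# A toy cadlag path (hypothetical): a step function with a single jump at t = 1,
# right-continuous at the jump since f(1) takes the post-jump value.
def f(t):
    return 0.5 if t < 1.0 else 2.0

def g(t, grid_size=10_000):
    # Running maximum sup_{s <= t} f(s), approximated on a grid of [0, t].
    return max(f(t * k / grid_size) for k in range(grid_size + 1))

# g inherits right-continuity at the jump: g(1 + h) -> g(1) as h -> 0+.
assert abs(g(1.0) - 2.0) < 1e-9
assert abs(g(1.0 + 1e-6) - g(1.0)) < 1e-9
# The left limit differs: g(1 - h) stays at 0.5, so g jumps (by 1.5) exactly where f does.
assert abs(g(1.0 - 1e-6) - 0.5) < 1e-9
```

This also illustrates the remark in the question: $g$ can only jump where $f$ jumps, and the jump of $g$ is witnessed from the right.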

As $X^m_t$ is right-continuous, it follows from the above that $Y^m_t$ is cadlag. Finally, $M^n_t$, for any $n$, is a finite maximum over some of these cadlag functions, and is therefore cadlag itself (to see this, you can use induction, along with the formula $\max\{x,y\} = \frac{x+y+|x-y|}{2}$).
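The identity driving that induction can be sanity-checked numerically; a quick sketch over a few arbitrary sample pairs:

```python
# Check the identity max{x, y} = (x + y + |x - y|) / 2, which reduces a maximum
# of two cadlag functions to sums and absolute values (both preserve cadlag).
import itertools

samples = [-2.0, -0.5, 0.0, 1.0, 3.5]  # arbitrary test values
for x, y in itertools.product(samples, repeat=2):
    assert max(x, y) == (x + y + abs(x - y)) / 2

identity_holds = True  # reached only if every assertion above passed
```

Since sums and absolute values of cadlag functions are cadlag, the identity immediately gives the induction step for finite maxima.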

To show that $M_t$ is cadlag, we first note that it is increasing in $t$: if $s \leq t$ then $M^n_s \leq M^n_t$ for each $n$, and we can take the limit. Therefore, the left and right limits $M_{s-}, M_{s+}$ are well-defined, since every increasing function admits right and left limits at each point.

To show right-continuity, we must prove that $M_{s+} = M_s$ almost surely. Suppose not. In that case, on a set of non-zero probability, $M_{s+} \neq M_s$; in particular, for some $\epsilon>0$, the set $\{M_{s+} - M_s > \epsilon\}$ has non-zero probability. On this set, if $M_{s+} - M_s > \epsilon$ then for any sequence $s_n \downarrow s$ we have $M_{s_n} \geq M_{s+} > M_s + \epsilon$ for every $n$.

However, by the definition of $M_{s_n}$ as a supremum, there exist $s'_n \leq s_n$ and a sequence of indices $m_n$ such that $$ |X^{m_n}_{s'_n}| > M_s+\frac{\epsilon}{2} \geq |X^{m_n}_{s}| + \frac{\epsilon}{2} \tag{1} $$

for all $n$, which we may rearrange and write as $$ |X^{m_n}_{s'_n}| - |X^{m_n}_{s}| > \frac{\epsilon}{2} \tag{2} $$

However, clearly $s'_n > s$: otherwise $|X^{m_n}_{s'_n}|$ would be absorbed under the supremum in the definition of $M_s$ and could not exceed $M_s$. Since $s < s'_n \leq s_n$ and $s_n \downarrow s$, the squeeze theorem gives $s'_n \to s$. Also note that, by the reverse triangle inequality, $$ |X^{m_n}_{s'_n} - X^{m_n}_s| \geq |X^{m_n}_{s'_n}| - |X^{m_n}_{s}| > \frac{\epsilon}{2} \tag{3(a)} $$

and $$ |X^{m_n}_{s'_n} - X^{m_n}_s| \leq |X^{m_n}_{s'_n} - X_{s'_n}| + |X_{s'_n} - X_{s}|+ |X_s - X^{m_n}_s| \tag{3(b)} $$

Claim: the sequence $m_n$ is not bounded.

Proof: Suppose that $m_n$ is a bounded sequence of natural numbers. Then there exists an $N$ such that $m_n \leq N$ for all $n$. Now, $M^N_t$ is a cadlag process a.s., so for the sequence $s'_n \downarrow s$ above we have $M^N_{s'_n} \to M^N_s$. However, $|X^{m_n}_{s'_n}| \leq M^N_{s'_n}$, so $$\limsup_{n \to \infty} |X^{m_n}_{s'_n}| \leq \limsup_{n \to \infty} M^N_{s'_n} = M^N_{s}.$$ On the other hand, from the first inequality in $(1)$ we get $$ \liminf_{n \to \infty} |X^{m_n}_{s'_n}| \geq M_{s}+\frac{\epsilon}{2} > M_s \geq M^N_{s} \geq \limsup_{n \to \infty} |X^{m_n}_{s'_n}|.$$ This is a contradiction, completing the proof. $\blacksquare$

Therefore, $m_n$ has a subsequence diverging to infinity. For ease of notation, we pass to this subsequence and call it $m_n$ again.

In that case, $X^{m_n} \to X$ ucp. Now look at the right-hand side of $(3(b))$. The term $|X_{s'_n}-X_s|$ goes to $0$ as $n \to \infty$, since $s'_n \downarrow s$ and $X$ is right-continuous. The terms $|X^{m_n}_{s'_n} - X_{s'_n}|$ and $|X^{m_n}_{s} - X_s|$ go to $0$ as $n \to \infty$ by the ucp convergence of $X^{m_n}$ to $X$. Hence the entire right-hand side of $(3(b))$ goes to $0$. But this contradicts $(3(a))$, which bounds the left-hand side of $(3(b))$ below by $\frac{\epsilon}{2}$.

Therefore, on the set where $M_{s+} \neq M_s$, either the right-continuity of the paths or the uniform-on-compacts convergence must fail. Since both hold with probability $1$, the set $\{M_{s+} \neq M_s\}$ must have probability zero.

Finally, $M_t$ must be a cadlag process.


Regarding the jumps of $M_t$: since $M$ is increasing, $$ M_{t-} = \sup_{n \geq 1} \sup_{s < t} |X_s^n|. $$

Now, suppose that $\Delta M_t = M_t - M_{t-} = \epsilon > 0$. This means that the supremum defining $M_t$ must be contributed at the time point $t$ itself, i.e. $\sup_n |X^n_t| = M_t$, so there is a sequence of indices $m_n$ such that $|X^{m_n}_t| - M_{t-} > \epsilon-\frac 1n$ for each $n$. Since $M_{t-} \geq |X^{m_n}_{t-}|$, the reverse triangle inequality gives $|\Delta X^{m_n}_t| \geq |X^{m_n}_t| - |X^{m_n}_{t-}| > \epsilon-\frac 1n$ for each $n$. Taking the supremum over $n$ on the left and letting $n \to \infty$ on the right, $$\sup_{n} |\Delta X^n_t| \geq \epsilon = \Delta M_t = |\Delta M_t|.$$
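This jump bound can be illustrated numerically; a minimal sketch with two hypothetical step paths that jump at the same time:

```python
# Toy illustration (hypothetical paths): two cadlag step paths on [0, 2] with
# jumps at t = 1; check |ΔM_t| <= sup_n |ΔX^n_t| at the common jump time.
def X1(t):
    return 1.0 if t < 1.0 else 1.4   # jump of size 0.4

def X2(t):
    return 0.2 if t < 1.0 else 1.9   # jump of size 1.7

paths = [X1, X2]

def M(t, grid_size=10_000):
    # M_t = sup_n sup_{s <= t} |X^n_s|, approximated on a grid of [0, t].
    return max(abs(X(t * k / grid_size)) for X in paths for k in range(grid_size + 1))

jump_M = M(1.0) - M(1.0 - 1e-6)    # ΔM at t = 1, numerically: 1.9 - 1.0 = 0.9
sup_jump_X = max(abs(X1(1.0) - X1(1.0 - 1e-6)),
                 abs(X2(1.0) - X2(1.0 - 1e-6)))  # sup_n |ΔX^n_1| = 1.7
assert jump_M <= sup_jump_X + 1e-9  # |ΔM| <= sup_n |ΔX^n|
```

Note that the inequality can be strict, as here: before the jump the supremum is carried by $X^1$, after it by $X^2$, so $\Delta M_1 = 0.9$ while $\sup_n |\Delta X^n_1| = 1.7$.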