Solved – Proving a non-stopping time

markov-process, stochastic-processes

Let me begin by confirming that this is indeed the correct place to post this (the other option I considered was math.SE). That said,


Let $X_n$ be a Markov chain on the state space $\mathcal S$, and for $y \in \mathcal S$ let $T_y = \min\{ n \ge 1 : X_n = y\}$ be the first return time to $y$. Let $W_y = T_y - 1$ be the time just before the first return to $y$.

  • Explain why $W_y$ is not a stopping time

  • Show that the Strong Markov Property does not apply to $X_n$ at random time $W_y$.

My Work

When showing that $W_y$ is not a stopping time, is it sufficient to write $$\{W_y = n-1\} = \bigcap_{i = 1}^{n-1} \{X_i \ne y\} \cap \{X_n = y\}$$ and claim that, since this event depends on $X_n$ and not only on $X_0, X_1, \dots, X_{n-1}$, the random time $W_y$ is not a stopping time?


Then, for showing that the Strong Markov Property does not apply, can I write $$\mathbf{P}(X_n = y \mid W_y = n-1, X_{n-1} = i, X_{n-2} = x_{n-2}, \dots, X_0 = y) = 1 \ne p(i, y)$$ where $p(i,y)$ is the one step transition probability from $i$ to $y$?

Best Answer

First question. Denoting by $({\cal F}_n)$ the filtration generated by the process $(X_n)$, we will prove that the event $A:=\{W_y= 1\}$ does not belong to the $\sigma$-field ${\cal F}_1$. This implies that $W_y$ is not a stopping time. One has $A=\{X_1 \neq y\} \cap \{X_2=y\}$. If $A$ did belong to ${\cal F}_1$, then one would have $\Pr(A \mid {\cal F}_1)={\boldsymbol 1}_A$. But one has $\Pr(A \mid {\cal F}_1) = p(X_1,y){\boldsymbol 1}_{X_1 \neq y}$, which in general takes values strictly between $0$ and $1$ and hence is not an indicator, a contradiction.
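A quick simulation can make this concrete. Below is a minimal sketch with a hypothetical two-state chain on $\{0,1\}$ (the matrix `P` and the choice $y=0$ are my own illustrative assumptions, not from the question): we estimate $\Pr(W_y = 1 \mid X_1 \ne y)$, i.e. the chance that $X_2 = y$ given $X_1 \ne y$. If $A = \{W_y = 1\}$ were ${\cal F}_1$-measurable, this conditional probability would have to be $0$ or $1$; instead it comes out near $p(1,0) = 0.4$.

```python
import random

random.seed(0)

# Hypothetical two-state chain on {0, 1}; take y = 0.
# P[i][j] = p(i, j), the one-step transition probability.
P = [[0.3, 0.7],
     [0.4, 0.6]]
y = 0

def step(i):
    """Draw the next state from row i of P."""
    return 0 if random.random() < P[i][0] else 1

# Estimate Pr(W_y = 1 | X_1 != y): among paths started at X_0 = y
# with X_1 != y, how often does X_2 = y?  If A = {W_y = 1} belonged
# to F_1, this conditional probability would be 0 or 1.
hits = total = 0
for _ in range(200_000):
    x1 = step(y)            # X_0 = y, so X_1 ~ p(y, .)
    if x1 != y:
        total += 1
        if step(x1) == y:   # X_2 = y  <=>  W_y = 1 on {X_1 != y}
            hits += 1

print(hits / total)  # close to p(1, 0) = 0.4, strictly inside (0, 1)
```

The estimate sitting strictly between $0$ and $1$ is exactly the obstruction: knowing the path up to time $1$ does not decide whether $W_y = 1$.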

Second question. Writing $W = W_y$, the process $(X_{W+k})_{k \geq 0}$ is Markovian, but it does not have the same transition probabilities as the process $(X_n)$. Indeed, putting $Y_k=X_{W+k}$, the conditional distribution of $Y_1$ given $Y_0$ is the Dirac distribution at $y$. Hence the strong Markov property of $(X_n)$ does not apply at the random time $W$.
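The Dirac claim can also be checked by simulation. In this sketch (same hypothetical two-state chain and $y = 0$ as assumed above, not taken from the answer), we simulate paths from $X_0 = y$ up to the first return time $T_y$; since $Y_1 = X_{W+1} = X_{T_y}$, it equals $y$ on every sample path, even though $p(i, y) < 1$ for every state $i$.

```python
import random

random.seed(1)

# Hypothetical two-state chain on {0, 1}; take y = 0.
P = [[0.3, 0.7],
     [0.4, 0.6]]
y = 0

def step(i):
    """Draw the next state from row i of P."""
    return 0 if random.random() < P[i][0] else 1

def path_to_first_return(y):
    """Simulate X_0 = y, X_1, ... up to the first return time T_y."""
    path = [y, step(y)]
    while path[-1] != y:
        path.append(step(path[-1]))
    return path

# With W = T_y - 1 and Y_k = X_{W+k}, the state Y_1 = X_{T_y} is the
# last entry of each path, so it equals y by construction of W.
paths = [path_to_first_return(y) for _ in range(10_000)]
always_y = all(p[-1] == y for p in paths)

print(always_y)  # True: Y_1 is Dirac at y, yet p(i, y) < 1 for all i
```

That $Y_1 = y$ with probability $1$, while no row of the kernel puts mass $1$ on $y$, is precisely why the post-$W$ chain cannot share the transitions of $(X_n)$.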
