[Math] Stopping time in Markov chains

markov-chains, probability-theory, stopping-times

A random variable $T : \Omega \rightarrow \{1,2,3,\ldots\} \cup \{\infty\}$ is called a stopping time if the event $\{T=n\}$ depends only on $X_0, X_1, X_2, \ldots, X_n$ for $n = 0,1,2,\ldots$

I have trouble understanding this definition. What kind of dependence are we talking about? What does this stopping time signify?

Best Answer

This means that, based on the information about $X_0, X_1, \ldots, X_n$, you can be sure whether $\{T=n\}$ has occurred or not. In other words, once $X_0, X_1, \ldots, X_n$ have been realized, i.e. once we are at time $n$, we can decide whether the event $\{T=n\}$ holds. The usual form of a stopping time is $$T=\inf\{n\in \mathbb N: \text{event } E \text{ has occurred}\}$$ so the dependence condition says that the event $E$, upon whose occurrence you stop, is determined by the first $n$ time periods only. Examples

  1. Stopping time: $T_y=$ the first time you return/visit a specific state $y$.
  2. Non-stopping time: $W_y=T_y-1$. To decide whether $\{W_y=n\}$ has occurred you must know whether $X_{n+1}=y$, so $W_y$ depends also on $X_{n+1}$ and is therefore not a stopping time (see the sketch after this list).
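
To make example 1 concrete, here is a minimal simulation sketch (not part of the original answer): it runs a small Markov chain on three states with an arbitrary, illustrative transition matrix `P`, and computes the hitting time $T_y$. The point is that whether $\{T_y=n\}$ has occurred is decided after observing only $X_0,\ldots,X_n$, which is exactly the stopping-time property.

```python
import random

# Illustrative transition matrix on states {0, 1, 2} (an arbitrary choice,
# not taken from the question or answer).
P = {
    0: [(0, 0.5), (1, 0.3), (2, 0.2)],
    1: [(0, 0.2), (1, 0.5), (2, 0.3)],
    2: [(0, 0.3), (1, 0.3), (2, 0.4)],
}

def step(x):
    """Sample the next state from the row P[x]."""
    states, weights = zip(*P[x])
    return random.choices(states, weights=weights, k=1)[0]

def hitting_time(x0, y, max_steps=10_000):
    """Return T_y = inf{n >= 1 : X_n = y}, or None if y is not reached.

    Whether {T_y = n} has occurred is checked using only X_0, ..., X_n:
    no future value of the chain is ever consulted.
    """
    x = x0
    for n in range(1, max_steps + 1):
        x = step(x)          # reveal X_n
        if x == y:           # {T_y = n} is decided by X_0, ..., X_n alone
            return n
    return None

print(hitting_time(x0=0, y=2))
```

By contrast, a routine for $W_y = T_y - 1$ would have to peek at $X_{n+1}$ before it could declare $\{W_y = n\}$, which is why $W_y$ fails the definition.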