How is the holding time $H_i$ defined in a continuous-time Markov chain?

definition, markov-chains, random-variables, stochastic-processes

I'm reading the lecture notes *Continuous-Time Markov Chains*:


A stochastic process $\{X(t): t \geq 0\}$ on $(\Omega, \mathcal F, \mathbb P)$ with discrete state space $\mathcal{S}$ is called a continuous-time Markov chain if for all $(t,s,i,j) \in \mathbb R_+^2 \times \mathcal{S}^2$, we have
$$
\mathbb P[X(s+t)=j | X(s)=i,\{X(u): 0 \leq u<s\}]=\mathbb P[X(s+t)=j | X(s)=i]=P_{i j}(t)
$$

Assume $\mathcal{S}=\mathbb{Z}=\{\cdots,-2,-1,0,1,2, \cdots\}$. Suppose now that whenever a chain enters state $i \in \mathcal{S}$, independent of the past, the length of time spent in state $i$ is a continuous, strictly positive (and proper) random variable $H_{i}$ called the holding time in state $i$.


Could you please explain how the random variable $H_i:\Omega \to \mathbb R$ is defined, i.e., what is $H_i(\omega)$?

Best Answer

I've found the answer and am posting it here to close this question.


The sequence of *jump times* $(T_n)_{n \in \mathbb N}$ is defined recursively by $T_0 = 0$ and $T_{n+1}=\inf \left\{t \ge T_n \mid X(t) \neq X(T_n)\right\}$ with $\inf \emptyset = \infty$. Notice that $T_n = \infty$ implies $T_{n + k} = \infty$ for all $k \in \mathbb N$. The times between consecutive jump times, $T_{n+1} - T_n$, are called *holding times*. The *embedded chain* $(Y_n)_{n \in \mathbb N}$ is given by

$$Y_{n}=\begin{cases} X(T_n) & \text{if } T_{n}<\infty \\ \Delta & \text{if } T_{n}=\infty \end{cases} \quad \text{for all } n \in \mathbb{N},$$

where $\Delta$ is an arbitrary element not in $\mathcal{S}$. In this notation, the holding time in state $i$ is realised pathwise as $H_i(\omega) = T_{n+1}(\omega) - T_n(\omega)$ for any $n$ with $Y_n(\omega) = i$: it is the length of time the path $t \mapsto X(t)(\omega)$ spends in state $i$ after entering it at time $T_n(\omega)$.
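
As a sanity check on these definitions, consider the standard example of a rate-$\lambda$ Poisson process started at $0$ (its state space $\mathbb N$ sits inside $\mathcal S = \mathbb Z$): there $T_n$ is the $n$-th arrival time, each holding time $T_{n+1} - T_n$ is $\operatorname{Exp}(\lambda)$ (the Markov property forces holding times to be exponential; in this example the rate is the same in every state), and the embedded chain is the deterministic walk $Y_n = n$.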
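
To make the pathwise picture concrete, here is a minimal Python sketch (my own illustration, not part of the quoted notes) that generates one sample point $\omega$ and reads off $T_n(\omega)$, the holding times, and $Y_n(\omega)$. The constant holding rate $q_i \equiv 1$ and the symmetric nearest-neighbour jump kernel on $\mathbb Z$ are assumptions chosen only to have a concrete chain to simulate.

```python
import random

def q(i: int) -> float:
    """Holding rate in state i (constant rate 1 is an assumption of this sketch)."""
    return 1.0

def jump(i: int) -> int:
    """One step of the embedded chain: a symmetric step on Z (also an assumption)."""
    return i + random.choice((-1, 1))

def simulate_path(x0: int, n_jumps: int):
    """Simulate one sample point omega: return the jump times T_n,
    the holding times T_{n+1} - T_n, and the embedded chain Y_n."""
    T = [0.0]   # T_0 = 0
    Y = [x0]    # Y_0 = X(T_0)
    H = []      # holding times
    for _ in range(n_jumps):
        i = Y[-1]
        h = random.expovariate(q(i))  # H_i(omega) ~ Exp(q_i), by memorylessness
        H.append(h)
        T.append(T[-1] + h)           # T_{n+1} = T_n + H_i
        Y.append(jump(i))             # state entered at time T_{n+1}
    return T, H, Y

T, H, Y = simulate_path(x0=0, n_jumps=5)
print("jump times T_n    :", [round(t, 3) for t in T])
print("holding times     :", [round(h, 3) for h in H])
print("embedded chain Y_n:", Y)
```

Each printed holding time is exactly one realisation $H_i(\omega)$: the gap between the instant the path enters state $i$ and the instant it next leaves.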