[Math] Markov chain: relation between hitting time and transition probabilities

Tags: markov-chains, stochastic-processes

Let $X_n$ be a Markov chain on a countable state space $E$. For $x\in E$, let $\tau_x:=\inf\{n\geq 1 \mid X_n=x\}$ be the first hitting time of $x$.

What can be said about the relation between the transition matrix $p(x,y)$ of the Markov chain and the distribution of the first hitting time, $\mathbb P(\tau_x =k\mid X_0=y)$?

Is there a general explicit relation, or is there a non-trivial class of Markov chains for which the relation can be calculated explicitly?

Best Answer

For every $k\geqslant1$ and $x$ and $y$ in $E$, let $h_k(x,y)=\mathbb P(\tau_y=k\mid X_0=x)$, then $h_1(x,y)=p(x,y)$ and, for every $k\geqslant1$, $h_{k+1}(x,y)=\sum\limits_{z\ne y}p(x,z)h_k(z,y)$.
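The recursion is a first-step analysis: condition on the first move $X_1=z$ and use the Markov property. A sketch of the derivation, in the notation above:

```latex
\begin{align*}
h_{k+1}(x,y) &= \mathbb P(\tau_y = k+1 \mid X_0 = x) \\
&= \sum_{z\in E} p(x,z)\,\mathbb P(\tau_y = k+1 \mid X_1 = z)
  && \text{(condition on the first step)} \\
&= \sum_{z\ne y} p(x,z)\,\mathbb P(\tau_y = k \mid X_0 = z)
  && \text{(Markov property; } z=y \text{ would give } \tau_y = 1\text{)} \\
&= \sum_{z\ne y} p(x,z)\,h_k(z,y).
\end{align*}
```

The term $z=y$ is excluded because hitting $y$ on the first step means $\tau_y=1$, not $k+1$.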

Consider the matrices $p=(p(x,y))_{(x,y)\in E\times E}$, $h_k=(h_k(x,y))_{(x,y)\in E\times E}$ for every $k\geqslant1$, and $d_k$ the diagonal matrix with the diagonal of $h_k$. Then, $h_1=p$ and, for every $k\geqslant1$, $h_{k+1}=ph_k-pd_k$.
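As a sanity check, the matrix recursion $h_1=p$, $h_{k+1}=ph_k-pd_k$ can be verified numerically against an independent "taboo" computation: $h_k(x,y)$ is the probability of a path that avoids $y$ for the first $k-1$ steps and then lands on $y$, i.e. $(B^{k-1}p)(x,y)$ where $B$ is $p$ with column $y$ zeroed. A minimal sketch (the 3-state matrix is an arbitrary example, not from the question):

```python
import numpy as np

# An arbitrary 3-state transition matrix (rows sum to 1), just for illustration.
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.1, 0.5],
    [0.3, 0.3, 0.4],
])
n = P.shape[0]

def hitting_matrices(P, K):
    """Recursion from the answer: h_1 = P and h_{k+1} = P h_k - P d_k,
    where d_k is the diagonal matrix carrying the diagonal of h_k."""
    hs = [P.copy()]
    for _ in range(K - 1):
        h = hs[-1]
        d = np.diag(np.diag(h))
        hs.append(P @ h - P @ d)
    return hs  # hs[k-1][x, y] = P(tau_y = k | X_0 = x)

def h_direct(P, y, k):
    """Independent check via taboo probabilities: zero out column y so the
    chain cannot visit y during the first k-1 steps, then jump to y."""
    B = P.copy()
    B[:, y] = 0.0
    return np.linalg.matrix_power(B, k - 1) @ P[:, y]

K = 6
hs = hitting_matrices(P, K)
for k in range(1, K + 1):
    for y in range(n):
        assert np.allclose(hs[k - 1][:, y], h_direct(P, y, k))
```

Both computations agree for every target state $y$ and every $k$, which confirms the matrix form of the recursion on this example.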
