Write probability of first return time in terms of first hitting time

markov-chains, probability, probability-theory, stochastic-processes, stopping-times

Let $(X_n)_{n\ge 0}$ be a time-homogeneous Markov chain with state space $I$ and no self-loops. Given $X_0 = i \in I$, define the first return time $T_i = \inf\{n\ge 1 : X_n = i\}$ and the first hitting time $H_i = \inf\{n\ge 0 : X_n = i\}$. I want to see whether the following equalities hold:
$$
P(T_i < \infty | X_0 = i) = \sum_{j \in I} P(T_i < \infty | X_1 = j , X_0 = i)P(X_1=j|X_0=i)
$$

$$
= \sum_{j \in I} P(T_i < \infty | X_1 = j)P(X_1=j|X_0=i)
$$

$$
= \sum_{j \in I} P(H_i < \infty |X_0 = j)P(X_1=j|X_0=i)
$$

The first equality seems to hold by the law of total probability: under $P(\,\cdot \mid X_0 = i)$, the event $\{T_i<\infty\}$ is the disjoint union $\bigcup_{j\in I} \{T_i<\infty\}\cap\{X_1=j\}$.

The second equality seems to hold because of the Markov property.

The third equality seems to hold because of time homogeneity and, seemingly, $P(H_i<\infty \mid X_0 = j) = P(T_i<\infty \mid X_0 = j)$ for $j \neq i$, but I could not prove it.

Best Answer

I'd write it like this:
$$
P_i(T_i < \infty) = \sum_{j \in I} P_i(T_i < \infty \mid X_1 = j)\,P_i(X_1 = j).
$$
Now, in order to use the Markov property, we write $T_i$ as $T_i(X_{0 + \cdot})$, i.e. as a function of the whole path. Note that $T_i(X_{0 + \cdot}) = H_i(X_{1 + \cdot}) + 1$, so $\{T_i(X_{0 + \cdot}) < \infty\} = \{H_i(X_{1 + \cdot}) < \infty\}$. Now we can formally apply the Markov property to get
\begin{align}
P_i(T_i(X_{0 + \cdot}) < \infty) &= \sum_{j \in I} P_i(T_i(X_{0 + \cdot}) < \infty \mid X_1 = j)\,P_i(X_1 = j) \\
&= \sum_{j \in I} P_i(H_i(X_{1 + \cdot}) < \infty \mid X_1 = j)\,p(i,j) \\
&= \sum_{j \in I} P_j(H_i(X_{0 + \cdot}) < \infty)\,p(i, j) \\
&= \sum_{j \in I} P_j(H_i < \infty)\,p(i, j).
\end{align}
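As a numerical sanity check of the final identity $P_i(T_i<\infty) = \sum_j P_j(H_i<\infty)\,p(i,j)$, here is a small simulation on a hypothetical 4-state chain (my own toy example, not from the question) where the return probability is strictly less than 1. A Monte Carlo estimate of the return probability is compared against the value computed from the right-hand side, with the hitting probabilities $h_j = P_j(H_0<\infty)$ worked out by hand for this chain:

```python
import random

random.seed(0)

# Hypothetical 4-state chain (states 0..3), no self-loops:
# from 0: go to 1 or 2 with probability 1/2 each;
# from 1: back to 0 w.p. 0.7, to 2 w.p. 0.3;
# states 2 and 3 cycle between each other forever (a trap that never hits 0).
P = {0: [(1, 0.5), (2, 0.5)],
     1: [(0, 0.7), (2, 0.3)],
     2: [(3, 1.0)],
     3: [(2, 1.0)]}

def step(state):
    """Sample the next state from the transition distribution of `state`."""
    u, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if u < acc:
            return nxt
    return P[state][-1][0]

# Monte Carlo estimate of P_0(T_0 < infinity), capping the path length.
# (For this chain a path either returns to 0 by step 2 or enters the 2<->3 trap.)
N, cap, returns = 200_000, 50, 0
for _ in range(N):
    s = 0
    for _ in range(cap):
        s = step(s)
        if s == 0:
            returns += 1
            break
mc = returns / N

# Exact value via the identity P_0(T_0<inf) = sum_j p(0,j) * P_j(H_0<inf).
# The hitting probabilities h_j = P_j(H_0<inf) here are h_0 = 1,
# h_1 = 0.7*h_0 + 0.3*h_2, h_2 = h_3 = 0 (minimal non-negative solution),
# so h = (1, 0.7, 0, 0) and the sum is 0.5*0.7 + 0.5*0 = 0.35.
h = [1.0, 0.7, 0.0, 0.0]
exact = sum(p * h[j] for j, p in P[0])

print(f"Monte Carlo: {mc:.4f}, identity: {exact:.4f}")
```

The two numbers agree up to simulation noise, which is consistent with the decomposition derived above.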
