[Math] Finding the first return time in an infinite Markov chain

markov-chains, probability, stochastic-processes

[Transition diagram (image not preserved): from state $0$ the chain moves to state $1$ with probability $1$; from each state $k \ge 1$ it moves to $k+1$ with probability $\frac{k}{k+1}$ and back to $0$ with probability $\frac{1}{k+1}$.]

I have an infinite Markov chain that looks like this. I need to show that the chain is recurrent by computing the first return time to $0$ for the chain started at $0$. Intuitively this makes sense to me, because any state will eventually lead back to state $0$. However, I am not quite sure how to formulate this formally, using the definitions of recurrence and first return time.

I need to calculate
$$
E(T_0 \mid X_0 = 0),
$$
where $T_0$ is the first passage time to state $0$.

I set up a system of equations using first-step analysis:

\begin{align}
T_{00} &= E(T_0 \mid X_0 = 0) = 1 + T_{10} \\
T_{10} &= E(T_0 \mid X_0 = 1) = 1 + \tfrac{1}{2}\,T_{20}
\end{align}

and in general $T_{k0} = 1 + \frac{k}{k+1}\,T_{(k+1)0}$ for $k \ge 1$.

However, I realized that I will have an infinite number of equations like this, and I am not quite sure how to approach that.
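As a numerical sanity check (only a rough sketch, assuming the transition probabilities from the diagram, i.e. $p(k, k+1) = \frac{k}{k+1}$ and $p(k, 0) = \frac{1}{k+1}$), I tried truncating the system at a state $N$ by setting $T_{N0} = 0$ and solving backwards:

```python
# Numerical experiment (not a proof): truncate the infinite first-step system
# at state N by forcing T_{N,0} = 0, then solve backwards.  Transition
# probabilities are assumed from the diagram: p(k,k+1) = k/(k+1), p(k,0) = 1/(k+1).

def truncated_T00(N: int) -> float:
    T = 0.0                               # truncation: T_{N,0} := 0
    for k in range(N - 1, 0, -1):
        T = 1.0 + (k / (k + 1)) * T       # T_{k,0} = 1 + k/(k+1) * T_{k+1,0}
    return 1.0 + T                        # T_{0,0} = 1 + T_{1,0}

for N in (10, 100, 1_000, 10_000, 100_000):
    print(f"N = {N:>6}:  truncated T00 = {truncated_T00(N):.3f}")
```

Each tenfold increase in $N$ seems to add about $2.3$ to the answer, which makes me suspect the expectation is infinite, but I don't know how to turn this into a rigorous argument.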

Do I just have a completely wrong understanding of what "first return time" is? Thanks for any help!

Best Answer

To figure out how recurrent this Markov chain is, you'll probably want to know two things:

  1. The probability that, starting at $0$, you'll ever return to $0$.
  2. The expected number of steps it takes to return to $0$.

In this Markov chain, it's very clear what the path has to be if we never return to $0$, and therefore (1) is easy to solve. Let $T_0$ be the number of steps to return to $0$, with $T_0 = \infty$ if we never do. Then
\begin{align}
\Pr[T_0 \ge 1] &= 1 & \text{($T_0$ can never be $0$)} \\
\Pr[T_0 \ge 2] &= 1 & \text{(going $0 \to 1$)} \\
\Pr[T_0 \ge 3] &= 1 \cdot \tfrac12 & \text{(going $0 \to 1 \to 2$)} \\
\Pr[T_0 \ge 4] &= 1 \cdot \tfrac12 \cdot \tfrac23 = \tfrac13 & \text{(going $0 \to 1 \to 2 \to 3$)} \\
\Pr[T_0 \ge k+1] &= 1 \cdot \tfrac12 \cdots \tfrac{k-1}{k} = \tfrac1k & \text{(going $0 \to 1 \to \dots \to k$)}
\end{align}
and because $\lim_{k \to \infty} \Pr[T_0 \ge k] = \lim_{k \to \infty} \frac{1}{k-1} = 0$, we know that $\Pr[T_0 = \infty] = 0$: with probability $1$, we do return to $0$ eventually.
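If you want to convince yourself of these tail probabilities empirically, here is a rough Monte Carlo sketch (assuming, as above, that state $k \ge 1$ moves to $k+1$ with probability $\frac{k}{k+1}$ and back to $0$ otherwise):

```python
# Monte Carlo check: empirical tail frequencies of T0 against the exact
# values P(T0 >= k) = 1/(k-1) derived above.
import random

def return_time(rng: random.Random) -> int:
    state, steps = 1, 1                      # the first step is always 0 -> 1
    while rng.random() < state / (state + 1):
        state, steps = state + 1, steps + 1  # move one state further from 0
    return steps + 1                         # final step back to 0

rng = random.Random(0)
n = 200_000
times = [return_time(rng) for _ in range(n)]
for k in (3, 4, 5, 10, 100):
    print(f"P(T0 >= {k:>3}):  empirical {sum(t >= k for t in times) / n:.4f}"
          f"   exact {1 / (k - 1):.4f}")
```

Note that every simulated run does terminate, consistent with $\Pr[T_0 = \infty] = 0$, but occasionally a run wanders very far from $0$ before returning, which is a hint of what point (2) will reveal.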

To figure out (2), the expected number of steps it takes to return to $0$, it's easiest to use the tail-sum formula for a random variable taking values in the positive integers:
$$
\mathbb E[X] = \sum_{k=1}^\infty k \cdot \Pr[X=k] = \sum_{k=1}^\infty \sum_{j=1}^k \Pr[X=k] = \sum_{j=1}^\infty \sum_{k=j}^\infty \Pr[X=k] = \sum_{j=1}^\infty \Pr[X \ge j].
$$
In this case,
$$
\sum_{j=1}^\infty \Pr[T_0 \ge j] = 1 + 1 + \frac12 + \frac13 + \frac14 + \frac15 + \dotsb,
$$
which is the harmonic series (with an extra leading $1$) and therefore diverges. So $\mathbb E[T_0] = \infty$, which means that the Markov chain is null-recurrent: it returns to $0$ with probability $1$, but the expected number of steps until it does so is infinite.
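Null recurrence also has a visible signature in simulation. Here is a rough sketch under the same assumed transition probabilities (reusing the same kind of return_time helper as in the snippet above): since $\mathbb E[T_0] = \infty$, the running average of simulated return times never stabilizes, and for this chain it creeps upward roughly like $\log n$.

```python
# With E[T0] infinite, the sample mean of simulated return times does not
# converge: it keeps creeping upward (roughly like log n for this chain).
import random

def return_time(rng: random.Random) -> int:
    state, steps = 1, 1                      # the first step is always 0 -> 1
    while rng.random() < state / (state + 1):
        state, steps = state + 1, steps + 1  # move one state further from 0
    return steps + 1                         # final step back to 0

rng = random.Random(1)
total = 0
for n in range(1, 1_000_001):
    total += return_time(rng)
    if n % 200_000 == 0:
        print(f"n = {n:>9,}:  running mean = {total / n:.2f}")
```

This is exactly the behavior of a null-recurrent chain: each run comes back, yet no finite average describes how long that takes.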