Return time of a Markov chain


I'm having trouble deriving the return time for a Markov chain. The graph is a path: $n$ vertices connected by $n - 1$ edges, so we can draw it as a horizontal line of nodes with node $1$ all the way to the left and node $n$ all the way to the right. At each intermediate node $i$ the chain moves to node $i - 1$ with probability $1/2$ and to node $i + 1$ with probability $1/2$; at nodes $1$ and $n$ it moves with probability $1$ to nodes $2$ and $n - 1$, respectively.
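For reference, here is that transition structure written out in code (a minimal sketch; the 0-indexing of states is a convention I'm adding, not part of the question):

```python
import numpy as np

def transition_matrix(n):
    """Transition matrix of the reflecting walk on the path 1, ..., n
    (rows/columns are 0-indexed, so index i corresponds to node i + 1)."""
    P = np.zeros((n, n))
    P[0, 1] = 1.0           # node 1 always moves to node 2
    P[n - 1, n - 2] = 1.0   # node n always moves to node n - 1
    for i in range(1, n - 1):
        P[i, i - 1] = 0.5   # interior node: step left ...
        P[i, i + 1] = 0.5   # ... or right, each with probability 1/2
    return P

print(transition_matrix(4))
```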

I have to derive an expression for the expected time to return to node $1$, starting from node $1$.

I'm just having trouble getting started, so any hints in the right direction would be very helpful.

Thanks!

Best Answer

Let $T = \min\{t \ge 0 : X_t = 1\}$ be the first time that you reach state 1, and let $h(i) = E_i T$ be the expected amount of time to reach state 1 when starting from state $i$. (As we've defined it, $h(1) = 0$; don't worry about this for now.)

By conditioning on $X_1$ (i.e. using the fact that $E_i T = \sum_j E_i[T \mid X_1 = j]\, P_i(X_1 = j)$) together with the Markov property, find a relationship between $h(i)$ and $h(i \pm 1)$. (Be careful to note the special cases when $i = 1$ or $i = n$.) This will give you a system of $n$ linear equations, which you can solve to find the $h(i)$.
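Concretely, the first-step conditioning produces a system of the following shape (I'm writing out the equations here; solving them is still left to you):

$$h(1) = 0, \qquad h(i) = 1 + \tfrac{1}{2}\,h(i-1) + \tfrac{1}{2}\,h(i+1) \quad (2 \le i \le n-1), \qquad h(n) = 1 + h(n-1).$$

One standard way to solve such a system is to pass to the differences $h(i) - h(i+1)$, which satisfy a simple first-order recurrence.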

Finally, when starting from state 1, the first step must be to state 2. So the expected return time is simply $h(2) + 1$.
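If you want to sanity-check whatever closed form you end up with, a quick Monte Carlo estimate works well; here is a minimal sketch (the function name and trial count are illustrative):

```python
import random

def expected_return_time(n, trials=100_000):
    """Monte Carlo estimate of the expected return time to node 1
    for the reflecting random walk on the path 1, 2, ..., n."""
    total = 0
    for _ in range(trials):
        # From node 1 the first step is forced to node 2.
        pos, steps = 2, 1
        while pos != 1:
            if pos == n:                     # reflecting barrier on the right
                pos = n - 1
            else:                            # interior node: left or right
                pos += random.choice((-1, 1))
            steps += 1
        total += steps
    return total / trials

# Compare these estimates against the formula you derive.
for n in (3, 5, 10):
    print(n, expected_return_time(n))
```

For small $n$ you can also solve the linear system above exactly and compare the two answers directly.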