[Math] The expected number of visits before hitting zero in simple random walk

markov chains, probability theory

I am learning Markov chains and encountered the following problem:
Suppose a simple random walk starts from state $k$. What is the expected number of visits to $k$ before we hit $0$? The book does not say whether the visit at time $0$ counts, but I assume it does.

This question has a first part: show that the expected number of visits to $k$ between two successive visits to $0$ is $1$ for all $k$. That part is solved by defining the stationary measure $\mu(x) = $ the expected number of visits to $x$ before the walk returns to $0$, starting from $0$; this is well defined because the starting state $0$ is recurrent, and it is Theorem 6.5.2 in Rick Durrett's book. Theorem 6.5.3 asserts that if the chain is irreducible and recurrent, then the stationary measure is unique up to constant multiples. The constant measure $\lambda(x) \equiv 1$ is stationary for simple random walk, since $\frac{1}{2}\lambda(y-1) + \frac{1}{2}\lambda(y+1) = \lambda(y)$, so $\mu$ must be constant. But $\mu(0) = 1$ (only the visit at time $0$ itself), which implies $\mu(k) = 1$ for all $k \neq 0$.
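The first part is also easy to sanity check numerically. Below is a short Monte Carlo sketch (my own Python, not from Durrett; the helper name `excursion_visits` is made up) that estimates the expected number of visits to $k$ in a single excursion of the walk away from $0$. To keep each sample finite it does not simulate the up-excursions above $k$ step by step: by recurrence the walk returns to $k$ with probability $1$ before it can reach $0$, so each such excursion contributes exactly one more visit.

```python
import random

def excursion_visits(k, rng=random):
    """Starting the walk at 0, count the visits to state k > 0 before the
    walk first returns to 0 (i.e. during one excursion).

    Up-excursions above k are not simulated: by recurrence the walk comes
    back to k with probability 1 before it can reach 0, so each such
    excursion just produces one more visit to k."""
    if rng.random() < 0.5:
        return 0                 # first step goes to -1: a negative excursion
    x, visits = 1, 0
    while True:
        while 0 < x < k:         # run the walk strictly between 0 and k
            x += 1 if rng.random() < 0.5 else -1
        if x == 0:
            return visits        # back at 0: the excursion is over
        visits += 1              # the walk is at k
        if rng.random() >= 0.5:
            x = k - 1            # step down from k
        # else: step up from k; the walk returns to k a.s. (counted next pass)

trials = 200_000
for k in (1, 2, 5):
    est = sum(excursion_visits(k) for _ in range(trials)) / trials
    print(f"k={k}: estimated visits per excursion = {est:.3f} (first part says 1)")
```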

Thank you.

Best Answer

WLOG, assume $k > 0$.

(By induction.) When $k = 1$: whenever the walk visits $1$, with probability $1/2$ it steps up to $2$, and by recurrence it then returns to $1$ before it can hit $0$; with probability $1/2$ it steps to $0$ and the count ends. So, writing $\mu_0$ for the measure from the first part, the expectation is $\mu_0(1) + \frac{1}{2}\mu_0(1) + \frac{1}{2^2}\mu_0(1) + \cdots = 2$, since $\mu_0(1) = 1$.

Now suppose that starting from $k-1$ the expected number of visits is $2(k-1)$. Starting from $k$, the walk must pass through $1$ before it can hit $0$, so split the visits to $k$ at the first visit to $1$.

Visits to $k$ before the first visit to $1$: shifting every state down by $1$, this count has the same distribution as the number of visits to $k-1$ before hitting $0$ when starting from $k-1$, so its expectation is $E_{k-1} = 2(k-1)$ by the induction hypothesis.

Visits to $k$ from the first visit to $1$ onward: starting from $0$, the walk steps to $1$ with probability $1/2$, and only in that case can it visit $k > 0$ before returning to $0$; hence $\mu_0(k) = \frac{1}{2} E_1[\text{visits to } k \text{ before hitting } 0]$. So this piece has expectation $2\mu_0(k) = \mu_0(k) + \frac{1}{2}\mu_0(k) + \frac{1}{2^2}\mu_0(k) + \cdots = 2$, since $\mu_0(k) = 1$.

Adding the two pieces, the expectation starting from $k$ is $E_{k-1} + 2\mu_0(k) = 2(k-1) + 2 = 2k$.
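As a sanity check on this answer (again a sketch of my own, not part of the book's argument), the same excursion-skipping trick gives a fast, exact simulation of the number of visits to $k$ before the walk hits $0$; the printed estimates should cluster around $2k$.

```python
import random

def visits_before_zero(k, rng=random):
    """Starting from k > 0, count the visits to k (the start counts as one)
    before the walk first hits 0.

    As in the argument above, an up-step from k is not simulated: the walk
    returns to k with probability 1 before it can hit 0, adding one visit."""
    visits = 1                   # the visit at time 0
    while True:
        if rng.random() < 0.5:
            visits += 1          # step up; back at k a.s. before hitting 0
            continue
        x = k - 1                # step down; run the walk until it hits 0 or k
        while 0 < x < k:
            x += 1 if rng.random() < 0.5 else -1
        if x == 0:
            return visits
        visits += 1              # climbed back up to k

trials = 200_000
for k in (1, 2, 3, 5):
    est = sum(visits_before_zero(k) for _ in range(trials)) / trials
    print(f"k={k}: estimated expectation = {est:.2f} (claimed 2k = {2*k})")
```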