[Math] Mean return time in Markov chain

markov-chains, markov-process, probability, random-variables

Given the following Markov chain:

$p_{0,1}=1$ (if we are in state 0, we must go to state 1)

$p_{i,i+1}=p_{i,i-1}=0.5$ for every $i\ge 1$ (from any other state, we move one step up or down with equal probability)

The state space is countably infinite (states $0, 1, 2, \dots$).

I assume that $X_0=0$ and define $T=\inf\{n>0 : X_n=0\}$, the return time to state $0$.

I want to calculate $P(T=k)$ (my goal is to calculate the expected value of $T$). I think that $P(T=k)=0$ if $k$ is odd, but I have no idea how to calculate it exactly. I tried to think of $T$ as an "almost" geometric law, with probability $1/2^k$ of "winning" at each step, but the probability is not fixed, so I am not even sure it is well defined…
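For the odd case, a parity argument seems to settle it: every transition changes the state by exactly $1$, so

$$X_0 = 0 \ \text{ and }\ |X_{n+1}-X_n| = 1 \quad\Longrightarrow\quad X_n \equiv n \pmod 2,$$

and $X_k = 0$ is then impossible for odd $k$, giving $P(T=k)=0$.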

Best Answer

It is easier to calculate the expectation directly than it is to write down the distribution. This is usually the case.

Since you haven't said much about what you have tried yet, let me start with a hint. Try to calculate the expected time to hit $\{0,n\}$ starting from $x$ with $0<x<n$. Then send $n \to \infty$. You can set up a system of equations for this problem by conditioning on the first step, using the total expectation formula and the Markov property.
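As a sketch of what that system looks like (writing $h_n(x)$ for the expected time to hit $\{0,n\}$ starting from $x$; the notation is not fixed by the problem), conditioning on the first step gives

$$h_n(x) = 1 + \tfrac{1}{2}\,h_n(x-1) + \tfrac{1}{2}\,h_n(x+1), \qquad 0 < x < n, \qquad h_n(0) = h_n(n) = 0.$$

The solution is quadratic in $x$, namely $h_n(x) = x(n-x)$. Since the chain's first step from $0$ lands in state $1$, this yields $E[T] \ge 1 + h_n(1) = n$ for every $n$; letting $n \to \infty$ then answers the question about $E[T]$.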
