[Math] Recurrence of infinite Markov chain

markov chains

I have a Markov chain with state space $S=\{0,1,2,\dots\}$ and a sequence of positive numbers $p_1,p_2,\dots$ with $\sum p_i=1$. The transition probabilities are

$p(x,x-1)=1, \quad x>0$

$p(0,x)=p_x, \quad x>0$

Is this chain recurrent? What conditions on $p_x$ would make it positive recurrent?


I've figured out that this means I have a Markov chain where state $0$ can jump to any other state $x$ with probability $p_x$, but every other state moves deterministically to the previous state until the chain is back at state $0$, and then the process starts over. My feeling is that this chain is recurrent because there is zero probability that it will run off to infinity; it will always walk back down the chain (please help me with the reasoning here).
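The dynamics above are easy to simulate. Here is a quick sketch that samples one excursion from $0$ at a time; the geometric choice $p_x = p(1-p)^{x-1}$ is just an illustrative assumption, not part of the problem:

```python
import random

def simulate_return_time(p=0.5, rng=random):
    """Simulate one excursion from state 0 and return the number of steps
    until the chain is back at 0.  Here p_x = p*(1-p)**(x-1), a geometric
    distribution on {1, 2, ...}, chosen only for illustration."""
    # From 0, jump to state x with probability p_x (sample x geometrically)...
    x = 1
    while rng.random() > p:
        x += 1
    # ...then descend deterministically x -> x-1 -> ... -> 0, taking x steps.
    return 1 + x  # one step up, x steps back down

rng = random.Random(42)
times = [simulate_return_time(0.5, rng) for _ in range(100_000)]
# Every excursion terminates: the chain always returns to 0.
# With geometric p_x, E[x] = 1/p = 2, so E[return time] = 1 + 2 = 3.
print(sum(times) / len(times))  # close to 3
```

Every simulated excursion ends in finitely many steps, which matches the intuition that the chain cannot "run off": once it leaves $0$ it can only count down.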

I also feel like it would be positive recurrent if all $p_x$ were equal, but I'm not even sure how to say that, since $x$ ranges over infinitely many states, which would force all $p_x=0$.

Any help or direction would be greatly appreciated. I'm still trying to get a grasp on infinite Markov chains.

Best Answer

Well, as you point out, the chain is constructed so that $0$ is recurrent: from any state $x>0$ the chain moves deterministically down through $x-1, x-2, \dots$, so every excursion from $0$ returns to $0$ in finitely many steps with probability $1$. Then just notice that recurrence is a class property, and the chain is irreducible since all $p_x$ are positive.
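For the positive-recurrence question, one can compute the expected return time to $0$ by first-step analysis (a standard argument, sketched here for this chain):

```latex
% From 0 the chain jumps to x with probability p_x and then takes exactly
% x deterministic steps back down, so the return time T_0 satisfies
% P(T_0 = x + 1) = p_x.  Hence
\mathbb{E}[T_0] = \sum_{x \ge 1} (x+1)\, p_x = 1 + \sum_{x \ge 1} x\, p_x ,
% and the chain is positive recurrent if and only if the jump
% distribution has finite mean:
\sum_{x \ge 1} x\, p_x < \infty .
```

So, for example, $p_x$ decaying geometrically gives positive recurrence, while $p_x \sim c/x^2$ gives a recurrent chain with infinite mean return time (null recurrence).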
