Why are persistent states of a Markov chain on a finite state space non-null

markov-chains, probability, stochastic-processes

I would like to understand the following statement about Markov chains on a finite state space $S$:

"If S is finite, then one state ist persistent and all persistent states are non-null."

It is more or less clear to me that at least one persistent state must exist. So my question reduces to: why are all persistent states non-null?

Thank you for any help.

Best Answer

The finiteness of the state space lets us derive the finiteness of the expected recurrence time from very rough bounds. Recall that a persistent state is non-null precisely when its mean recurrence time is finite.

A persistent state $s$ can be reached from every state of its (closed) communicating class, and the chain started at $s$ never leaves that class. Since the class is finite, there is a number $n$ of steps within which $s$ can be reached from every state of the class, and a minimal probability $p > 0$ of hitting $s$ within $n$ steps, uniformly over the starting state. Now regard the Markov chain as an infinite sequence of attempts to reach $s$: chop time into blocks of $n$ steps. By the Markov property, each block hits $s$ with probability at least $p$, no matter where the previous block ended. Hence the probability of still not having returned to $s$ after $k$ blocks is at most $(1-p)^k$, so the return time is stochastically dominated by $n$ times a geometric random variable, which has finite expectation.
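To make the last step explicit, here is the tail-sum computation sketched above, with $n$ and $p$ as in the answer and writing $T_s$ (my notation) for the first return time to $s$. Missing $s$ in each of the first $k$ blocks gives

$$\Pr_s(T_s > kn) \le (1-p)^k,$$

and summing tail probabilities,

$$\mathbb{E}_s[T_s] = \sum_{m \ge 0} \Pr_s(T_s > m) \le n \sum_{k \ge 0} \Pr_s(T_s > kn) \le n \sum_{k \ge 0} (1-p)^k = \frac{n}{p} < \infty,$$

where the first inequality holds because $\Pr_s(T_s > m)$ is non-increasing in $m$, so each of the $n$ terms with $kn \le m < (k+1)n$ is at most $\Pr_s(T_s > kn)$. The mean recurrence time is therefore finite, i.e. $s$ is non-null.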
