[Math] If a state is transient, then all states are transient.

markov chains

I'm a bit confused by this theorem (I'm not sure of its name in English), but here are the details:

Let $X = (X_n)_{n\geq 0}$ be a Markov chain with transition matrix $\mathbb{P}$ and state space $S$.

Suppose $X$ is irreducible. Then, if one state in $S$ is transient, all states are transient, and every state is visited only a finite number of times.

I'm not quite sure how a Markov chain could satisfy this property. I can see how a subset of states might be transient, but I thought that a Markov chain needs at least one recurrent state, so perhaps I'm misunderstanding something.

If such a Markov chain exists, I'd like to see an example of one. I've tried searching but can't seem to find anything on this.

Any help would be greatly appreciated.

Best Answer

The assumptions that $X$ is irreducible and that there exists a transient state $z$ imply that $S$ is not finite. The point is the following: assume there is a recurrent state $x$. By irreducibility, there is an $m$ such that you reach $z$ in $m$ steps from $x$ with probability $p > 0$. So each time the chain returns to $x$, it independently flips a $p$-coin for whether it reaches $z$ within the next $m$ steps. Since $x$ is recurrent, the chain returns to $x$ infinitely often, so almost surely infinitely many of these flips succeed and $z$ is visited infinitely often, contradicting the transience of $z$. Hence an irreducible chain with a transient state has no recurrent state at all, which forces $S$ to be infinite.
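To spell out why the $p$-coins force a contradiction, here is a short sketch; the hitting time $T_z$ and the notation $\Pr_x$ for "started at $x$" are mine, not the answer's:

```latex
% Sketch of the coin-flip argument. T_z denotes the first hitting time of z,
% and \Pr_x the law of the chain started at x (my notation).
% By the strong Markov property the successive excursions from x are i.i.d.,
% and each reaches z within m steps with probability p > 0, so
\[
  \Pr_x(T_z = \infty)
    \;\le\; \Pr(\text{the first } n \text{ excursions from } x \text{ all miss } z)
    \;=\; (1-p)^n \quad \text{for every } n \ge 1 .
\]
% Letting n -> infinity gives \Pr_x(T_z < \infty) = 1, and by the second
% Borel-Cantelli lemma (independent events) infinitely many excursions hit z,
% contradicting the transience of z.
```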
As an example, consider the random walk on $\mathbb{Z}$ which goes to the right with probability $0.99$ and to the left with probability $0.01$. It should be clear that this chain drifts to $+\infty$, so every state is visited only finitely many times and is therefore transient.
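If you want to see this numerically, here is a minimal simulation sketch (the step count, seed, and helper name `biased_walk` are my own illustrative choices, not part of the answer):

```python
import random

def biased_walk(n_steps=10_000, p_right=0.99, seed=0):
    """Simulate a walk on Z that steps +1 with probability p_right, else -1."""
    rng = random.Random(seed)
    position = 0
    last_visit_to_origin = 0
    for step in range(1, n_steps + 1):
        position += 1 if rng.random() < p_right else -1
        if position == 0:
            last_visit_to_origin = step
    return position, last_visit_to_origin

final_pos, last_zero = biased_walk()
print(f"final position after 10000 steps: {final_pos}")
print(f"last visit to state 0: step {last_zero}")
# With mean drift 0.99 - 0.01 = 0.98 per step, the walk escapes to +infinity,
# so state 0 (like every state) is visited only finitely often.
```

On a typical run the walk ends far to the right and its last visit to $0$ happens within the first few steps, matching the claim that every state is visited only finitely often.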
