Confusion about definition of ergodicity in Markov Chains

definition, markov-chains, markov-process

On Wikipedia I found the following section on ergodicity:

> **Ergodicity**
>
> A state i is said to be ergodic if it is aperiodic and positive
> recurrent (*). In other words, a state i is ergodic if it is recurrent,
> has a period of 1, and has finite mean recurrence time. If all states
> in an irreducible Markov chain are ergodic, then the chain is said to
> be ergodic.[dubious – discuss]
>
> It can be shown that a finite state irreducible Markov chain is
> ergodic if it has an aperiodic state. More generally, a Markov chain
> is ergodic if there is a number N such that any state can be reached
> from any other state in any number of steps less or equal to a number
> N (**). In case of a fully connected transition matrix, where all
> transitions have a non-zero probability, this condition is fulfilled
> with N = 1.
>
> A Markov chain with more than one state and just one out-going
> transition per state is either not irreducible or not aperiodic, hence
> cannot be ergodic.

I added the asterisks myself. What's confusing to me is that (**) seems much less strict than (*) because it doesn't mention periodicity. For example the transition matrix
$$P=\pmatrix{0&1&0\\0&0&1\\1&0&0}$$
seems to fulfil (**) because every state can reach any other state within 3 steps. Is (**) correct? Am I reading this wrong?
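The situation in the question can be checked numerically. The sketch below (my own, using numpy, not part of the original question) verifies both halves of the apparent conflict: under the "reachable in some number of steps ≤ N" reading, the cyclic matrix satisfies (**) with N = 3, yet the chain is periodic, so no power of P is strictly positive:

```python
import numpy as np

# The cyclic transition matrix from the question: state 1 -> 2 -> 3 -> 1.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)

# Reading (**) as "reachable in SOME number of steps <= N": satisfied with
# N = 3, since I + P + P^2 + P^3 has no zero entry, i.e. every state can
# reach every other state within 3 steps.
reach = np.eye(3) + P + P @ P + np.linalg.matrix_power(P, 3)
print((reach > 0).all())   # True

# But the chain has period 3: a return to any state is possible only at
# multiples of 3 steps, so every power P^n still contains zero entries
# (no power is strictly positive, so the chain is not aperiodic).
print(all((np.linalg.matrix_power(P, n) == 0).any() for n in range(1, 10)))  # True
```

This is exactly the gap the question points at: the "≤ N" phrasing ignores periodicity, while definition (*) rules this chain out.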

Best Answer

Yes, the definition given there is confusing. (**) is more general than (*).

Actually there are two definitions there. This reflects the fact that there are several inequivalent definitions of ergodicity for Markov chains in the literature.

Some authors ask only for a single absorbing class, which is slightly more general than (**); others say that a chain is ergodic iff it is irreducible, positive recurrent, and aperiodic, which is (*).
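The distinction between the two definitions can be made concrete by computing the period of a state, i.e. the gcd of the return times. A small sketch (my own illustration, assuming numpy; the helper `period` and the matrix `P_aper` are hypothetical examples, not from the original answer):

```python
import numpy as np
from math import gcd
from functools import reduce

def period(P, i, max_n=50):
    """Period of state i: gcd of all n <= max_n with (P^n)[i, i] > 0."""
    returns = [n for n in range(1, max_n + 1)
               if np.linalg.matrix_power(P, n)[i, i] > 0]
    return reduce(gcd, returns) if returns else 0

# The cyclic chain from the question: irreducible, but period 3,
# hence not aperiodic and not ergodic under definition (*).
P_cyclic = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=float)
print(period(P_cyclic, 0))  # 3

# Adding a self-loop at one state makes the chain aperiodic; for a finite
# irreducible chain that is enough for ergodicity under (*).
P_aper = np.array([[0.5, 0.5, 0], [0, 0, 1], [1, 0, 0]])
print(period(P_aper, 0))  # 1
```

Under definition (*), only the second chain qualifies; under the "≤ N" reading of (**), both appear to, which is why the Wikipedia wording is flagged as dubious.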