Ergodic Markov Chain vs Regular Markov Chain

I am trying to understand the difference between a regular Markov chain and an ergodic Markov chain.

According to https://math.dartmouth.edu/archive/m20x06/public_html/Lecture15.pdf:

A Markov chain is called an ergodic chain if it is possible to go from
every state to every state (not necessarily in one move).

and

A Markov chain is called a regular chain if some power of the
transition matrix has only positive elements.

It appears to me that they are equivalent:

  1. If a Markov chain is regular, then some power of the transition matrix has only positive elements, which implies that we can go from every state to every other state.
  2. If a Markov chain is ergodic, then we can go from every state to every other state. Let $k$ be the maximum number of steps needed to go from one state to another; then the transition matrix raised to the $k$th power should have all positive entries.

I have to be missing something.
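
To make the comparison concrete, here is a small sketch of how I would check regularity directly from the definition (NumPy assumed; `is_regular` is just a name I made up, and the cutoff relies on Wielandt's bound that power $(n-1)^2 + 1$ suffices for an $n$-state chain):

```python
import numpy as np

def is_regular(P):
    """Return True if some power of P has only positive entries.

    By Wielandt's bound, for an n-state chain it is enough to check
    powers up to (n - 1)**2 + 1: if none of them is strictly positive,
    no higher power ever will be.
    """
    n = P.shape[0]
    M = np.eye(n)
    for _ in range((n - 1) ** 2 + 1):
        M = M @ P
        if (M > 0).all():
            return True
    return False

# Example: a lazy walk on 3 states; every state has a self-loop,
# so the chain is regular (here P squared is already strictly positive).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
print(is_regular(P))  # True
```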

Best Answer

The third slide gives you a chain that is ergodic but not regular.

The flaw in your argument is that for $P^k$ to have all positive elements, every state must be reachable from every other state in exactly $k$ steps, not just in at most $k$ steps.
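
A standard example of such a chain (I have not checked whether it is the exact one on the slide) is the two-state chain that deterministically swaps states: it is ergodic, since each state reaches the other in one step, yet $P^k$ alternates between $P$ (odd $k$) and the identity (even $k$), so no single power is strictly positive. A minimal sketch, assuming NumPy:

```python
import numpy as np

# Two states that deterministically swap: ergodic but not regular.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

M = np.eye(2)
for k in range(1, 7):
    M = M @ P
    # Some entry of P**k is always zero, because state i is reachable
    # from itself only in an even number of steps and from the other
    # state only in an odd number of steps.
    print(k, (M > 0).all())   # always False
```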