Since my comment provided sufficient clarification, here it is as an answer:
When there's a stationary state, your system will evolve towards that state. In your case, the two left eigenvectors are $(−1,1)$ and $(3,10)$ with corresponding eigenvalues $−0.3$ and $1$. Every other state of the system can be decomposed into those two states. The first state exhibits oscillating behaviour (the eigenvalue is negative), but it dies out since $|-0.3| = 0.3 < 1$. The other state is stationary. So whatever your initial state, you'll evolve towards that stationary state.
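The transition matrix is not written out above, but a $2 \times 2$ stochastic matrix having exactly these left eigenpairs is $P = \begin{pmatrix} 0 & 1 \\ 0.3 & 0.7 \end{pmatrix}$. A short sketch, assuming that matrix, that checks the eigenpairs and the convergence numerically:

```python
import numpy as np

# Assumed transition matrix; its left eigenvectors are (-1, 1) and (3, 10)
# with eigenvalues -0.3 and 1, matching the decomposition described above.
P = np.array([[0.0, 1.0],
              [0.3, 0.7]])

# Verify the left eigenpairs: v @ P == lam * v.
assert np.allclose(np.array([-1, 1]) @ P, -0.3 * np.array([-1, 1]))
assert np.allclose(np.array([3, 10]) @ P, 1.0 * np.array([3, 10]))

# Any initial distribution converges to the normalized stationary vector
# (3, 10) / 13, because the (-0.3)-component decays like 0.3**k.
pi = np.array([3, 10]) / 13
for p0 in (np.array([1.0, 0.0]), np.array([0.5, 0.5])):
    pk = p0 @ np.linalg.matrix_power(P, 50)
    print(pk)  # ~ [0.2308, 0.7692] for both starting distributions
```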
For a finite MC it holds that
aperiodic + irreducible $\Leftrightarrow$ ergodic $\Leftrightarrow$ regular
as you expected. For an infinite MC it holds that
aperiodic + irreducible + positive recurrent $\Leftrightarrow$ ergodic,
and being "regular" in the infinite setting would require a more precise definition.
................................ explanations follow ................................
For every finite or infinite Markov chain (MC) it holds that
aperiodic + irreducible + positive recurrent $\Leftrightarrow$ ergodic.
See for example here for a proof. For every finite MC, irreducibility already implies positive recurrence, see here for a proof.
Further, for every finite MC we have that
aperiodic + irreducible $\Leftrightarrow$ regular.
Proof sketch: the definition of a finite irreducible MC gives that $\forall i, j \in \Omega : \exists k > 0 : P^k[i,j] > 0$.
However, there might be no $k$ such that all entries are simultaneously positive - due to periodicities. But if the chain is additionally aperiodic, it follows that
$\exists k > 0 : \forall i, j \in \Omega : P^k[i,j] > 0$,
which matches your definition of being regular.
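The proof sketch can be checked numerically by searching for the smallest $k$ with $P^k$ entrywise positive. A minimal sketch, using an assumed two-state chain (not given in the question) that is irreducible and aperiodic:

```python
import numpy as np

# Assumed irreducible, aperiodic chain for illustration.
P = np.array([[0.0, 1.0],
              [0.3, 0.7]])

def regularity_index(P, max_k=100):
    """Return the smallest k with P^k entrywise positive, or None."""
    Pk = np.eye(len(P))
    for k in range(1, max_k + 1):
        Pk = Pk @ P
        if (Pk > 0).all():
            return k
    return None

print(regularity_index(P))  # P itself has a zero entry, but P^2 > 0, so: 2
```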
Finally, I don't see a canonical way to generalize the property "regular" to infinite Markov chains. So, I just ignore the term "regular" for infinite chains here.
Double check the definition of the period. First off, a period always refers to a particular state $i$. Specifically, it's the GCD of all times $k \geq 1$ for which a return to $i$ is possible, i.e. $P^k[i,i] > 0$. So figuring out the "shortest time of return" is not sufficient.
Next, a state $i$ is aperiodic if its period is 1. A Markov Chain is aperiodic if all states have period 1.
In your example, it's possible to start at 0 and return to 0 in 2 or 3 steps; since $\gcd(2,3) = 1$, state 0 has period 1. Similarly, 1 and 2 also have period 1. So the Markov chain is aperiodic.
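The original chain isn't reproduced here, but a minimal three-state chain in which returns to 0 take 2 or 3 steps is the matrix below (my own illustration, not the one from the question). The GCD computation of the period can then be checked directly:

```python
import numpy as np
from functools import reduce
from math import gcd

# Illustrative chain on {0, 1, 2}: 0 -> 1 -> 0 gives a 2-step return,
# 0 -> 1 -> 2 -> 0 a 3-step return, so gcd(2, 3) = 1.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])

def period(P, i, max_k=20):
    """GCD of all k <= max_k with P^k[i, i] > 0.

    Truncating at max_k is an approximation, but for a chain this
    small a horizon of 20 already captures the GCD.
    """
    return_times = [k for k in range(1, max_k + 1)
                    if np.linalg.matrix_power(P, k)[i, i] > 0]
    return reduce(gcd, return_times)

print([period(P, i) for i in range(3)])  # [1, 1, 1] -> the chain is aperiodic
```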