For a finite MC it holds that
aperiodic + irreducible $\Leftrightarrow$ ergodic $\Leftrightarrow$ regular
as you expected. For an infinite MC it holds that
aperiodic + irreducible + positive recurrent $\Leftrightarrow$ ergodic,
and being "regular" in the infinite setting would require a more precise definition.
................................ explanations following ................................
For every finite or infinite Markov chain (MC) it holds that
aperiodic + irreducible + positive recurrent $\Leftrightarrow$ ergodic.
See for example here for a proof. For every finite MC, irreducibility already implies positive recurrence, see here for a proof.
Further, for every finite MC we have that
aperiodic + irreducible $\Leftrightarrow$ regular.
Proof sketch: the definition of a finite irreducible MC gives that $\forall i, j \in \Omega : \exists k > 0 : P^k[i,j] > 0$.
However, there might be no single $k$ such that all entries are simultaneously positive, due to periodicity. But if the chain is additionally aperiodic, it follows that
$\exists k > 0 : \forall i, j \in \Omega : P^k[i,j] > 0$,
which matches your definition of being regular.
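For a concrete finite chain, this regularity condition can be checked numerically: search for a power of $P$ with all entries strictly positive. A minimal sketch in plain Python (the helper names `mat_mul` and `is_regular`, and the cutoff `max_power`, are my own choices, not standard API):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_regular(P, max_power=50):
    """True if some power P^k (k <= max_power) has all entries > 0."""
    Q = P
    for _ in range(max_power):
        if all(x > 0 for row in Q for x in row):
            return True
        Q = mat_mul(Q, P)
    return False

# Period-2 flip chain: irreducible but not aperiodic, hence not regular.
flip = [[0.0, 1.0],
        [1.0, 0.0]]

# Lazy two-state chain: aperiodic and irreducible, hence regular.
lazy = [[0.5, 0.5],
        [0.5, 0.5]]
```

For `flip`, the powers alternate between the identity and `flip` itself, so no power is entrywise positive; for `lazy`, already $P^1$ is.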
Finally, I don't see a canonical way to generalize the property "regular" to infinite Markov chains, so I simply ignore the term "regular" for infinite chains here.
What you're confusing is the existence of an invariant measure with convergence to the invariant distribution.
Many Markov chains have invariant measures (in fact, on a finite state space you always have at least one; in general, an irreducible null recurrent chain has an invariant measure, and an irreducible positive recurrent chain has an invariant distribution) but do not converge to the invariant measure from any starting measure other than the invariant measure itself. In the example you give, both chains are irreducible but not aperiodic, so you don't have convergence to the invariant distribution (although an invariant distribution does exist).
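The two-state flip chain is a minimal illustration of this gap: it has the invariant distribution $(1/2, 1/2)$, yet started from any other distribution it oscillates forever instead of converging. A sketch in plain Python (the helper name `step` is my own):

```python
def step(dist, P):
    """Push a distribution one step forward: returns dist @ P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Period-2 flip chain: irreducible and positive recurrent, but not aperiodic.
P = [[0.0, 1.0],
     [1.0, 0.0]]

pi = [0.5, 0.5]      # invariant: step(pi, P) == pi
mu = [1.0, 0.0]      # any other start oscillates
one = step(mu, P)    # [0.0, 1.0]
two = step(one, P)   # [1.0, 0.0] -- back to mu, so no convergence to pi
```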
I'd recommend reading Ch. 1 of Norris' Markov Chains for more details.
Best Answer
Both of these are possible.
First, suppose we have a Markov chain with three states $A$, $B$, $C$ and a transition from any state to any other state (no loops). From state $A$ we can return back to $A$ in $2$ steps ($A \to B \to A$) or in $3$ steps ($A \to B \to C \to A$) and these have GCD $1$, so state $A$ is aperiodic; the same argument applies to other states.
Now take two copies of this Markov chain: states $A, B, C$ with transitions between any two of them, and three more states $A', B', C'$ with transitions between any two of them. This is still aperiodic for all the same reasons, but because we can't get from $\{A,B,C\}$ to $\{A', B', C'\}$, it's not irreducible.
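The six-state construction above can be checked directly: compute the period of $A$ as the GCD of its possible return times, and confirm that $A'$ is unreachable from $A$. A sketch in plain Python (the state ordering $A,B,C,A',B',C'$ and the helper names are my own choices):

```python
from math import gcd
from functools import reduce

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# States 0..2 = A, B, C and 3..5 = A', B', C'; within each triangle we
# jump to either of the other two states with probability 1/2.
n = 6
P = [[0.0] * n for _ in range(n)]
for block in ([0, 1, 2], [3, 4, 5]):
    for i in block:
        for j in block:
            if i != j:
                P[i][j] = 0.5

def return_times(P, s, max_k=12):
    """All k <= max_k with P^k[s][s] > 0, i.e. possible return times to s."""
    times, Q = [], P
    for k in range(1, max_k + 1):
        if Q[s][s] > 0:
            times.append(k)
        Q = mat_mul(Q, P)
    return times

def reachable(P, s, t, max_k=12):
    """True if P^k[s][t] > 0 for some k <= max_k."""
    Q = P
    for _ in range(max_k):
        if Q[s][t] > 0:
            return True
        Q = mat_mul(Q, P)
    return False

period_A = reduce(gcd, return_times(P, 0))  # gcd of {2, 3, ...} = 1
```

Here `period_A` comes out as $1$ (the chain is aperiodic), while `reachable(P, 0, 3)` is false because the block-diagonal structure of $P$ is preserved under powers, so the chain is not irreducible.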