For a finite MC it holds that
aperiodic + irreducible $\Leftrightarrow$ ergodic $\Leftrightarrow$ regular
as you expected. For an infinite MC it holds that
aperiodic + irreducible + positive recurrent $\Leftrightarrow$ ergodic,
and being "regular" in the infinite setting would require a more precise definition.
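The role of aperiodicity can be seen on the smallest possible example. Here is a minimal numpy sketch (the matrices are my own illustrative choices): a two-state chain that deterministically swaps states is irreducible but periodic, so no power of its transition matrix is entrywise positive; adding a self-loop makes it aperiodic and hence regular.

```python
import numpy as np

# Irreducible but periodic (period 2): the two states swap deterministically.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Powers of P alternate between P and the identity, so every power
# has zero entries -- the chain is not regular.
assert np.array_equal(np.linalg.matrix_power(P, 2), np.eye(2))

# A self-loop at state 0 breaks the period; the chain becomes aperiodic.
Q = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# Q^2 = [[0.75, 0.25], [0.5, 0.5]] is entrywise positive: regular.
Q2 = np.linalg.matrix_power(Q, 2)
print((Q2 > 0).all())  # True
```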
................................ explanations following ................................
For every finite or infinite Markov chain (MC) it holds that
aperiodic + irreducible + positive recurrent $\Leftrightarrow$ ergodic.
See for example here for a proof. For every finite MC, irreducibility already implies positive recurrence, see here for a proof.
Further, for every finite MC we have that
aperiodic + irreducible $\Leftrightarrow$ regular.
Proof sketch: the definition of a finite irreducible MC gives that $\forall i, j \in \Omega : \exists k > 0 : P^k[i,j] > 0$.
However, there might be no single $k$ such that all entries of $P^k$ are simultaneously positive, due to periodicity. But if the chain is additionally aperiodic, it follows that
$\exists k > 0 : \forall i, j \in \Omega : P^k[i,j] > 0$,
which matches your definition of being regular.
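This definition of regularity is easy to check numerically. A small sketch (the helper name `regularity_exponent` and the scan cutoff are my own choices, not standard API): it returns the smallest $k$ with $P^k$ entrywise positive, or `None` within the scan limit.

```python
import numpy as np

def regularity_exponent(P, max_k=100):
    """Return the smallest k with P^k entrywise positive, else None.

    max_k is a practical cutoff for this sketch; a periodic chain
    (correctly) never produces an entrywise-positive power.
    """
    Pk = np.eye(len(P))
    for k in range(1, max_k + 1):
        Pk = Pk @ P
        if (Pk > 0).all():
            return k
    return None

# Deterministic 3-cycle: irreducible but has period 3, so not regular.
cycle = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0]])

# "Lazy" version: stay put with probability 1/2 -- aperiodic, hence regular.
lazy = 0.5 * np.eye(3) + 0.5 * cycle

print(regularity_exponent(cycle))  # None
print(regularity_exponent(lazy))   # 2
```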
Finally, I don't see a canonical way to generalize the property "regular" to infinite Markov chains, so I simply avoid the term "regular" for infinite chains here.
We know that if a (finite state space) Markov Chain is aperiodic, then there is some $n_0$ s.t. for all $n\ge n_0$ and all states $i$, $p_{ii}^n>0$.
Let $i\ne j$ be states of the Markov chain. Let $m_{ij}$ be the shortest time to go from $i$ to $j$, i.e., the minimum of all $m$ such that $p_{ij}^m>0$ (which is finite because the chain is irreducible), and let $M:=\max_{i,j}m_{ij}$.
Now, let $n = M + n_0$ and $i\ne j$. We can write $p_{ij}^{n}=p_{ij}^{(n-m_{ij})+m_{ij}}$. By Chapman-Kolmogorov, we know $p_{ij}^{(n-m_{ij})+m_{ij}}=\sum_{k}p_{ik}^{n-m_{ij}}p_{kj}^{m_{ij}}$. Hence,
$$
p_{ij}^{n}=p_{ij}^{(n-m_{ij})+m_{ij}}=\sum_{k}p_{ik}^{n-m_{ij}}p_{kj}^{m_{ij}}\ge p_{ii}^{n-m_{ij}}p_{ij}^{m_{ij}}>0
$$
as $n-m_{ij} = (M - m_{ij})+n_0\ge n_0$.
Hence, $P^{M + n_0}>0$ entrywise, i.e., the chain is regular.
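The bound $M + n_0$ from this argument can be verified numerically on a small chain. A sketch under my own assumptions: the helper `hitting_exponents` computes the $m_{ij}$ by scanning powers of $P$, and for this particular chain $n_0 = 1$ works because the diagonal of $P$ is positive (so $p_{ii}^n > 0$ for every $n \ge 1$).

```python
import numpy as np

def hitting_exponents(P, max_k=100):
    """m[i, j] = smallest m with P^m[i, j] > 0 (np.inf if not seen by max_k)."""
    n = len(P)
    m = np.full((n, n), np.inf)
    Pk = np.eye(n)
    for k in range(1, max_k + 1):
        Pk = Pk @ P
        newly_hit = (Pk > 0) & np.isinf(m)
        m[newly_hit] = k
    return m

# Lazy 3-cycle: irreducible and aperiodic (positive diagonal).
P = 0.5 * np.eye(3) + 0.5 * np.array([[0.0, 1.0, 0.0],
                                      [0.0, 0.0, 1.0],
                                      [1.0, 0.0, 0.0]])

m = hitting_exponents(P)
M = int(m[~np.eye(3, dtype=bool)].max())  # max of m_ij over i != j

n0 = 1  # valid here because every diagonal entry of P is positive

# The proof predicts that P^(M + n0) is entrywise positive.
print(M)                                               # 2
print((np.linalg.matrix_power(P, M + n0) > 0).all())   # True
```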
Best Answer
I prefer the first definition by far. I relate the question to ergodic theory, as seems appropriate, and assume that the chain has finitely many possible values, so as not to bother with positive recurrence.
Let us consider a finite state space $A$, and denote the set of all possible sequences of elements of $A$ by $X:=A^{\mathbb{N}}$. Let us define a transformation $\sigma$ on $X$ by $(\sigma x)_n = x_{n+1}$. For $x \in X$, we have $x_n = (\sigma^n x)_0$. In other words, by applying the transformation $\sigma$, I can read the successive values of a given sequence.
Now, let us take some probability measure $\mu$ on $A$ with full support (so as to see everything), and a stochastic matrix $P$ (the transition kernel). Using $\mu$ as the distribution of $X_0$ and the matrix $P$ to define transitions, we get a Markov chain $(X_n)_{n \geq 0} = x = ((\sigma^n x)_0)_{n \geq 0}$, which is a stochastic process with values in $A$. The distribution of $(X_n)_{n \geq 0}$ is a measure $\overline{\mu}$ on $A^{\mathbb{N}}$ which satisfies the usual conditions on cylinders, and whose first marginal is $\mu$.
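The construction can be made concrete by sampling a finite prefix of one point $x \in X$. A minimal sketch, assuming a hypothetical two-state chain with full-support $\mu$ (the specific numbers are illustrative): the shift $\sigma$ is just "drop the first coordinate", and reading the 0th coordinate of $\sigma^n x$ recovers $X_n$.

```python
import numpy as np

rng = np.random.default_rng(0)

A = [0, 1]                       # finite state space
mu = np.array([0.5, 0.5])        # full-support initial distribution
P = np.array([[0.9, 0.1],        # transition kernel (rows sum to 1)
              [0.2, 0.8]])

# Sample a finite prefix of a point x in X = A^N under the Markov measure:
# X_0 ~ mu, then X_{n+1} ~ P[X_n, :].
N = 20
x = [rng.choice(A, p=mu)]
for _ in range(N - 1):
    x.append(rng.choice(A, p=P[x[-1]]))

# The shift drops the first coordinate: (sigma x)_n = x_{n+1}.
sigma_x = x[1:]
assert all(sigma_x[n] == x[n + 1] for n in range(N - 1))
```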
The construction may look a bit confusing. However, if you forget about $\sigma$, it is what is done more or less informally when one defines Markov chains (that is the construction may be hidden, but it is there).
Hence, we can consider a Markov chain as a dynamical system $(X, \sigma)$ together with a probability measure $\overline{\mu}$. We can use the definitions of ergodic theory, and what we get in the end is that the system $(X, \sigma, \overline{\mu})$ is ergodic if and only if the chain is irreducible, and mixing if and only if the chain is irreducible and aperiodic.
So these are two very different conditions, and aperiodicity does not correspond to ergodicity. As a corollary, one can apply ergodic theorems to Markov chains with no need for aperiodicity.