I prefer the first definition by far. I relate the question to ergodic theory, as seems appropriate, and assume that the chain has finitely many possible values, so as not to bother with positive recurrence.
Let us consider a finite state space $A$, and denote the set of all sequences of elements of $A$ by $X:=A^{\mathbb{N}}$. Let us define a transformation $\sigma$ on $X$ by $(\sigma x)_n = x_{n+1}$. For $x \in X$, we have $x_n = (\sigma^n x)_0$. In other words, by applying the transformation $\sigma$, I can read the successive values of a given sequence.
Now, let us take some probability measure $\mu$ on $A$ with full support (so as to see everything), and a stochastic matrix $P$ (the transition kernel). Using $\mu$ as the distribution of $X_0$ and the matrix $P$ to define transitions, we get a Markov chain $(X_n)_{n \geq 0} = x = ((\sigma^n x)_0)_{n \geq 0}$, which is a stochastic process with values in $A$. The distribution of $(X_n)_{n \geq 0}$ is a measure $\overline{\mu}$ on $A^{\mathbb{N}}$ whose first marginal is $\mu$ and which satisfies the usual formula on cylinders:
$$\overline{\mu}([a_0, a_1, \ldots, a_n]) = \mu(a_0) P(a_0, a_1) \cdots P(a_{n-1}, a_n).$$
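To make the cylinder formula concrete, here is a small sketch with a hypothetical 3-state chain (the particular $\mu$ and $P$ below are made up for illustration). It computes $\overline{\mu}$ on cylinders and checks that summing over the last letter recovers the first marginal $\mu$.

```python
import numpy as np

# Hypothetical example: state space A = {0, 1, 2}.
mu = np.array([0.5, 0.3, 0.2])           # initial distribution, full support
P = np.array([[0.1, 0.6, 0.3],           # a stochastic matrix (rows sum to 1)
              [0.4, 0.2, 0.4],
              [0.5, 0.25, 0.25]])

def cylinder(word):
    """Measure of the cylinder [a_0, ..., a_n] under mu-bar:
    mu(a_0) * P(a_0, a_1) * ... * P(a_{n-1}, a_n)."""
    p = mu[word[0]]
    for a, b in zip(word, word[1:]):
        p *= P[a, b]
    return p

# The first marginal of mu-bar is mu: sum two-letter cylinders over the
# second letter, using that each row of P sums to 1.
marginal = [sum(cylinder((a, b)) for b in range(3)) for a in range(3)]
print(np.allclose(marginal, mu))   # True
```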
The construction may look a bit confusing. However, if you forget about $\sigma$, this is more or less what is done informally whenever one defines a Markov chain (the construction may be hidden, but it is there).
Hence, we can consider a Markov chain as a dynamical system $(X, \sigma)$ together with a probability measure $\overline{\mu}$. We can use the definitions of ergodic theory, and what we get in the end is that:
- the system $(X, \sigma, \overline{\mu})$ is measure-preserving if and only if $\mu$ is stationary for $P$;
- the system $(X, \sigma, \overline{\mu})$ is ergodic (in the sense of ergodic theory) if and only if the Markov chain is irreducible;
- the system $(X, \sigma, \overline{\mu})$ is mixing if and only if the Markov chain is irreducible and aperiodic.
So ergodicity and mixing are two very different conditions, and aperiodicity does not correspond to ergodicity. As a corollary, one can apply ergodic theorems to Markov chains with no need for aperiodicity.
Take a random walk on the integers where the jump distribution satisfies
$$\mathbb{P}(\xi=-1)=\mathbb{P}(\xi=0)=\mathbb{P}(\xi=1)=1/3.$$
Allowing the walk to sit still with positive probability "kills" the periodicity.
This chain is null recurrent for the same reason as the simple symmetric random walk. If the new chain were positive recurrent, it would have a unique invariant probability measure $\pi$. But the only non-negative solutions to $\pi=\pi P$ are the constant sequences $\pi(k)\equiv c$, and no constant sequence on $\mathbb{Z}$ can be normalized to a probability measure.
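As a quick sanity check on the invariance claim, one can verify numerically that a constant vector satisfies the balance equation $\pi(k) = \tfrac{1}{3}\bigl(\pi(k-1)+\pi(k)+\pi(k+1)\bigr)$ at every interior site of a finite window of $\mathbb{Z}$ (the window size below is arbitrary):

```python
import numpy as np

# Lazy walk on Z: P(k, k-1) = P(k, k) = P(k, k+1) = 1/3.
# Check that a constant vector satisfies the balance equation
# pi(k) = (pi(k-1) + pi(k) + pi(k+1)) / 3 at every interior site.
K = 50                              # half-width of the window (arbitrary)
pi = np.full(2 * K + 1, 1.0)        # the constant sequence pi(k) = 1
interior = range(1, 2 * K)
balanced = all(np.isclose(pi[k], (pi[k - 1] + pi[k] + pi[k + 1]) / 3)
               for k in interior)
print(balanced)   # True: constant sequences are invariant, but none is summable
```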
I think so... Here's my thought.
Let $M$ be an irreducible, symmetric and positive-definite $n\times n$ stochastic matrix, with spectrum $\sigma(M)=\{\lambda_1, \lambda_2,\ldots, \lambda_n\}$.
Since $M$ is symmetric, it is diagonalizable: there exists an invertible matrix $U$ such that
$$M= U \cdot J\cdot U^{-1},$$ where $J=\begin{bmatrix} 1 &&&\\ & \lambda_2 & \\ &&\ddots &\\ &&&\lambda_n\end{bmatrix}$ is the (diagonal) Jordan normal form of $M$.
By the Perron–Frobenius theorem for non-negative, irreducible matrices, the eigenvalue $\lambda_1=1$, which happens to be the spectral radius of $M$, has algebraic multiplicity $1$; and since $M$ is positive-definite, all the other eigenvalues are real and positive, so $$|\lambda_i|=\lambda_i<1, \quad i=2, \ldots, n.$$
Thus, $$\begin{array}[t]{l} M^k=U\cdot \begin{bmatrix} 1^k & && \\ & \lambda_2^k && \\ &&\ddots & \\ &&& \lambda_n^k\end{bmatrix}\cdot U^{-1}\\\\ \lim_{k\to\infty}M^k=U\cdot \begin{bmatrix} 1 & && \\ & 0 && \\ &&\ddots & \\ &&&0\end{bmatrix}\cdot U^{-1} \end{array}$$ must be a stochastic matrix, since $M^k$ is a stochastic matrix for every $k\in \mathbb N$.
Now it is easy to prove that $$ U \cdot \begin{bmatrix} 1 & && \\ & 0 && \\ &&\ddots & \\ &&&0\end{bmatrix}\cdot U^{-1}=\begin{bmatrix} \pi_1 & \pi_2 & \cdots & \pi_n\\ \pi_1 & \pi_2 & \cdots & \pi_n\\ \vdots & \vdots & \ddots & \vdots\\ \pi_1 & \pi_2 & \cdots & \pi_n\end{bmatrix}=\mathbf{\Pi}$$ plus $\displaystyle \sum_{i=1}^n \pi_i =1$.
Thus, $M$ must be aperiodic, since $\lim_{k\to\infty}M^k$ exists and is a stochastic matrix with identical rows.
Note: We can prove that $\mathbf{\Pi}$ has all its elements strictly positive, since $$\pi = \begin{bmatrix} \pi_1 & \cdots & \pi_n\end{bmatrix}$$ is a left eigenvector corresponding to the Perron–Frobenius eigenvalue $\lambda_1=1$, and such an eigenvector can be chosen strictly positive.
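The whole argument can be checked numerically on a concrete matrix. The $M$ below is a made-up symmetric, positive-definite, irreducible stochastic matrix (its eigenvalues are $1, 0.4, 0.2$); note that a symmetric stochastic matrix is doubly stochastic, so here $\pi$ is uniform:

```python
import numpy as np

# A hypothetical symmetric, positive-definite, irreducible stochastic matrix
# (eigenvalues 1, 0.4, 0.2, all positive).
M = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.5, 0.2],
              [0.2, 0.2, 0.6]])

limit = np.linalg.matrix_power(M, 200)
print(limit)
# Since M is symmetric it is doubly stochastic, so the Perron left
# eigenvector is uniform: every row of the limit is [1/3, 1/3, 1/3].
```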