[Math] A finite state space Markov chain has no null-recurrent states

markov-chains, probability

I'm fairly new to Markov chains. At the moment, I'm trying to understand why a finite state space Markov chain cannot have any null-recurrent states.

Searching on math.SE, I found this answer, which argues as follows:

Consider a communicating class $C$ containing a null-recurrent
state. Then $p_{ij}(n)\to 0$ for every $i,j\in C$, but since this class $C$ is closed
this gives the contradiction $$1=\lim_n \sum_j p_{ij}(n)=0.$$

My question is: How can we show that $p_{ij}(n) \to 0$ for all $i, j$ in $C$?

I suppose it's sufficient to show that, if $k$ is the null-recurrent state, then $p_{kk}(n) \to 0$. Indeed, for any $i, j \in C$, there exist $m_1$ and $m_2$ such that $p_{ki}(m_1) > 0$ and $p_{jk}(m_2) > 0$, so if $p_{kk}(n) \to 0$, then $$ p_{ij}(n) \leq \frac{p_{kk}(m_1 + n + m_2) }{ p_{ki}(m_1) p_{jk}(m_2) }\to 0.$$
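This bound is just Chapman–Kolmogorov rearranged, and it's easy to sanity-check numerically. The sketch below uses a hypothetical 3-state chain and arbitrary choices of $k$, $i$, $j$ (none of this is from the question itself):

```python
import numpy as np

# Hypothetical 3-state irreducible chain (an arbitrary example).
P = np.array([
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
    [0.5, 0.5, 0.0],
])

def p_n(P, n):
    """n-step transition probabilities, i.e. the matrix power P^n."""
    return np.linalg.matrix_power(P, n)

k, i, j = 0, 1, 2
m1, m2 = 1, 1                      # here p_ki(1) > 0 and p_jk(1) > 0 already
for n in range(1, 20):
    lhs = p_n(P, n)[i, j]
    rhs = p_n(P, m1 + n + m2)[k, k] / (p_n(P, m1)[k, i] * p_n(P, m2)[j, k])
    # Chapman-Kolmogorov: p_kk(m1 + n + m2) >= p_ki(m1) p_ij(n) p_jk(m2),
    # which rearranges to the displayed bound p_ij(n) <= rhs.
    assert lhs <= rhs + 1e-12
```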

But then, how do I show that $p_{kk}(n) \to 0$? The fact that $k$ is null-recurrent tells us that $\sum_{n = 1}^\infty n f_{kk}(n) = \infty$, where $f_{kk}(n)$ is the probability that the first return to $k$ occurs at time $n$. I don't see the link between $\sum_{n = 1}^\infty n f_{kk}(n)$ being infinite and $p_{kk}(n)$ tending to zero.

Edit: I thought of using the fact that
$$ P_{kk}(s) = 1 + F_{kk}(s) P_{kk}(s), \ \ \ \ -1 < s \leq 1, $$
where
$$ P_{kk}(s) := \sum_{n=0}^\infty p_{kk}(n) s^n, \ \ \ \ \ \ \ F_{kk}(s) := \sum_{n=1}^\infty f_{kk}(n)s^n$$
See page 12 of these notes. Rearranging gives $F_{kk}(s) = 1 - 1/P_{kk}(s)$, and differentiating yields
$$ F_{kk}'(s) = \frac{P_{kk}'(s)}{(P_{kk}(s))^2}. $$
If $\sum_{n = 1}^\infty n f_{kk}(n) = \infty$, then $F_{kk}'(1)$ must be infinite. But then I'm not sure what to do with the right-hand side.

Best Answer

I think that claim is not trivial. In the book referenced in your link, the statement appears as a corollary of the ergodic theorem, which says that $p_{ij}(n) \to \frac{1}{\mathbb E_j T_j}$ (with $T_j$ the first return time to $j$) for any irreducible aperiodic Markov chain. You can find a proof of that result here, for instance.

If all you want to prove is your original claim (that every irreducible finite Markov chain is positive recurrent), I think there's an easier route than that limit theorem. Assume aperiodicity for simplicity; periodicity only makes the proof more tedious rather than making the result false. The sketch of the proof is:

  1. For an irreducible, aperiodic finite Markov chain, there exists some $m$ such that $p_{ij}(m) > 0$ for all pairs of states $(i, j)$.

  2. Moreover, because the state space is finite, you can strengthen this to $p_{ij}(m) \geq \epsilon$ for a universal $\epsilon > 0$: take $\epsilon$ to be the minimum of the finitely many (all positive) entries of the $m$-step transition matrix.

  3. Hence, by conditioning on the chain's position at every $m^{\text{th}}$ step, you can stochastically dominate the length of an excursion of $X_n$ from $x$ back to $x$ by $m$ times a geometric random variable with success probability $\epsilon$.

  4. Since the geometric random variable has a finite expectation, so do the excursion lengths.
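The four steps above can be checked numerically on a concrete example. The sketch below uses a hypothetical 3-state chain (my own choice, not from the book): it finds $m$ and $\epsilon$, forms the geometric bound $m/\epsilon$ on the mean return time, and compares it with the exact value $1/\pi_x$ from the stationary distribution:

```python
import numpy as np

# Hypothetical 3-state irreducible, aperiodic chain (an arbitrary example).
P = np.array([
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
    [0.5, 0.5, 0.0],
])

# Steps 1-2: find m with every entry of P^m positive, and the universal epsilon.
m = 1
Pm = P.copy()
while not (Pm > 0).all():
    m += 1
    Pm = np.linalg.matrix_power(P, m)
eps = Pm.min()

# Steps 3-4: inspecting the chain every m steps, the number of inspections until
# it sits at x is dominated by a Geometric(eps) variable, so E[T_x] <= m / eps.
geometric_bound = m / eps

# Compare with the exact mean return times E[T_x] = 1 / pi_x, where pi is the
# stationary distribution (left eigenvector of P for eigenvalue 1).
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi = pi / pi.sum()
mean_return_times = 1.0 / pi

assert (mean_return_times <= geometric_bound).all()
```

In particular the bound is crude but finite, which is all the positive-recurrence argument needs.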

EDIT: Expanding the answer to address step 1 above. I'll work with the following definition of aperiodicity: a state $x$ is aperiodic if $\gcd\{k : p_{xx}(k) > 0\} = 1$. Throughout, I'll also assume the chain is irreducible. Without irreducibility, versions of these ideas still hold but are more tedious to write down: you would work within a single closed communicating class.

Lemma: If state $x$ is aperiodic, then there exists $N$ such that $p_{xx}(n) > 0$ for all $n \geq N$.

Corollary: For the aperiodic state $x$ above, there exists $N'$ such that for all other states $y$ and all $n \geq N'$, $p_{xy}(n) > 0$.

Corollary to the corollary: If a chain has one aperiodic state, all states are aperiodic.
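A small numerical illustration of the lemma (the chain below is a hypothetical example of mine, not from the answer): every return to state $0$ takes $2$ or $3$ steps, so $\gcd\{k : p_{00}(k) > 0\} = \gcd(2,3) = 1$ and state $0$ is aperiodic even though $p_{00}(1) = 0$, and indeed $p_{00}(n) > 0$ for every $n \geq N$ with $N = 2$:

```python
import numpy as np
from math import gcd

# Hypothetical chain: from 0 go to 1; from 1 go back to 0 or on to 2; from 2
# go to 0. Returns to state 0 take 2 steps (0->1->0) or 3 steps (0->1->2->0).
P = np.array([
    [0.0, 1.0, 0.0],
    [0.5, 0.0, 0.5],
    [1.0, 0.0, 0.0],
])

positive_times = [n for n in range(1, 12)
                  if np.linalg.matrix_power(P, n)[0, 0] > 0]
assert gcd(positive_times[0], positive_times[1]) == 1   # state 0 is aperiodic
# Lemma in action: p_00(n) > 0 for every n >= 2 (every n >= 2 is 2a + 3b).
assert positive_times == list(range(2, 12))
```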