[Math] Markov Chains – Proof that a finite chain has at least one recurrent state

markov-chains, probability-theory

I am looking at the proof in Grimmett and Stirzaker's Probability and Random Processes of the statement:

If $S$ is finite, then at least one state is persistent

The proof given is to assume, for contradiction, that all states are transient, and then take the limit as $n \rightarrow \infty$ of $1 = \sum_j p_{ij}(n)$, giving $1=\lim_{n \rightarrow \infty}\sum_j p_{ij}(n)=0$,
the second equality following from an earlier corollary that if $j$ is transient then $p_{ij}(n) \rightarrow 0$ as $n \rightarrow \infty$ for all $i$.

The proof makes sense, but what restricts it to the finite case? Intuitively I see that with an infinite number of states, even when $n$ gets large there is always another state to jump to. But as far as I can tell, the corollary and the summation in this proof hold for infinite state spaces too.

Best Answer

A sequence of infinite sums, whose terms each tend to zero, need not converge (as a sequence) to $0$. Consider the sequence of sums $S_n=\sum_{k\geq 1}a_{n,k}$ where

$$a_{n,k}=\cases{\frac{1}{n}&$1\leq k\leq n$\\0&otherwise}$$

Then for each fixed $n\geq 1$, $S_n=1$. Nonetheless, $\lim_{n\to\infty}a_{n,k}=0$ for all $k\geq 1$. Essentially, this means that, in general, one cannot change the order of limits for infinite sums:

$$\lim_{n\to\infty}\sum_{k\geq 1}a_{n,k}\neq\sum_{k\geq1}\lim_{n\to\infty}a_{n,k}$$
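A quick numerical sketch of this counterexample (not from the original answer, just an illustration of the double array $a_{n,k}$ defined above):

```python
def a(n, k):
    """The double array a_{n,k} = 1/n for 1 <= k <= n, and 0 otherwise."""
    return 1.0 / n if 1 <= k <= n else 0.0

# For each fixed n, the row sum S_n = sum_k a_{n,k} equals 1
# (the sum has only finitely many nonzero terms, k = 1, ..., n).
for n in (1, 10, 100, 1000):
    S_n = sum(a(n, k) for k in range(1, n + 1))
    print(n, S_n)  # S_n stays 1 for every n

# Yet for any fixed column k, a_{n,k} -> 0 as n -> infinity:
print(a(10**6, 5))  # already tiny: 1/10^6
```

So $\lim_n S_n = 1$ while the termwise limits sum to $0$: the two orders of limit disagree. With a finite state space the sum over $j$ has a fixed finite number of terms, so the limit can be taken term by term, which is exactly what the finiteness hypothesis buys in Grimmett and Stirzaker's proof.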