Linear Algebra – Why Every Irreducible Matrix with Period 1 is Primitive

graph theory, linear algebra, markov chains

In a certain text on Perron-Frobenius theory, it is stated that every irreducible nonnegative matrix with period $1$ is primitive, and this proposition is said to be obvious. However, when I tried to prove it myself, I was unable to come up with a formal proof. Intuitively, of course, I am sure that the theorem holds (and the converse statement is in fact easy to prove), but a proof eludes me.

According to the text, the theorem should be obvious from the graph-theoretic interpretation of these notions. For a given nonnegative matrix, one constructs a digraph as follows: there is an edge from the $i$-th vertex to the $j$-th vertex if and only if the $(i,j)$ entry of the matrix is positive. The matrix is then irreducible if and only if its digraph is strongly connected. The period is defined as the greatest common divisor of the lengths of the cycles (more precisely, of the closed walks) of the graph. Finally, the matrix is said to be primitive if there is a positive integer $n$ such that for each pair of vertices of the graph there is a path of length $n$ connecting them.
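To make these definitions concrete, here is a small Python sketch of my own (the function names `digraph_period` and `is_primitive` are just illustrative): it computes the period as the gcd of closed-walk lengths at one vertex (for an irreducible matrix the period is the same at every vertex), and it tests primitivity via Wielandt's bound, by which a primitive $n \times n$ matrix has $A^{(n-1)^2+1}$ entrywise positive.

```python
import numpy as np
from math import gcd

def digraph_period(A):
    """gcd of the lengths of closed walks at vertex 0.
    For an irreducible matrix the period is the same at every vertex,
    and walk lengths up to n^2 already determine the gcd."""
    n = A.shape[0]
    B = (A > 0).astype(int)          # adjacency matrix of the digraph
    P = np.eye(n, dtype=int)
    g = 0
    for k in range(1, n * n + 1):
        P = (P @ B > 0).astype(int)  # boolean power: length-k reachability
        if P[0, 0]:
            g = gcd(g, k)
    return g

def is_primitive(A):
    """Wielandt's criterion: A is primitive iff A^((n-1)^2 + 1) is
    entrywise positive (integer powers are fine for small demos)."""
    n = A.shape[0]
    B = (A > 0).astype(int)
    return bool((np.linalg.matrix_power(B, (n - 1) ** 2 + 1) > 0).all())

# 3-cycle 0 -> 1 -> 2 -> 0 plus the chord 2 -> 1:
# closed walks of lengths 3 and 5 at vertex 0, so the period is 1
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 1, 0]])
print(digraph_period(A), is_primitive(A))  # -> 1 True
```

The example digraph has closed walks of coprime lengths 3 and 5 at vertex 0, so its period is 1, and indeed some power of its adjacency matrix is entrywise positive.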

The theorem to be proved can thus be restated as follows: for every strongly connected digraph in which the greatest common divisor of the lengths of the closed walks equals $1$, there is a positive integer $n$ such that for every pair of vertices of the digraph there is a path of length $n$ connecting them.

It seems to me that the theorem might be proved by means of number theory, but so far I have not been able to find a proof. To be more specific, I am looking for a proof that does not use the Perron-Frobenius theorem (the proposition is used in the text to prove the Perron-Frobenius theorem itself). Any ideas?

Thank you in advance.

Best Answer

Here is the argument given in Section 1.3 of Gregory F. Lawler's *Introduction to Stochastic Processes*. It treats stochastic matrices $P$, but I think the argument applies to general nonnegative matrices.

For each state $i$ define $J_i=\{n: P^n(i,i)>0\}$. This set is closed under addition (concatenating two closed walks at $i$ gives another one), so it is a semigroup. Since we have assumed that $P$ is aperiodic, $\gcd(J_i)=1$, and a standard number-theoretic fact then implies that $J_i$ contains all sufficiently large integers. That is, there is some integer $M(i)$ such that $P^n(i,i)>0$ for all $n\geq M(i)$.
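The number-theoretic fact invoked here is the one from the Frobenius coin problem (the "Chicken McNugget" theorem): an additive semigroup of positive integers with gcd $1$ has a finite complement in $\mathbb{N}$. A quick Python illustration of my own (the helper `representable` is hypothetical, not from Lawler's book):

```python
from math import gcd
from functools import reduce

def representable(gens, limit):
    """ok[n] is True iff n is a sum of elements of gens (with repetition),
    i.e. n lies in the numerical semigroup generated by gens."""
    ok = [False] * (limit + 1)
    ok[0] = True
    for n in range(1, limit + 1):
        ok[n] = any(n >= g and ok[n - g] for g in gens)
    return ok

gens = [6, 10, 15]                 # gcd is 1, though no pair is coprime
assert reduce(gcd, gens) == 1
ok = representable(gens, 60)
# every n >= 30 is representable; 29 is the largest gap
print(max(n for n in range(61) if not ok[n]))  # -> 29
```

In the proof, the generators are the closed-walk lengths at state $i$; the finitely many gaps are exactly why a threshold $M(i)$ exists.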

Since $P$ is irreducible, there exists some $m(i,j)$ such that $P^{m(i,j)}(i,j)>0$. Hence for $n\geq M(i)$, $$P^{n+m(i,j)}(i,j)\geq P^n(i,i)\,P^{m(i,j)}(i,j)>0.$$

Let $M$ be the maximum value of $M(i)+m(i,j)$ over all pairs $(i,j)$. Then for $n\geq M$, $P^n(i,j)>0$ for all states $i,j$.
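To make the bookkeeping concrete, here is a brute-force Python illustration of my own (the helper names and the horizon `K` are ad hoc choices) that computes $m(i,j)$, $M(i)$, and the resulting $M$ for the small aperiodic digraph used above:

```python
import numpy as np

# Adjacency matrix of an aperiodic, strongly connected toy digraph:
# 0 -> 1 -> 2 -> 0 with the extra chord 2 -> 1.
B = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 1, 0]])

def bool_powers(B, kmax):
    """pows[k-1][i][j] == 1 iff there is a walk of length k from i to j."""
    P, out = np.eye(len(B), dtype=int), []
    for _ in range(kmax):
        P = (P @ B > 0).astype(int)
        out.append(P)
    return out

K = 50                      # horizon; assumed large enough for this graph
pows = bool_powers(B, K)
n = len(B)
# m(i, j): least k with a walk of length k from i to j
m = [[next(k + 1 for k in range(K) if pows[k][i][j]) for j in range(n)]
     for i in range(n)]
# M(i): least threshold such that every length >= M(i) admits a closed
# walk at i (assumes all failing lengths occur within the horizon K)
M_i = [max(k + 1 for k in range(K) if not pows[k][i][i]) + 1 for i in range(n)]
M = max(M_i[i] + m[i][j] for i in range(n) for j in range(n))
# as in the argument: a walk of every length n >= M between every pair
assert all(pows[k].all() for k in range(M - 1, K))
print(M)  # -> 8 (not tight: B^5 is already entrywise positive)
```

Note that the $M$ produced by this argument is a valid but generally loose bound; here the fifth power is already positive.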

Essentially the same argument is found in Section 1.3 of *Markov Chains and Mixing Times* by Levin, Peres, and Wilmer. So it looks like probabilists have not found a better proof.
