Solved – Long-Run Proportion of Time in a State of a Markov Chain

markov-process, stationarity, stochastic-processes

While reviewing for a stochastic processes exam, I came across the following proof in Introduction to Stochastic Processes with R by Dobrow.

The proof is for the theorem that the long-run expected proportion of time an irreducible Markov chain spends in a state $j$, starting from a state $i$, is given by the $j$th component of the chain's invariant distribution.

Here is the proof:

Let $(X_n)_{n\in\mathbb{N}}$ be a Markov chain with transition matrix $P$ and limiting distribution $\mathbf{\pi}$. For state $j$, define the indicator function

\begin{equation}
\mathbb{1}_k = \begin{cases} 1, & \text{if } X_k = j, \\ 0, & \text{otherwise,} \end{cases} \qquad k \in \mathbb{N}.
\end{equation}

Then $\sum_{k=0}^{n-1}\mathbb{1}_k$ is the number of times the chain visits $j$ in the first $n$ steps (counting $X_0$ as the first step). From initial state $i$, the long-run expected proportion of time that the chain visits $j$ is

\begin{align}
\lim_{n\to\infty} \mathbb{E}\left [ \frac{1}{n}\sum_{k=0}^{n-1}\mathbb{1}_k |X_0 = i\right ] &= \lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1}\mathbb{E}[\mathbb{1}_k |X_0 = i] \\ &= \lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1} \mathbb{P}[X_k=j|X_0=i] \\ &= \lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1} P^{k}_{ij} \\ &= \lim_{n\to\infty}P^n_{ij} = \mathbf{\pi}_j
\end{align}

where the factor of $1/n$ disappears in the last equality due to Cesàro's lemma. My point of confusion is the last equality. To my mind,

$\sum_{k=0}^{n-1} P^{k}_{ij} = (I + P + P^2 + \cdots + P^{n-1})_{ij}$

where $I$ is the identity matrix. So I cannot understand how Dobrow arrives at

$\sum_{k=0}^{n-1} P^{k}_{ij} = P^n_{ij}$
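For reference, here is a quick numerical check (a sketch in NumPy; the $3 \times 3$ transition matrix is an arbitrary example, not one from Dobrow). It compares the Cesàro average $\frac{1}{n}\sum_{k=0}^{n-1} P^{k}_{ij}$ with the $n$-step probability $P^{n}_{ij}$ and with $\pi_j$, and for large $n$ all three agree to several decimal places, so the equality does seem to hold in the limit:

```python
import numpy as np

# Arbitrary 3-state transition matrix, used only for illustration (not from Dobrow)
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])

i, j = 0, 2   # start in state i, track visits to state j
n = 5000      # number of steps

# Cesaro average (1/n) * sum_{k=0}^{n-1} (P^k)_{ij}
Pk = np.eye(3)          # P^0 = I
running_sum = 0.0
for _ in range(n):
    running_sum += Pk[i, j]
    Pk = Pk @ P
cesaro_average = running_sum / n

# n-step transition probability (P^n)_{ij}
n_step = np.linalg.matrix_power(P, n)[i, j]

# Stationary distribution pi: normalized left eigenvector of P for eigenvalue 1
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

print(cesaro_average, n_step, pi[j])   # all three are approximately equal
```

But I still do not see how a sum of $n$ terms can turn into a single entry of $P^n$.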

Could someone help me to understand this?

Best Answer

$\sum_{k=0}^{n-1} P^{k}_{ij}$ is a sum of real numbers: for each $k$, $P^{k}_{ij}$ is a scalar (the $ij$th entry of the $k$-step transition matrix), not a matrix. Since $P^{k}_{ij} \to \pi_j$ as $k \to \infty$, Cesàro's lemma says the averages $\frac{1}{n}\sum_{k=0}^{n-1} P^{k}_{ij}$ converge to that same limit. So the last equality is a statement about limits, not a term-by-term identity between the sum and $P^n_{ij}$.
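To spell the step out (this is the standard statement of Cesàro's lemma, paraphrased here rather than quoted from Dobrow): if a sequence of real numbers converges, then its running averages converge to the same limit,

\begin{equation}
a_k \to a \ \text{as} \ k \to \infty \quad \Longrightarrow \quad \frac{1}{n}\sum_{k=0}^{n-1} a_k \to a \ \text{as} \ n \to \infty.
\end{equation}

Taking $a_k = P^{k}_{ij}$, which converges to $\pi_j$ because $\pi$ is the limiting distribution, gives

\begin{equation}
\lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1} P^{k}_{ij} = \lim_{n\to\infty} P^{n}_{ij} = \pi_j,
\end{equation}

which is exactly the last line of the displayed computation: an equality of limits, not of the finite sums themselves.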
