[Math] How to calculate the expected number of changes of state of a discrete-time Markov chain

markov-chains, probability

Assume we have a two-state Markov chain with the transition matrix:
$$
\left[
\begin{array}{cc}
p & 1-p\\
1-q & q
\end{array}
\right]
$$
and assume that the chain starts in the first state. What is the expected number of state transitions in $T$ periods? (I want to count changes of state in both directions: from state 1 to state 2 and from state 2 to state 1.)

Best Answer

1. Relabel the states as $0$ and $1$. Let $s=(2-p-q)^{-1}$; then $\pi_0=(1-q)s$ and $\pi_1=(1-p)s$ define a stationary distribution $\pi$. If the initial distribution is $\pi$, the distribution at every step is $\pi$, hence the probability that a jump occurs at any given step is $$ r=(1-p)\pi_0+(1-q)\pi_1=2(1-p)(1-q)s. $$ In particular, the mean number of jumps during $T$ periods is exactly $rT$.
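This can be checked numerically. A minimal sketch (the values $p=0.7$, $q=0.4$ are illustrative, not from the answer):

```python
# Check that the per-step jump probability under the stationary
# distribution pi is r = 2(1-p)(1-q)s, for illustrative p, q.
p, q = 0.7, 0.4
s = 1.0 / (2 - p - q)
pi0, pi1 = (1 - q) * s, (1 - p) * s   # stationary distribution

# Probability of a jump in one step when the state is drawn from pi.
r = (1 - p) * pi0 + (1 - q) * pi1
assert abs(r - 2 * (1 - p) * (1 - q) * s) < 1e-12

# Expected number of jumps in T periods when started from pi: exactly r*T.
T = 100
print(r * T)
```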

2. By a coupling argument, for any distribution $\nu$, the difference between the number of jumps of the Markov chain with initial distribution $\nu$ and the number of jumps of the Markov chain with initial distribution $\pi$ is exponentially integrable.

3. Hence, for any starting distribution, the mean number of jumps during $T$ periods is $rT+O(1)$.
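The $rT+O(1)$ behaviour can be seen exactly by propagating the state distribution from a deterministic start. A sketch (again with the illustrative values $p=0.7$, $q=0.4$; states numbered $0$ and $1$, start in state $0$):

```python
# Exact expected number of jumps in T steps from a deterministic start,
# computed by propagating the state distribution one step at a time,
# and compared with the stationary-start value r*T.
p, q = 0.7, 0.4
s = 1.0 / (2 - p - q)
r = 2 * (1 - p) * (1 - q) * s

dist = [1.0, 0.0]          # start deterministically in state 0
expected_jumps = 0.0
T = 1000
for _ in range(T):
    # probability of a jump at this step, given the current distribution
    expected_jumps += dist[0] * (1 - p) + dist[1] * (1 - q)
    # one step of the chain: new distribution = dist @ transition matrix
    dist = [dist[0] * p + dist[1] * (1 - q),
            dist[0] * (1 - p) + dist[1] * q]

# The difference from r*T stays bounded (O(1)) as T grows.
print(expected_jumps, r * T, expected_jumps - r * T)
```

The deviation from $rT$ converges geometrically (at rate $|p+q-1|$) to a constant, which is the $O(1)$ term of point 3.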

4. Note that $$ \frac1r=\frac12\left(\frac1{1-p}+\frac1{1-q}\right), $$ and that $\frac1{1-p}$ and $\frac1{1-q}$ are the mean lengths of the intervals during which the chain stays at $0$ and $1$ before a transition occurs to $1$ and $0$ respectively. Hence the formula. For a Markov chain with $n$ states and transition matrix $Q$ whose embedded jump chain has the uniform stationary distribution (as is automatic in the two-state case, where jumps alternate between the states), one would get $$ \frac1r=\frac1n\sum_{k=1}^n\frac1{1-Q_{kk}}; $$ in general the terms are weighted by the jump chain's stationary distribution rather than uniformly.
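The harmonic-mean identity in point 4 can be verified directly. A sketch with the same illustrative values $p=0.7$, $q=0.4$:

```python
# Check that 1/r equals the average of the mean sojourn lengths
# 1/(1-p) and 1/(1-q), i.e. the mean time between jumps.
p, q = 0.7, 0.4
s = 1.0 / (2 - p - q)
r = 2 * (1 - p) * (1 - q) * s

lhs = 1.0 / r
rhs = 0.5 * (1.0 / (1 - p) + 1.0 / (1 - q))
print(lhs, rhs)   # both equal the mean number of steps between jumps
```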