Conditions under which groups of states of a Markov chain still represent a Markov chain

markov-chains, markov-process, probability, probability-theory, stochastic-processes

Consider the $3\times 3$ transition matrix of a Markov chain with 3 states $j=1,2,3$.

$$ W= \left(\begin{matrix}W_{11}&W_{12}&W_{13}\\W_{21}&W_{22}&W_{23}\\W_{31}&W_{32}&W_{33}\end{matrix}\right) $$

If I define two states $I,II$ as

  • $I$ is $1$
  • $II$ is obtained by joining $2$ and $3$

What are the conditions on the matrix elements $W_{ij}$ under which $I$ and $II$ are the states of a new Markov chain?


I think that the conditions are

  • $W_{21}=W_{31}$
  • $W_{22}=W_{33}$ and $W_{23}=W_{32}$

Otherwise the transition probabilities $W_{II,I}$ and $W_{II,II}$ would not be well-defined.


But I would like an explanation of how to derive these conditions from the definition of a Markov chain, i.e. that the transition probability must not depend on past states.

Best Answer

Definitely you need to have $W_{21}=W_{31}$, so as to make sure that the probability for the transition from state II to state I doesn't depend on whether state II was an instance of state $2$ or an instance of state $3$.

But as I see it, that's all you need: since every row of $W$ sums to $1$, $W_{21}=W_{31}$ already forces $W_{22}+W_{23}=1-W_{21}=1-W_{31}=W_{32}+W_{33}$, so the probability of staying in state II is likewise the same whether the chain currently sits in $2$ or $3$. The stronger conditions $W_{22}=W_{33}$ and $W_{23}=W_{32}$ are not needed.

Let $a=W_{11}$, and let $b=W_{21}=W_{31}$.

Then the Markov chain for the new states has transition matrix $$ \begin{pmatrix} a & 1-a \\ b & 1-b \end{pmatrix} $$
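A quick numerical sanity check, with a hypothetical matrix $W$ I chose so that $W_{21}=W_{31}$ holds but the stronger conditions $W_{22}=W_{33}$ and $W_{23}=W_{32}$ fail. The lumpability condition can be stated compactly as $WU = UP$, where $U$ is the 3×2 indicator matrix collecting states into $\{I, II\}$; this sketch verifies it (and, by induction, $W^nU = UP^n$ for higher powers):

```python
import numpy as np

# Hypothetical 3-state transition matrix: W21 == W31 == 0.4,
# but W22 != W33 and W23 != W32 (the stronger conditions fail).
W = np.array([
    [0.5, 0.3,  0.2 ],   # from state 1
    [0.4, 0.1,  0.5 ],   # from state 2
    [0.4, 0.35, 0.25],   # from state 3
])

a = W[0, 0]  # P(I -> I)
b = W[1, 0]  # P(II -> I), identical from state 2 or state 3

# Lumped transition matrix over the new states {I, II}
P = np.array([[a, 1 - a],
              [b, 1 - b]])

# Indicator ("collector") matrix: column 0 = class I, column 1 = class II
U = np.array([[1, 0],
              [0, 1],
              [0, 1]])

# Lumpability condition: W U == U P, i.e. each original state's total
# probability of landing in I or in II depends only on its class.
print(np.allclose(W @ U, U @ P))             # condition at one step
print(np.allclose(W @ W @ U, U @ P @ P))     # it propagates to two steps
```

Note that the two-step check works because $W^2U = W(UP) = (WU)P = UP^2$, so the lumped chain's $n$-step probabilities agree with the original chain's for all $n$.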