Solved – Creating a probability transition matrix

markov-process, r

Suppose that you have a set of states $A,B,C,D$ observed at time $t_1$ and the same set of states $A,B,C,D$ observed at time $t_2$. How would you create a transition matrix in R representing the probabilities of moving between these states from $t_1$ to $t_2$?

In a previous question, a package called markovchain is mentioned, but in the example given it is not clear whether `sequence` represents states at one particular time or states at two different times. Would I have to concatenate the states at $t_1$ and $t_2$ in my example to get what is called `sequence` in the aforementioned question?

Best Answer

Modeling the system as a Markov chain, the maximum likelihood estimate for the $A \rightarrow B$ transition probability is simply the number of times you saw $B$ following $A$, divided by the number of times you saw any state following $A$. The transition matrix is a matrix $P$ where $P_{ij}$ contains the $i \rightarrow j$ transition probability. A simple way to calculate this is to construct the transition count matrix $N$, where $N_{ij}$ contains the number of times you observed an $i \rightarrow j$ transition. Then, calculate $P$ by dividing each row of $N$ by its sum.
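A minimal sketch of this calculation in base R, assuming the data are two parallel vectors `state_t1` and `state_t2` holding each unit's observed state at $t_1$ and $t_2$ (the vector names and example values here are illustrative, not from the original question):

```r
# Hypothetical example data: each element i gives one unit's state at t1 and t2
states   <- c("A", "B", "C", "D")
state_t1 <- factor(c("A", "A", "B", "C", "D", "A", "B", "D"), levels = states)
state_t2 <- factor(c("B", "A", "C", "C", "A", "B", "B", "D"), levels = states)

# Transition count matrix N: N[i, j] = number of observed i -> j transitions
N <- table(state_t1, state_t2)

# Transition probability matrix P: divide each row of N by its row sum.
# Rows for states never observed at t1 will contain NaN and need handling.
P <- prop.table(N, margin = 1)
P
```

Using factors with explicit `levels` keeps all four states in the matrix even if some never appear in the data; `prop.table(N, margin = 1)` performs the row-wise normalization described above.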
