[Math] How to read transition probability matrix for Markov chain

stochastic-processes

Suppose that whether or not it rains today depends on previous weather conditions only through the last two days. If the state is $RR$ (rained both yesterday and today), it will rain tomorrow with probability $0.7$; if $SR$ (rained today but not yesterday), with probability $0.5$; if $RS$ (rained yesterday but not today), with probability $0.4$; and if $SS$ (no rain either day), with probability $0.2$.
The state at any time is determined by the weather conditions during both that day and the previous day.
$$P= \begin{pmatrix}0.7&0&0.3&0\\0.5&0&0.5&0\\0&0.4&0&0.6\\0&0.2&0&0.8\\ \end{pmatrix}$$

I am having difficulty reading this matrix correctly. Any help?

Thank you

Best Answer

$$\begin{matrix}&RR&SR&RS&SS\\RR&0.7&0&0.3&0\\SR&0.5&0&0.5&0\\RS&0&0.4&0&0.6\\SS&0&0.2&0&0.8\\ \end{matrix}$$

You read the matrix from row to column: the row gives the current state (yesterday, today) and the column gives the next state (today, tomorrow).

If the current state is RR (rained yesterday and rained today), then the probability of moving to RR (rained today and will rain tomorrow) is 0.7, and the probability of moving to RS (rained today, dry tomorrow) is 0.3.

Note that every row sums to 1, since from each state the chain must move to some next state.
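For concreteness, here is a minimal Python sketch (not part of the original answer) that builds the labelled transition matrix, checks that each row sums to 1, and uses the two-step matrix $P^2$ to compute, as an example, the probability of rain two days from now given that the current state is RR.

```python
import numpy as np

# States ordered as in the matrix above: (yesterday, today)
states = ["RR", "SR", "RS", "SS"]

# Transition matrix: row = current state, column = next state
P = np.array([
    [0.7, 0.0, 0.3, 0.0],  # from RR
    [0.5, 0.0, 0.5, 0.0],  # from SR
    [0.0, 0.4, 0.0, 0.6],  # from RS
    [0.0, 0.2, 0.0, 0.8],  # from SS
])

# Each row is a probability distribution over next states, so it sums to 1
assert np.allclose(P.sum(axis=1), 1.0)

# Two-step transition probabilities are given by P @ P.
# After two steps the state encodes (tomorrow, day after tomorrow),
# so "rain two days from now" means ending in RR or SR.
P2 = P @ P
i = states.index("RR")
p_rain_in_two_days = P2[i, states.index("RR")] + P2[i, states.index("SR")]
print(p_rain_in_two_days)  # 0.49 + 0.12 = 0.61
```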
