I came across the following problem and solution on a slide in my Master's analytics course, but it looks different from what I have learned about Markov chains so far.
Problem:
Let $X_{i}=0$ if it rains on day $i$, and $X_{i}=1$ otherwise. We are given the state transition probability matrix:
$$
\mathbf{P}=
\begin{pmatrix}
P_{00} & P_{01} \\
P_{10} & P_{11}
\end{pmatrix}
=
\begin{pmatrix}
0.7 & 0.3 \\
0.4 & 0.6
\end{pmatrix}
$$
Suppose that it rains on Monday; predict the weather for the rest of the work week.
My Solution:
I simply compute the following, with $i=0,1,2,\dots$ corresponding to Monday, Tuesday, … respectively:
$$
\begin{pmatrix}
P(X_{i}=0) \\
P(X_{i}=1)
\end{pmatrix}^{T}
=
\begin{pmatrix}
1 \\
0
\end{pmatrix}^{T}
\begin{pmatrix}
0.7 & 0.3 \\
0.4 & 0.6
\end{pmatrix}^{i}
$$
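For example, for Tuesday ($i=1$) this gives

$$
\begin{pmatrix} 1 & 0 \end{pmatrix}
\begin{pmatrix}
0.7 & 0.3 \\
0.4 & 0.6
\end{pmatrix}
=
\begin{pmatrix} 0.7 & 0.3 \end{pmatrix},
$$

i.e. a 70% chance of rain on Tuesday.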
The Slide's Solution
The solution gives the following table:
I am not sure what $U_{i}$ is and why we compare it with $P_{\cdot 0}$ to predict the weather. Does anyone know why we solve it like this?
Best Answer
I think that what the slide is doing is sampling from the Markov chain: randomly picking one possible way that the weather could go for the week, following the transition probabilities. The $U_i$ are just random numbers uniformly chosen from the interval $[0,1]$. Under this interpretation, here is how we read the table:
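Under this interpretation, the table's procedure can be sketched as inverse-transform sampling. This is a minimal sketch, not the slide's actual code: the `sample_week` helper is hypothetical, the transition matrix is from the question, and the seed is arbitrary.

```python
import random

# Transition matrix from the question: state 0 = rain, state 1 = no rain.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def sample_week(start_state=0, days=4, seed=None):
    """Sample one possible weather path for Tue-Fri, given rain on Monday.

    Each day we draw U_i ~ Uniform[0, 1] and move to state 0 (rain)
    when U_i < P[state][0], i.e. we compare U_i against the current
    row's rain probability -- which appears to be what the slide's
    table is doing.
    """
    rng = random.Random(seed)
    state = start_state
    path = []
    for _ in range(days):
        u = rng.random()                    # the U_i in the table
        state = 0 if u < P[state][0] else 1
        path.append(state)
    return path

print(sample_week(seed=1))  # one possible week, e.g. [0, 1, 1, 0]
```

Each run with a different seed gives a different week; the table on the slide records one such run, not a prediction.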
This is different from predicting the weather. I think that the matrix powers you are taking are a better fit for that question: they will tell you that
and so on.
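For comparison, the matrix-power computation from the question can be carried out numerically. This is just a sketch; the `step` helper is a hand-rolled row-vector-times-matrix multiply for the 2-state chain.

```python
# Rain probabilities for Tue-Fri via repeated multiplication by P,
# starting from the distribution (1, 0): rain on Monday for certain.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(dist, P):
    """One step of the row-vector product dist * P for a 2-state chain."""
    return [dist[0] * P[0][0] + dist[1] * P[1][0],
            dist[0] * P[0][1] + dist[1] * P[1][1]]

dist = [1.0, 0.0]  # it rains on Monday
for day in ("Tue", "Wed", "Thu", "Fri"):
    dist = step(dist, P)
    print(f"P(rain on {day}) = {dist[0]:.4f}")
# Prints 0.7000, 0.6100, 0.5830, 0.5749 for Tue..Fri.
```

Unlike the sampled path, these numbers are the actual day-by-day probabilities of rain, which is what "predict the weather" most naturally asks for.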