[Math] About the long run behaviour of markov chains

markov-chains, probability, statistics, stochastic-processes, transition-matrix

I'm stuck in the following exercise: 🙁

In a certain town, a sunny day is followed by another sunny day with probability $0.9$, whereas a rainy day is followed by another rainy day with probability $0.2$. Supposing that every day is either sunny or rainy, what fraction of days are sunny in the long run?

And the "partial" solution is:

If $p$ is the long-term probability (aka equilibrium point) that it is sunny, then the probability that it is sunny on a following day is also $p$, so: $\Box p + \Box (1-p) = p$

Likewise the probability that it is not sunny on the subsequent day is: $\Box p + \Box (1-p) = (1-p)$.

The problem is that I don't know how to fill in the boxes with the given information; once the boxes are filled, I can just solve the system of equations for $p$.

Can somebody help me please 🙂

Best Answer

In the steady state, the probability of being sunny is the same from one day to the next. So you can compute the probability that it is sunny today from the steady-state probabilities for yesterday, and demand that it equal the same steady-state probability.

So if $p$ is the steady state probability it is sunny, you have $$ p(0.9) + (1-p)(0.8) = p$$

The left-hand side gives you the contribution from yesterday being sunny (and it transitioning to another sunny day with probability $0.9$) and from yesterday being rainy (and it transitioning to a sunny day today with probability $0.8$).

Hopefully, solving this for $p$ gives you the same answer as the corresponding equation for today being rainy $$ p(0.1)+(1-p)(0.2) = 1-p $$
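Both equations reduce to $0.9p = 0.8$, so $p = 8/9$. As a sanity check, here is a small sketch (state ordering sunny-first and the iteration count are my own choices) that iterates the chain's distribution until it settles:

```python
# Transition probabilities: row = today's state, column = tomorrow's.
# State 0 = sunny, state 1 = rainy (assumed ordering).
P = [[0.9, 0.1],   # sunny -> sunny, sunny -> rainy
     [0.8, 0.2]]   # rainy -> sunny, rainy -> rainy

pi = [1.0, 0.0]    # start from a sunny day; any start converges
for _ in range(100):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

print(pi[0])  # approaches 8/9 = 0.888...
```

The second eigenvalue of this chain is $0.1$, so the distribution converges to the steady state very quickly regardless of the starting day.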
