[Math] Definition of Stationary Distributions of a Markov Chain

markov-chains stochastic-processes

I'm having a lot of trouble understanding the definition of the stationary distribution of a Markov Chain from Hoel, Port, Stone's Introduction to Stochastic Processes.

They define the stationary distribution of a Markov Chain as

Let Xn, n ≥ 0, be a Markov chain having state space S and transition function P.

If π(x), x ∈ S, are nonnegative numbers summing to one, and if

$$\sum_{x \in S} \pi(x)P(x, y) = \pi(y), \quad y \in S,$$

then π is called a stationary distribution.

What I don't understand is what are π(x) and π(y)? Are they the initial distributions of the chain?

Best Answer

That is a really convoluted way of describing a steady state in my opinion.

The set of possible states a system can be in is called $S$. For example, a state could be the number of kittens in your socks.

They are describing a system that proceeds in discrete steps (a discrete-time Markov chain). So a step could be going from one day to the next.

$\pi_k(x)$ is the probability of being in state $x$ on step $k$. So there might be a 10% probability that you have 4 kittens in your socks on Tuesday: $\pi_{\text{Tuesday}}(4) = .1$.

$P(x,y)$ is the probability that you proceed from state $x$ to state $y$. So if you have 3 kittens in your socks on one day, there is a 14% chance that you'll have 5 the next day. $P(3,5) = .14$.

Every step has a probability associated with being in a certain state. If your socks can hold at most $5$ kittens, then $\pi(0) + \pi(1) + \pi(2) + \pi(3) + \pi(4) + \pi(5) = 1$. That's just basic probability, the sum of all possibilities is 100%.

Knowing the transition probabilities $P$, and the probabilities of the states on a certain step, $\pi_k(\cdot)$, you can calculate the probabilities of the states on the next step, $\pi_{k+1}(\cdot)$. Specifically, $\pi_{k+1}(y) = \sum_{x \in S} \pi_{k}(x)P(x,y)$. "The probability of having 2 kittens in your socks tomorrow is the probability of having 0 kittens times the probability of transitioning from 0 to 2, plus the probability of having 1 times the probability of transitioning from 1 to 2, plus ...".
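As a quick sketch (in Python with NumPy, using the three-state kitten chain from the example further down), one update step is just a vector–matrix product:

```python
import numpy as np

# Transition matrix for the kitten example below: states are 0, 1, or 2
# kittens, and P[x, y] is the probability of going from x kittens to y.
P = np.array([[0.1, 0.1, 0.8],
              [0.1, 0.2, 0.7],
              [0.3, 0.3, 0.4]])

pi_k = np.array([1.0, 0.0, 0.0])  # suppose on step k you have 0 kittens for certain

# pi_{k+1}(y) = sum over x of pi_k(x) * P(x, y)
pi_next = pi_k @ P
print(pi_next)  # [0.1 0.1 0.8]
```

Starting from "certainly 0 kittens", the next step's distribution is simply the first row of $P$, exactly as the sum formula says.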

When $\forall x ~~\pi_{k+1}(x) = \pi_k(x)$, then that $\pi$ is called a steady state. In that state, the transitions don't change the probability of each state from step to step.

Example:

You can have at most 2 kittens in your socks.

  • If you have zero kittens in your socks, then the probability that you have 0 the next day is 10%, 1 is 10%, 2 is 80%
  • If you have one kitten in your socks, then the probability that you have 0 the next day is 10%, 1 is 20%, 2 is 70%
  • If you have two kittens in your socks, then the probability that you have 0 the next day is 30%, 1 is 30%, 2 is 40%

Then a steady state is:

$$\pi(0) = \frac{27}{128},\quad \pi(1) = \frac{15}{64},\quad \pi(2) = \frac{71}{128}$$

As you can check using the given formula.
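That check can be sketched in Python, using exact fractions so the arithmetic comes out clean: applying one step of the update formula to this $\pi$ gives back the same $\pi$.

```python
from fractions import Fraction as F

# Transition matrix from the example: states are 0, 1, or 2 kittens.
P = [[F(1, 10), F(1, 10), F(8, 10)],
     [F(1, 10), F(2, 10), F(7, 10)],
     [F(3, 10), F(3, 10), F(4, 10)]]

# Claimed steady state: pi(0) = 27/128, pi(1) = 15/64, pi(2) = 71/128.
pi = [F(27, 128), F(15, 64), F(71, 128)]

# Apply one step: pi_next(y) = sum over x of pi(x) * P(x, y).
pi_next = [sum(pi[x] * P[x][y] for x in range(3)) for y in range(3)]

print(pi_next == pi)  # True: one more step leaves the distribution unchanged
print(sum(pi) == 1)   # True: it is a valid probability distribution
```

For this chain (all transition probabilities positive), iterating the update from any starting distribution would also converge to this same $\pi$.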
