That is a really convoluted way of describing a steady state in my opinion.
The set of possible states a system can be in is called $S$. For example, a state could be the number of kittens in your socks.
They seem to be describing a system that proceeds in discrete steps (a discrete-time Markov chain). So a step could be going from one day to the next.
$\pi_k(x)$ is the probability of being in state $x$ on step $k$. So there might be a 10% probability that you have 4 kittens in your socks on Tuesday: $\pi_{\text{Tuesday}}(4) = .1$.
$P(x,y)$ is the probability that you proceed from state $x$ to state $y$. So if you have 3 kittens in your socks on one day, there is a 14% chance that you'll have 5 the next day. $P(3,5) = .14$.
Every step has a probability associated with being in a certain state. If your socks can hold at most $5$ kittens, then $\pi(0) + \pi(1) + \pi(2) + \pi(3) + \pi(4) + \pi(5) = 1$. That's just basic probability, the sum of all possibilities is 100%.
Knowing the transition probabilities $P$ and the distribution over states at a certain step, $\pi_k$, you can calculate the distribution at the next step, $\pi_{k+1}$. Specifically, $\pi_{k+1}(y) = \sum_{x \in S} \pi_{k}(x)P(x,y)$. "The probability of having 2 kittens in your socks tomorrow is the probability of having 0 kittens today times the probability of transitioning from 0 to 2, plus the probability of having 1 times the probability of transitioning from 1 to 2, plus ...".
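As a quick sketch in plain Python (the function name `step` and the two-state matrix here are my own, just for illustration):

```python
def step(pi, P):
    """One step of the chain: pi_{k+1}(y) = sum_x pi_k(x) * P(x, y)."""
    n = len(pi)
    return [sum(pi[x] * P[x][y] for x in range(n)) for y in range(n)]

# Tiny two-state example with arbitrary transition probabilities.
P = [[0.5, 0.5],
     [0.2, 0.8]]
pi = [1.0, 0.0]        # start in state 0 for certain
pi_next = step(pi, P)  # -> [0.5, 0.5]
```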
When $\pi_{k+1}(x) = \pi_k(x)$ for all $x$, that $\pi$ is called a steady state. In that state, the transitions don't change the probability of each state from step to step.
Example:
You can have at most 2 kittens in your socks.
- If you have zero kittens in your socks, then the probability that you have 0 the next day is 10%, 1 is 10%, 2 is 80%
- If you have one kitten in your socks, then the probability that you have 0 the next day is 10%, 1 is 20%, 2 is 70%
- If you have two kittens in your socks, then the probability that you have 0 the next day is 30%, 1 is 30%, 2 is 40%
Then a steady state is:
$$\pi(0) = \frac{27}{128},\quad \pi(1) = \frac{15}{64},\quad \pi(2) = \frac{71}{128}$$
As you can check using the formula above.
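For instance, here is that check in plain Python (using `Fraction` so the arithmetic is exact rather than rounded):

```python
from fractions import Fraction as F

# Transition probabilities from the example: rows are the current state 0, 1, 2.
P = [[F(1, 10), F(1, 10), F(8, 10)],
     [F(1, 10), F(2, 10), F(7, 10)],
     [F(3, 10), F(3, 10), F(4, 10)]]

pi = [F(27, 128), F(15, 64), F(71, 128)]

# Applying pi_{k+1}(y) = sum_x pi_k(x) * P(x, y) gives back pi itself:
pi_next = [sum(pi[x] * P[x][y] for x in range(3)) for y in range(3)]
assert pi_next == pi
assert sum(pi) == 1
```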
Double-check the definition of the period. First off, a period always refers to a particular state $i$. Specifically, it's the GCD of all step counts $k$ at which a return to $i$ is possible, so finding the "shortest time of return" is not sufficient.
Next, a state $i$ is aperiodic if its period is 1. A Markov Chain is aperiodic if all states have period 1.
In your example, it's possible to start at 0 and return to 0 in either 2 or 3 steps, and $\gcd(2,3) = 1$, so 0 has period 1. Similarly, 1 and 2 also have period 1, so the Markov chain is aperiodic.
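One way to check this mechanically (a plain-Python sketch; the helper name `period` is mine, and the cutoff `max_k` is a heuristic, not a proof that all return times were seen):

```python
from math import gcd

def period(P, i, max_k=50):
    """GCD of all step counts k <= max_k at which a return i -> i
    has positive probability (max_k is a heuristic cutoff)."""
    n = len(P)
    reach = [P[i][y] > 0 for y in range(n)]  # reachable in exactly 1 step
    g = 0
    for k in range(1, max_k + 1):
        if reach[i]:
            g = gcd(g, k)
        # advance to "reachable in exactly k+1 steps"
        reach = [any(reach[x] and P[x][y] > 0 for x in range(n))
                 for y in range(n)]
    return g

# The kitten chain from the earlier example: every entry is positive,
# so a return is possible at every k and each state has period 1.
P = [[0.1, 0.1, 0.8],
     [0.1, 0.2, 0.7],
     [0.3, 0.3, 0.4]]
assert all(period(P, i) == 1 for i in range(3))

# A two-state chain that only alternates has period 2:
assert period([[0, 1], [1, 0]], 0) == 2
```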
Best Answer
I have found the error, and the question can be closed.
The state vector $x$ should be a row vector that left-multiplies the transition matrix $P$ (i.e. $xP$), rather than a column vector that the matrix right-multiplies ($Px$). Then, since every row of $P_{\infty}$ is exactly the same, the product $xP_{\infty}$ is the same regardless of the probability vector $x$.
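A numeric sketch of that fix in plain Python (reusing the kitten transition matrix from the earlier answer; $P^{51}$ stands in for $P_{\infty}$, which is an approximation, not the exact limit):

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.1, 0.1, 0.8],
     [0.1, 0.2, 0.7],
     [0.3, 0.3, 0.4]]

P_inf = P
for _ in range(50):  # P^51: effectively P_infinity for this chain
    P_inf = mat_mul(P_inf, P)

# Every row of P_inf is (numerically) identical and equals the steady
# state, so the row vector x in x @ P_inf no longer matters:
for row in P_inf:
    assert all(abs(a - b) < 1e-9 for a, b in zip(row, P_inf[0]))
assert abs(P_inf[0][0] - 27/128) < 1e-9
```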