Stochastic Processes – Proof of Doubly Stochastic Matrix

markov-chains, proof-writing, self-learning, stochastic-processes

A transition matrix $P$ is said to be doubly stochastic if, in addition to each row, the sum
over each column equals one, that is $\sum_i P_{ij}=1\ \forall j$.
If such a chain is irreducible and aperiodic and has $M+1$
states $0,1,\dots,M$, show that the limiting probabilities are given by
$$\pi_j=\frac{1}{M+1},\quad j=0,1,\dots,M.$$
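As a quick sanity check of the definition, here is a minimal numerical sketch (the $3\times 3$ matrix below, i.e. $M=2$, is an arbitrary made-up example): a doubly stochastic transition matrix has every row *and* every column summing to one.

```python
import numpy as np

# A hypothetical 3x3 doubly stochastic matrix (M = 2):
# each row sums to 1 (it is a transition matrix) AND each column sums to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.3, 0.2, 0.5],
])

print(P.sum(axis=1))  # row sums    -> [1. 1. 1.]
print(P.sum(axis=0))  # column sums -> [1. 1. 1.]
```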

I have no idea how to prove it but

  • If a chain is irreducible then all states communicate, i.e. for every pair $i,j$ there exist $n,m\ge 1$ such that $$P_{ij}^{(n)}>0\quad\text{and}\quad P_{ji}^{(m)}>0$$

  • If $d(i)$ denotes the period of state $i$ and the chain is irreducible and aperiodic, then $d(i)=1\ \forall i$

If $P$ is an $(M+1)\times(M+1)$ matrix and $\pi$ is the stationary distribution, then
$$\pi_j=\sum_i\pi_iP_{ij},\quad j=0,1,\dots,M$$

but how can I get this expression?

Best Answer

Proof:

We first note that for a finite, irreducible, aperiodic chain, $\pi$ is the unique solution to $\pi_j=\sum \limits_{i=0}^{M} \pi_i P_{ij}$ and $\sum \limits_{i=0}^{M}\pi_i=1$.

Let's try $\pi_i=1$ for every $i$. From the doubly stochastic nature of the matrix (each column sums to one), we have $$\pi_j=\sum_{i=0}^M \pi_iP_{ij}=\sum_{i=0}^M P_{ij}=1.$$ Hence $\pi_i=1$ is a valid solution to the first set of equations, and to make it also satisfy the normalization constraint we divide by $M+1$.

Then by uniqueness as mentioned above, $\pi_j=\dfrac{1}{M+1}$. $$ \blacksquare$$
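The claim can also be checked numerically. A minimal sketch, assuming a small hand-made doubly stochastic matrix with $M=2$: the uniform vector is stationary, and since all entries here are positive the chain is irreducible and aperiodic, so the rows of $P^k$ converge to $\pi$.

```python
import numpy as np

# A hypothetical doubly stochastic matrix with M + 1 = 3 states.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.3, 0.2, 0.5],
])
n = P.shape[0]  # n = M + 1

# Candidate stationary distribution: pi_j = 1 / (M + 1).
pi = np.full(n, 1.0 / n)

# Stationarity: pi P = pi, since each component of pi P is
# (1/n) * (a column sum of P) = 1/n.
print(np.allclose(pi @ P, pi))  # True

# All entries positive => irreducible and aperiodic, so every row of
# P^k converges to pi as k grows (limiting probabilities).
print(np.allclose(np.linalg.matrix_power(P, 50), pi))  # True
```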


Note : To understand this proof, one must recall the definition of a stationary distribution.

A vector $\mathbf{\pi}$ is called a stationary distribution of a Markov process if its elements satisfy: $$ \mathbf{\pi} = \mathbf{\pi} \cdot \mathbf{P}, \quad \sum_{i \in S} \pi_{i} = 1, \quad \text{and } \pi_{i} \ge 0\ \forall\ i \in S. $$ Note that a stationary distribution may not exist, and when it exists it may not be unique.
