[Math] Help showing a Markov chain with a doubly stochastic matrix has uniform limiting distribution

markov-chains, probability

I have a lot of difficulty with proofs; could someone help me with this question, which I really cannot solve? I would also appreciate pointers to material for working through this kind of question, and references for learning about Markov chains. Thanks in advance.

"A stochastic matrix is called doubly stochastic if its columns sum to 1. Let
$X_0
, X_1, \dots$ be a Markov chain on $\{1,\dots, k\}$ with a doubly stochastic transition
matrix and initial distribution that is uniform on $\{1, \dots, k\}.$ Show that the distribution of $X_n$ is uniform on $\{1.\dots, k\},$ for all $n \ge 0."$
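A quick numerical sanity check of the claim, in case it helps intuition (this is not part of the original question; it assumes NumPy, and the $3\times 3$ matrix below is just an illustrative example of a doubly stochastic matrix):

```python
import numpy as np

# An example doubly stochastic matrix: every row AND every column sums to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.4, 0.4],
    [0.3, 0.3, 0.4],
])

k = P.shape[0]
mu = np.full(k, 1.0 / k)          # uniform initial distribution on {1, ..., k}

for n in range(10):
    mu = mu @ P                   # distribution of X_{n+1} as a row vector
    assert np.allclose(mu, 1.0 / k), f"not uniform at step {n + 1}"

print("still uniform after 10 steps:", mu)
```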

Best Answer

Let $\mu_n$ denote the distribution of $X_n$, written as a row vector, so that $\mu_{n+1}=\mu_n P$, where $P$ is the transition matrix. Since $\mu_0=[1/k,\ldots,1/k]$, the $i$-th entry of $\mu_1$ is $\frac{1}{k}(P_{1i}+P_{2i}+\ldots+P_{ki})=\frac{1}{k}$ by the doubly stochastic property (the entries of each column of $P$ sum to $1$). Hence $\mu_1=\mu_0$, and the general result follows by induction.
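To spell out the inductive step that the answer leaves to the reader: assuming $\mu_n = [1/k,\ldots,1/k]$, the $i$-th entry of $\mu_{n+1} = \mu_n P$ is
$$
\mu_{n+1}(i) \;=\; \sum_{j=1}^{k} \mu_n(j)\,P_{ji} \;=\; \frac{1}{k}\sum_{j=1}^{k} P_{ji} \;=\; \frac{1}{k}, \qquad i = 1,\dots,k,
$$
where the last equality uses that column $i$ of $P$ sums to $1$. Together with the base case $\mu_0 = [1/k,\ldots,1/k]$, this shows the distribution of $X_n$ is uniform for every $n \ge 0$.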