The Convergence of Regular Markov Chains

markov-chains stochastic-processes

I took the Stochastic Processes course as an undergraduate. I am now helping myself recall what I learnt by reviewing the textbook, and I have a question about Markov chains and their convergence properties.

For a Markov chain, $x_{t+1} = P x_{t}$ (where $P$ is the transition matrix and $x$ is a column vector). If the chain is regular, that is, if $P$ is primitive, then $P_{\infty} = \lim_{t \to \infty} P^{t}$ is a matrix in which every row is exactly the same (since the probability distribution of the state as $t \to \infty$ does not depend on the initial state). This common row is the left eigenvector of $P$ with eigenvalue 1, denoted $v$.
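This convergence is easy to check numerically. Below is a minimal sketch with a hypothetical 3-state regular chain (the matrix `P` is my own example, not from any textbook): the powers $P^t$ converge to a matrix whose rows are all the stationary vector $v$, and $vP = v$.

```python
import numpy as np

# Hypothetical primitive (regular) row-stochastic transition matrix:
# every entry is positive, so the chain is regular.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# P^t converges to P_inf, whose rows are all (approximately) equal
# to the stationary row vector v, the left eigenvector with eigenvalue 1.
P_inf = np.linalg.matrix_power(P, 50)

v = P_inf[0]                      # any row is (approximately) v
print(np.allclose(P_inf[0], P_inf[1]))  # all rows agree
print(np.allclose(v @ P, v))            # v P = v, so v is stationary
```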

Then $x_{t} = P^{t} x_{0}$ becomes $x_{\infty} = P_{\infty} x_{0}$ as $t \to \infty$. In this form, $x_{\infty}(i) = v x_{0}$ for every $i$: each entry is the same scalar, the result depends on the initial distribution, and it contradicts the standard stationary solution $x_{\infty} = P x_{\infty}$.
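The apparent absurdity can be reproduced directly (same hypothetical 3-state matrix as a sketch; the names are mine): under the column-vector convention, $P_{\infty} x_{0}$ collapses every entry to the single scalar $v x_{0}$, which is not a probability distribution over states.

```python
import numpy as np

# Hypothetical row-stochastic P (each ROW sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
P_inf = np.linalg.matrix_power(P, 50)  # all rows ~ stationary v
v = P_inf[0]

x0 = np.array([1.0, 0.0, 0.0])  # start surely in state 0

# Column-vector convention from the question: x_inf = P_inf x0.
x_inf = P_inf @ x0

# Every entry of x_inf equals the same scalar v . x0 -- this is
# the paradoxical outcome the question describes.
print(x_inf)
print(v @ x0)
```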

I know it is absurd but I have not found out what is wrong.

Best Answer

I have found the error, so the question can be closed.

The state vector $x$ should be a row vector that left-multiplies the transition matrix $P$, i.e. $x_{t+1} = x_{t} P$, rather than a column vector that $P$ multiplies from the left. In this convention, since every row of $P_{\infty}$ is the same vector $v$ and the entries of $x_{0}$ sum to 1, we get $x_{\infty} = x_{0} P_{\infty} = v$ regardless of the initial distribution.