MATLAB: Transition probability matrix for a Markov chain

transition probability

How do I fix a transition probability matrix for a Markov chain when the entries of each row do not sum to one?

Best Answer

If you just want to make each row sum to one, you can normalize each row by its row sum:
M                        % your transition matrix
M_new = M ./ sum(M, 2)   % divide each row by its sum (implicit expansion, R2016b+)
On releases older than R2016b, use bsxfun(@rdivide, M, sum(M, 2)) instead. Also note that a row of all zeros will produce NaN after the division. I am not sure this is the theoretically correct fix, though; whether it is depends on why the rows fail to sum to one in the first place (e.g. rounding error versus missing transitions).
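To illustrate the normalization above, here is a small sketch with a made-up 3-state matrix whose rows do not sum to one (the matrix values are just an example, not from the question):

```matlab
% Hypothetical 3-state matrix; row sums are 0.9, 0.6, and 0.9
M = [0.5 0.3 0.1;
     0.2 0.2 0.2;
     0.1 0.4 0.4];

% Divide each row by its row sum so every row sums to exactly 1.
% sum(M, 2) is a 3x1 column vector; implicit expansion broadcasts
% it across the columns of M.
M_new = M ./ sum(M, 2);

disp(sum(M_new, 2))   % each entry should now be 1
```

After normalization the relative proportions within each row are preserved; only the scale changes, so each row becomes a valid probability distribution.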