I could write the code in a couple of minutes, but let's see if you can help me help you.
1. Generate a 1x100 (or 100x1) vector whose entries are a random mix of '1' and '2'. randi() is the right tool here; rand() would work too, but you would have to round its output yourself (note that the old randint() has been removed from MATLAB in favor of randi()).
2. Initialize your count matrix with P = zeros(2,2).
3. Go through a for-loop over consecutive pairs of elements. If the value goes from 1 to 1, increase P(1,1) by 1; if it goes from 1 to 2, increase P(1,2) by 1; and so on.
4. At the end, divide P by 100 (your data length) to get your probability matrix.
EDIT: Step 4 is incorrect. You should divide the first row by the number of transitions out of state 1 and the second row by the number of transitions out of state 2, i.e. normalize each row by its own row sum so that each row of P sums to 1.
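Since you asked for a starting point, here is a minimal sketch of the steps above (including the row-wise normalization from the EDIT). randi, zeros, and sum are standard MATLAB; the variable names N, x, and P are just examples, and the elementwise row division assumes R2016b or later for implicit expansion:

```matlab
N = 100;
x = randi(2, 1, N);              % step 1: random sequence of 1s and 2s

P = zeros(2, 2);                 % step 2: transition counts
for k = 1:N-1                    % step 3: count each consecutive pair
    P(x(k), x(k+1)) = P(x(k), x(k+1)) + 1;
end

% step 4 (corrected): divide each row by the number of transitions
% leaving that state, so every row of P sums to 1
P = P ./ sum(P, 2);
```

On older MATLAB versions, replace the last line with `P = bsxfun(@rdivide, P, sum(P, 2));`. Also note that with only 100 samples a row sum of zero is unlikely but possible, and would produce NaN rows.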
Start writing your MATLAB code; if you have specific questions about MATLAB functions or M-script syntax, people here would love to help you.