[Math] Expected value of this Markov chain

Tags: markov-chains, probability

Question:

A bag contains 3 white chips and 3 red chips. You repeatedly draw a chip at random from the bag. If it's white, you set it aside; if it's red, you put it back in the bag. After removing all 3 white chips, you stop. What is the expected number of times you will draw from the bag?

I believe this gives me the markov chain:

$$P = \begin{pmatrix} 0.5 & 0.5 & 0 & 0 \\ 0 & 0.6 & 0.4 & 0 \\ 0 & 0 & 0.75 & 0.25 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$

But how am I supposed to find the expected number of draws until reaching state 4?
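For reference, here is a minimal sketch (in Python; indexing states by the number of white chips already removed is my own convention, not part of the problem statement) that rebuilds these transition probabilities from the chip counts:

```python
def transition_matrix(white=3, red=3):
    """Transition matrix with state k = number of white chips already removed."""
    n = white + 1                       # states 0..white; the last one is absorbing
    P = [[0.0] * n for _ in range(n)]
    for k in range(white):
        w_left = white - k              # white chips still in the bag
        p_white = w_left / (w_left + red)
        P[k][k] = 1 - p_white           # drew red: put it back, stay in state k
        P[k][k + 1] = p_white           # drew white: set it aside, advance
    P[white][white] = 1.0               # all white chips removed: stop
    return P

for row in transition_matrix():
    print(row)   # rows match the matrix above (up to floating-point rounding)
```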

Best Answer

You have to solve the following recurrence relations. Let $h(k)$ be the expected number of steps until you reach the (absorbing) state $4$ when you are in state $k$, for $k=1,2,3,4$. First, $$h(4)=0,$$ because when you are already in state $4$ you need zero steps to reach it. Then for $k=3$, $$h(3)=1+0.75\,h(3)+0.25\,h(4),$$ because from state $3$ you take one step $(+1)$ and land again in state $3$ with probability $0.75$ or in state $4$ with probability $0.25$; from the new state you start counting the expected number of remaining steps, hence the terms $0.75\,h(3)$ and $0.25\,h(4)$.

Similarly, $$h(2)=1+0.6\,h(2)+0.4\,h(3)$$ and $$h(1)=1+0.5\,h(1)+0.5\,h(2).$$ Solving this system gives $h(1)$, the expected number of steps to reach state $4$ from the initial state, which is state $1$ in your notation. Rearranging each equation yields $$\begin{cases} h(4)=0 \\ h(3)=4+h(4) \\ h(2)=2.5+h(3) \\ h(1)=2+h(2) \end{cases}$$ which gives $h(4)=0$, $h(3)=4$, $h(2)=6.5$, $h(1)=8.5$. (Please double-check the calculations, because I am prone to mistakes in them.) So the expected number of draws is $8.5$.
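A small numerical cross-check of that back-substitution (a sketch only, reusing the same state convention as in the question) could look like:

```python
def expected_draws(white=3, red=3):
    """Expected number of draws, by back-substitution of h(k) = 1/p_k + h(k+1)."""
    h = 0.0                              # h at the absorbing state is 0
    for k in reversed(range(white)):
        w_left = white - k
        p_white = w_left / (w_left + red)
        # h(k) = 1 + (1 - p_white) * h(k) + p_white * h(k+1)
        #      = 1 / p_white + h(k+1)
        h = 1 / p_white + h
    return h

print(expected_draws())                  # 8.5
```

Each $1/p$ term is the expected number of draws needed to remove the next white chip, which is exactly where the constants $2$, $2.5$ and $4$ in the system above come from.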
