[Math] Probability of a coin tossed using Markov Chains

markov-chains, markov-process, probability, stochastic-processes, transition-matrix

A coin is tossed repeatedly until two successive heads appear. I have to find the mean number of tosses required, using a Markov chain and its transition matrix.

Here's my attempt:

Let $X$ be the random variable that represents the number of tosses required. So we need to find $E(X)$, the mean number of tosses.

Alternatively, let $X_n$ be the current number of successive heads after $n$ tosses. The state space is $\{0, 1, 2\}$ and the transition probability matrix is $$
\begin{pmatrix}
1/2 & 1/2 & 0 \\
1/2 & 0 & 1/2 \\
0 & 0 & 1 \\
\end{pmatrix}
$$
and then I don't know how to continue; I mean, how do I relate the two approaches?
🙁

Best Answer

We have:

$E_0 = 1 + \frac{1}{2}E_0 + \frac{1}{2}E_1$

$E_1 = 1 + \frac{1}{2}E_0 + \frac{1}{2}E_2$

$E_2 = 0$

Substituting $E_2 = 0$ into the second equation gives $E_1 = 1 + \frac{1}{2}E_0$, and plugging that into the first equation yields $E_0 = 6$, which tells you the expected number of flips at the start of the game. We also have $E_1 = 4$ (the expected number of flips remaining once you have already flipped a head).
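
As a quick sanity check (not part of the original answer), here is a minimal Python sketch that solves this small linear system for $(E_0, E_1)$ with NumPy and reproduces $E_0 = 6$, $E_1 = 4$:

```python
import numpy as np

# First-step equations for the transient states, with E2 = 0 substituted in:
#   E0 - (1/2)E0 - (1/2)E1 = 1
#   E1 - (1/2)E0           = 1
A = np.array([[ 0.5, -0.5],
              [-0.5,  1.0]])
b = np.array([1.0, 1.0])

E0, E1 = np.linalg.solve(A, b)
print(E0, E1)  # 6.0 4.0
```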

If you want to use the Markov chain instead:

$N = (I - Q)^{-1} = \left(\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} - \begin{bmatrix} 1/2 & 1/2 \\ 1/2 & 0 \end{bmatrix}\right)^{-1} = \begin{bmatrix} 1/2 & -1/2 \\ -1/2 & 1 \end{bmatrix}^{-1} = \begin{bmatrix} 4 & 2 \\ 2 & 2 \end{bmatrix} $

Then the expected number of steps to absorption, starting from each transient state, is:

$t = N\mathbf{1} = \begin{bmatrix} 4 & 2 \\ 2 & 2 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 6 \\ 4 \end{bmatrix}$
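
For completeness, the same fundamental-matrix calculation can be checked numerically; below is a sketch in NumPy, assuming the state ordering $(0, 1)$ for the transient part $Q$ as above:

```python
import numpy as np

# Transition matrix restricted to the transient states 0 and 1
Q = np.array([[0.5, 0.5],
              [0.5, 0.0]])

# Fundamental matrix N = (I - Q)^{-1}
N = np.linalg.inv(np.eye(2) - Q)

# Expected number of steps to absorption from each transient state: t = N·1
t = N @ np.ones(2)
print(N)  # [[4. 2.]
          #  [2. 2.]]
print(t)  # [6. 4.]
```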