Probability of seeing HTTH before THTH in coin flips

martingales, probability, probability-theory, solution-verification, stopping-times

I'm doing a past paper question and trying to use Doob's Optional Stopping to find the probability that, for i.i.d. $(X_n)$ uniform on $\{0,1\}$, we see the pattern $a = (1,0,0,1)$ (i.e. HTTH) before the pattern $b = (0,1,0,1)$ (i.e. THTH). I'd like to know whether the following is correct, and/or an efficient method, since applying Doob's theorem twice doesn't seem like the quickest way.

We consider stopping times $\tau_w = \inf\{n \in \mathbb N \mid (X_{n-3},\dots,X_n) = w\}$ for $w \in \{a,b\}$, and $\tau = \tau_a \wedge \tau_b$.

As usual, we set up gamblers entering a casino, with every bet double-or-nothing: gambler $A^i$ enters at time $i$ with $1$ and bets on the letters of $a$ in order (first letter, then second, and so on), while gambler $B^i$ enters at time $i$ with $1$ and bets on the consecutive letters of $b$.

Then let $M_n = \sum_{i=1}^n \left(A^i_n + B^i_n\right) - 2n$, where $A^i_n$ and $B^i_n$ denote the fortunes of gamblers $A^i$ and $B^i$ at time $n$, and let $N_n = \sum_{i=1}^n \left(A^i_n + 2 B^i_n\right) - 3n$, which counts each $B$-gambler's fortune and stake twice. These are martingales, being (infinite) sums of martingales starting at $0$ ($A^i_n - 1$, for instance, is such a martingale, and equals $0$ for $n < i$), since the game is fair.
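To spell out the fairness: while gambler $A^i$ is still betting at time $n$, his next bet is double-or-nothing on the next letter of $a$, and once he loses a bet his fortune is $0$ forever, so
$$
\mathbb E\big[A^i_{n+1} \mid \mathcal F_n\big] = \tfrac12 \cdot 2A^i_n + \tfrac12 \cdot 0 = A^i_n,
$$
and likewise for $B^i$.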

Then we have that $\tau$ has finite expectation: disjoint blocks of four flips are independent, and each block equals $a$ with probability $1/16$, so
$$
\mathbb P(\tau > 4n) \leq \mathbb P\bigg((X_{4m-3},\dots,X_{4m}) \not = a \ \forall m \leq n\bigg) \leq (15/16)^n
$$

and so, since both $M_n$ and $N_n$ have increments bounded by $32 \cdot 4 \cdot 2$, we can apply Doob's theorem twice:

At $\tau_a$, we have a gambler with 4 correct letters on $a$, one with 1 correct letter on $a$, and one with 2 correct letters on $b$.

At $\tau_b$, we have a gambler with 4 correct letters on $b$, one with 2 correct letters on $b$, and one with 1 correct letter on $a$.
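The surviving gamblers at each stopping time are exactly the suffixes of the just-observed pattern that are prefixes of the pattern being bet on. A minimal Python sketch (the helper name `overlap_payoff` is my own) that recovers the payoffs used below:

```python
def overlap_payoff(observed, pattern):
    """Total fortune of the gamblers betting on `pattern` at the moment
    `observed` has just occurred: one gambler survives for each suffix of
    `observed` that is a prefix of `pattern`, holding 2**len(suffix)."""
    n = len(observed)
    return sum(2 ** k
               for k in range(1, n + 1)
               if observed[-k:] == pattern[:k])

a, b = "HTTH", "THTH"
print(overlap_payoff(a, a), overlap_payoff(a, b))  # 18 4: 2^4 + 2^1 on a, 2^2 on b
print(overlap_payoff(b, b), overlap_payoff(b, a))  # 20 2: 2^4 + 2^2 on b, 2^1 on a
```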

So, applying Doob's to both $M_n^\tau$ and $N_n^\tau$, writing $p = \mathbb P(\tau = \tau_a)$ and $q = 1 - p$, and splitting the first expectation based on whether $\tau = \tau_a$ or $\tau = \tau_b$:
$$
\begin{align*}
0 &= \mathbb E[M_0]= \mathbb E[M_\tau]\\
&= \mathbb E\left[\sum_{i=1}^\tau \left(A^i_\tau + B^i_\tau\right)\right] - 2 \cdot \mathbb E[\tau]\\
&= p \cdot (2^4 + 2^1 + 2^2) + q \cdot (2^1+2^4+2^2) - 2\,\mathbb E[\tau]\\
&= 2 \cdot (11 - \mathbb E[\tau])
\end{align*}
$$

so that $\mathbb E[\tau] = 11$, and then doing the same again
$$
\begin{align*}
0 &= \mathbb E[N_0]= \mathbb E[N_\tau]\\
&= \mathbb E\left[\sum_{i=1}^\tau \left(A^i_\tau + 2 B^i_\tau\right)\right] - 3 \cdot \mathbb E[\tau]\\
&= p \cdot (2^4 + 2^1 + 2\cdot 2^2) + q \cdot (2^1+2\cdot 2^4+2\cdot 2^2) - 3\cdot 11\\
&= 26p + 42q - 33\\
&= 42 - 33 - 16p
\end{align*}
$$

so that $p = \dfrac{9}{16}$. The denominator of $16$ makes me think there may be a much quicker combinatorial/Markov chain way of getting to the same answer. Any thoughts are appreciated!
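For what it's worth, both values survive a quick Monte Carlo sanity check; a minimal sketch using only the standard library (the helper `first_pattern` is just for this illustration):

```python
import random

def first_pattern(rng, a="HTTH", b="THTH"):
    """Flip a fair coin until a or b first appears;
    return which pattern won and the number of flips taken."""
    s = ""
    while True:
        s += rng.choice("HT")
        if s.endswith(a):
            return "a", len(s)
        if s.endswith(b):
            return "b", len(s)

rng = random.Random(0)
trials = 200_000
results = [first_pattern(rng) for _ in range(trials)]
print(sum(w == "a" for w, _ in results) / trials)  # ~ 9/16 = 0.5625
print(sum(t for _, t in results) / trials)         # ~ E[tau] = 11
```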

Best Answer

Well, here is a way based on first-step analysis, which breaks down the possibilities resulting from the first step (the first transition) of the Markov chain and proceeds step by step.

Label each state by the current useful suffix of the tosses, and let each symbol denote the probability, from that state, of seeing $HTTH$ before $THTH$: the starting state before any toss is $s$, and $H=a$, $T=b$, $HT=c$, $HTT=d$, $TH=h$, $THT=k$, with boundary values $HTTH=1$, $THTH = 0$. We get

$\displaylines{s = (1/2)a+(1/2)b\\a = (1/2)c+(1/2)a\\b = (1/2)h+(1/2)b\\c=(1/2)d+(1/2)h\\d=(1/2)\cdot 1+(1/2)b\\h=(1/2)k +(1/2)a\\ k = (1/2)d +(1/2)\cdot 0}$

Solving, I get $s = \frac{9}{16}$, matching the martingale computation.
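If you want to check the algebra, the system is small enough to solve numerically; a minimal sketch (the matrix rows encode the equations above, with unknowns ordered $s,a,b,c,d,h,k$):

```python
import numpy as np

# Each row is one balance equation, moved to the form A @ x = rhs.
A = np.array([
    [1, -0.5, -0.5,  0,    0,    0,    0  ],  # s = (1/2)a + (1/2)b
    [0,  0.5,  0,   -0.5,  0,    0,    0  ],  # a = (1/2)c + (1/2)a
    [0,  0,    0.5,  0,    0,   -0.5,  0  ],  # b = (1/2)h + (1/2)b
    [0,  0,    0,    1,   -0.5, -0.5,  0  ],  # c = (1/2)d + (1/2)h
    [0,  0,   -0.5,  0,    1,    0,    0  ],  # d = (1/2)*1 + (1/2)b
    [0, -0.5,  0,    0,    0,    1,   -0.5],  # h = (1/2)k + (1/2)a
    [0,  0,    0,    0,   -0.5,  0,    1  ],  # k = (1/2)d + (1/2)*0
])
rhs = np.array([0, 0, 0, 0, 0.5, 0, 0])
s, *_ = np.linalg.solve(A, rhs)
print(s)  # 0.5625 = 9/16
```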
