Waiting Time – How to Calculate Waiting Time for Successive Occurrences When Rolling a Die

dice, expected value, probability

In consecutive throws of an ordinary die, which of the following two possibilities is more likely to happen first:

a) Two successive occurrences of 5 or

b) Three successive appearances of numbers divisible by 3?

I thought that the "time to event" random variable follows a geometric distribution. In the first case the probability of the event would be $p_1=1/36$, so the waiting time would be $E(X_1)=\frac{1-p_1}{p_1}=35$, plus 2 for the two successive occurrences of 5, giving 37.

Accordingly, the probability of the second event would be $p_2=\frac{2^3}{6^3}=\frac{1}{27}$, and its waiting time would be $E(X_2)=\frac{1-p_2}{p_2}=26$, plus 3, giving 29.

As I was not sure about the validity of the above considerations, I decided to run some Monte Carlo simulations (the code can be found here: http://ideone.com/TbLdDe). According to the results, the second event does indeed tend to happen first, but the expected waiting times are larger than the ones calculated above (about 42 and 39, respectively).
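For reference, here is a minimal Python sketch of this kind of simulation (the linked code may differ in details; the function names, trial count, and seed below are arbitrary choices):

```python
import random

def throws_until_run(is_success, run_length):
    """Roll a fair die until `run_length` consecutive successes occur;
    return the total number of throws used."""
    throws, streak = 0, 0
    while streak < run_length:
        throws += 1
        streak = streak + 1 if is_success(random.randint(1, 6)) else 0
    return throws

def mean_waiting_time(is_success, run_length, trials=100_000):
    return sum(throws_until_run(is_success, run_length) for _ in range(trials)) / trials

random.seed(1)
print(mean_waiting_time(lambda d: d == 5, 2))       # a) two 5s in a row: ~42
print(mean_waiting_time(lambda d: d % 3 == 0, 3))   # b) three multiples of 3 in a row: ~39
```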

What is the right way to calculate the waiting time?

Best Answer

Generalizing the problem, let $X$ be the random variable for the number of trials it takes to achieve $r$ successes in a row, where $p$ is the probability of success on each trial.

The probability of having no current run of successes at time $t$, for $t\ge 1$, is the probability of a failure at time $t$ preceded by a sequence up to time $t-1$ that has not yet produced $r$ successes in a row, which is: $$(1-p)(1-P(X\le t-1))$$

The probability of completing the run of $r$ successes at time $t+r$, following such a failure at time $t$, is
\begin{align} P(X=t+r) &= p^r(1-p)(1-P(X\le t-1)) \\ P(X=t+r+1) &= p^r(1-p)(1-P(X\le t)) \end{align}
Taking sums from $t=0$ to $\infty$, and noting that the left-hand side of the first line below is just $\sum_{t=r+1}^\infty P(X=t)$:
\begin{align} \sum_{t=0}^\infty P(X=t+r+1) &= \sum_{t=0}^\infty p^r(1-p)(1-P(X\le t))\\ \sum_{t=0}^\infty P(X=t) - \sum_{t=0}^r P(X=t) &= p^r(1-p)\sum_{t=0}^\infty (1-P(X\le t)) \end{align}

Finally using the facts that:

  • the sum of a probability distribution is 1: $\sum_{t=0}^\infty P(X=t) = 1$
  • the only way to finish within the first $r$ trials is to start with $r$ straight successes, so $P(X=t)=0$ for $t<r$ and $P(X=r)=p^r$, giving $\sum_{t=0}^r P(X=t)=p^r$
  • the expectation of a random variable taking non-negative values can be written in terms of its cumulative distribution function: $E[X] = \sum_{x=0}^\infty (1-F(x))$

we have

$$E(X) = \frac{1-p^r}{p^r(1-p)}$$

For $r=1$, as expected this simplifies to the same formula as for the geometric distribution: $E(X) = 1/p$.
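As a numerical sanity check (a sketch, not part of the derivation itself): tracking the current streak length gives the exact distribution of $X$, from which both the identity $P(X=t+r)=p^r(1-p)(1-P(X\le t-1))$ and the closed form above can be verified. The function name and the truncation horizon below are arbitrary choices; the tail sum for $E(X)$ is truncated, so that comparison is approximate.

```python
from fractions import Fraction
from itertools import accumulate

def run_length_pmf(p, r, t_max):
    """Exact P(X = t) for t = 0..t_max, tracking the probability of each
    current streak length 0..r-1; the mass that extends a streak of r-1
    by one more success exits as P(X = t)."""
    pmf = [Fraction(0)] * (t_max + 1)
    state = [Fraction(0)] * r
    state[0] = Fraction(1)                  # streak of 0 before any throw
    for t in range(1, t_max + 1):
        pmf[t] = p * state[r - 1]           # r-th consecutive success occurs now
        new = [Fraction(0)] * r
        new[0] = (1 - p) * sum(state)       # a failure resets the streak
        for s in range(r - 1):
            new[s + 1] = p * state[s]       # a success extends the streak
        state = new
    return pmf

for p, r in [(Fraction(1, 6), 2), (Fraction(1, 3), 3)]:
    pmf = run_length_pmf(p, r, 600)
    cdf = list(accumulate(pmf))             # cdf[t] = P(X <= t)
    # the identity used in the derivation, for t >= 1
    for t in range(1, 50):
        assert pmf[t + r] == p**r * (1 - p) * (1 - cdf[t - 1])
    # E[X] = sum_{t>=0} (1 - F(t)), truncated at t_max, vs. the closed form
    tail_mean = sum(1 - c for c in cdf)
    closed_form = (1 - p**r) / (p**r * (1 - p))
    print(float(tail_mean), float(closed_form))   # ~42 vs 42, then ~39 vs 39
```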

The cases

  • $p=1/6,\ r=2$ (two 5s in a row) $\implies E(X)=\frac{6^3}{5}\times\left(1-\frac{1}{6^2}\right)=42$, and
  • $p=1/3,\ r=3$ (three multiples of 3 in a row) $\implies E(X)=\frac{3^4}{2}\times\left(1-\frac{1}{3^3}\right)=39$

confirm your simulations.
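For completeness, the same two numbers drop straight out of the closed form with exact arithmetic (a short sketch; the helper name is arbitrary):

```python
from fractions import Fraction

def expected_wait(p, r):
    """E(X) = (1 - p^r) / (p^r (1 - p)) for r successes in a row."""
    return (1 - p**r) / (p**r * (1 - p))

print(expected_wait(Fraction(1, 6), 2))   # 42: two 5s in a row
print(expected_wait(Fraction(1, 3), 3))   # 39: three multiples of 3 in a row
```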
