[Math] Coin-flipping experiment: the expected number of flips that land on heads


This question, which is about finding an expectation by conditioning, is from Sheldon M. Ross's Introduction to Probability Models.

Question:

A coin, having probability $p$ of landing heads, is continually
flipped until at least one head and one tail have been flipped. Let
$N$ denote the number of flips that land on heads at the end. Find $E[N]$.

I condition on the result of the first flip, which gives $$E[N] = E[N | \text{first one is a head} ]p + E[N | \text{first one is a tail}](1-p).$$ This seems fine to me, but then my work goes crazy(!), because it ends up proving $p=0$ (see the algebra after the list below):

  • $E[N | \text{first one is a head}]=1+E[N]$, because if I neglect the first flip, the expected number of heads among the remaining flips should be the same as before;
  • $E[N | \text{first one is a tail}]=E[N]$, because no head has been shown yet, so we are back to square one.
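Substituting these two expressions into my conditioning identity gives $$E[N] = \big(1+E[N]\big)p + E[N](1-p) = E[N] + p,$$ which can only hold if $p = 0$.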

I know that I am wrong, but why? Please kindly point out my mistakes and leave me some hints for this question. Thanks in advance.

Best Answer

The sequence of tosses will result in either $H \ldots HT$ or $T \ldots TH,$ where $H \ldots H$ and $T \ldots T$ are (respectively) a sequence of one or more heads and a sequence of one or more tails.

You can compute $E[N]$ directly, as a weighted sum over the infinitely many possible outcomes (spelled out below). Alternatively, you can use a method like yours: compute the expected number of heads in a sequence of zero or more heads followed by a tail, and use that result to evaluate $E[N | \text{first one is a head} ]p + E[N | \text{first one is a tail}](1-p).$
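The direct route sums over the two families of outcomes: $H^kT$, which contains $k \ge 1$ heads and occurs with probability $p^k(1-p)$, and $T^kH$, which contains exactly one head and occurs with probability $(1-p)^k p$: $$E[N] \;=\; \sum_{k \ge 1} k\,p^k(1-p) \;+\; \sum_{k \ge 1} 1\cdot (1-p)^k p \;=\; \frac{p}{1-p} + (1-p).$$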

More explicitly, if $E[N']$ is the expected number of heads in a sequence of flips that ends at the first flip that comes up tails, then $$\begin{eqnarray} E[N | \text{first one is a head} ] &=& 1 + E[N'], \\ E[N | \text{first one is a tail}] &=& 1. \end{eqnarray}$$

Edit: For the first equation, if my first flip is heads then the expected total number of heads is the one I just flipped plus the expected additional heads I will flip if I keep flipping until I get tails. For the second equation, consider the number of $H$s in the sequence $T \ldots TH$ where $T \ldots T$ is a sequence of zero or more $T$s.

Edit: The value $E[N']$ is actually easier to obtain than it might appear from the preceding description. If we say that a coin toss is a "success" if it comes up tails, then $N'$ is the number of failures before the first success, and for a sequence of i.i.d. Bernoulli trials (such as our coin tosses) this has a well-known distribution, the geometric distribution on $\{0, 1, 2, \ldots\}$, which has a well-known expected value. Since the probability of "success" is $q = 1 - p$ in each trial, we have $E[N'] = \frac{1-q}{q} = \frac{p}{1-p}.$
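Substituting this back into the conditioning identity gives $$E[N] = p\left(1 + \frac{p}{1-p}\right) + (1-p)\cdot 1 = \frac{p}{1-p} + (1-p),$$ which matches the direct weighted sum above.

If you want to sanity-check the closed form numerically, here is a minimal Monte Carlo sketch in Python (the function name and the chosen value of $p$ are purely illustrative):

```python
import random

def simulate_expected_heads(p, trials=200_000, seed=1):
    """Estimate E[N]: flip a p-coin until at least one head and one
    tail have appeared, then average the number of heads."""
    rng = random.Random(seed)
    total_heads = 0
    for _ in range(trials):
        heads = tails = 0
        while heads == 0 or tails == 0:
            if rng.random() < p:
                heads += 1
            else:
                tails += 1
        total_heads += heads
    return total_heads / trials

p = 0.3
print(simulate_expected_heads(p))  # Monte Carlo estimate of E[N]
print(p / (1 - p) + (1 - p))       # closed form p/(1-p) + (1-p)
```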
