A fair coin is tossed until a head comes up for the first time. What is the probability that this happens on an odd-numbered toss? How do I approach this problem?
Related Solutions
Person 2 seems to be confusing the conditional probability of the next toss with the joint probability of the whole sequence. Denote the result of the $k$th coin toss by the event $H_k$, which occurs iff the result is heads (otherwise, $\neg H_k$). Fairness of the coin is mathematically modeled by the following assumptions:
- $\mathbb{P}(H_k) = 0.5$ for all $k$.
- The events $H_1,H_2,\ldots$ are mutually independent.
As Avraham points out in his answer, these are assumptions. Whether these assumptions form a reasonable model of the real coin tossing process is a physical rather than mathematical question.
Person 2
Person 2 is computing the total probability that all 4 tosses are heads, \begin{equation} \mathbb{P}(H_1\cap H_2 \cap H_3 \cap H_4). \end{equation} Applying the independence assumption, this equals \begin{equation} \mathbb{P}(H_1)\mathbb{P}(H_2)\mathbb{P}(H_3)\mathbb{P}(H_4), \end{equation} and applying the first assumption, \begin{equation} 0.5^4 = 0.0625. \end{equation} So person 2 correctly computes the probability of obtaining a sequence of 4 heads. Unfortunately, that is not what they want to know.
Person 1
Person 1 understands that the question is about the probability of the next toss coming out heads, given the history of previous tosses \begin{equation} \mathbb{P}(H_4 \mid H_3\cap H_2\cap H_1) \end{equation} but, due to the independence assumption, this conditioning on the other events does not change the probability of $H_4$, thus the condition may simply be omitted: \begin{equation} =\mathbb{P}(H_4) = 0.5. \end{equation}
Thinking about the whole sequence
Person 2 was thinking about the sequence of all 4 tosses, while the question was worded in terms of the 4th toss. However, instead of considering the event of the next toss, person 1 could just as well have computed the conditional probability of the whole sequence, given what has already been observed: \begin{equation} \mathbb{P}(H_4\cap H_3\cap H_2\cap H_1 \mid H_3 \cap H_2 \cap H_1). \end{equation} Here, one can apply Bayes' theorem or the definition of conditional probability: \begin{equation} =\frac{\mathbb{P}((H_4\cap H_3\cap H_2\cap H_1) \cap (H_3 \cap H_2 \cap H_1 ))}{\mathbb{P}(H_3 \cap H_2 \cap H_1)} = \frac{\mathbb{P}(H_4 \cap H_3 \cap H_2 \cap H_1)}{\mathbb{P}(H_3 \cap H_2 \cap H_1)}. \end{equation} By independence, this equals \begin{equation} =\frac{\mathbb{P}(H_4)\mathbb{P}(H_3)\mathbb{P}(H_2)\mathbb{P}(H_1)}{\mathbb{P}(H_3)\mathbb{P}(H_2)\mathbb{P}(H_1)} = \mathbb{P}(H_4) = 0.5. \end{equation} This second computation is unnecessarily complicated, as the computation under 'Person 1' already gave the correct probability of the next toss. However, I think it could help convince person 2 that one does not need to take the whole sequence of 4 heads into account.
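As an illustrative sanity check (a Python sketch, not part of the original answers), both quantities can be estimated by simulation: the joint probability of four heads and the conditional probability of heads on the 4th toss given three heads.

```python
import random

random.seed(0)
reps = 200_000

joint = 0     # Count: all four tosses are heads.
cond_num = 0  # Count: first three heads AND fourth heads.
cond_den = 0  # Count: first three tosses are heads.

for _ in range(reps):
    tosses = [random.random() < 0.5 for _ in range(4)]
    if all(tosses):
        joint += 1
    if all(tosses[:3]):
        cond_den += 1
        if tosses[3]:
            cond_num += 1

print(joint / reps)         # ≈ 0.0625, person 2's joint probability
print(cond_num / cond_den)  # ≈ 0.5, person 1's conditional probability
```

The two estimates differ by a factor of 8, which is exactly $\mathbb{P}(H_3\cap H_2\cap H_1) = 1/8$, the denominator in the conditional-probability computation above.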
This can be answered using the geometric distribution as follows:
The probability that the first success ('heads', with success probability $p$ per toss) occurs after $k - 1$ failures is given by:
$$P(X=k)=(1-p)^{k-1}p,$$
where $k$ is the total number of tosses, including the first 'heads' that terminates the experiment.
And the expected value of $X$ is $E(X) = 1/p = 2$ for $p = 1/2$.
The derivation of the expected value is the standard geometric-series argument. The last steps, left implicit, are as follows:
$\frac{d}{dr} \frac{1}{1-r} = \frac{1}{(1-r)^2}$ to be plugged into the expression:
$E(X) = \frac{p}{1-p} \sum\limits_{x = 1}^{\infty}x\ r^x = \frac{p}{1-p}\ r\ (\frac{d}{dr} \frac{1}{1-r})= \frac{p}{1-p}\ r\ \frac{1}{(1-r)^2}$. With $r = 1 - p$, it simplifies to
$E(X) = \frac{1}{p}$, justifying its use above.
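As a quick numerical check (a Python sketch, purely illustrative), truncating the series $\sum_k k\,(1-p)^{k-1}p$ reproduces $E(X) = 1/p$:

```python
p = 0.5
# Truncated sum of k * P(X = k) = k * (1 - p)**(k - 1) * p.
# For p = 0.5 the tail beyond k = 200 is negligible.
expectation = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 201))
print(expectation)  # ≈ 2.0 = 1/p
```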
Alternatively, we could use the negative binomial distribution, interpreted as the number of failures before the first success. The probability mass function gives the probability of $n$ failures before attaining $r$ successes, given a certain probability $p$ of success in each Bernoulli trial:
$$p(n;r,p) ={n+r-1\choose r-1} p^r (1-p)^n$$
The expected number of trials, $n + r$, is given by the general formula:
$$E(n+r)=\frac{r}{p}$$
Given our known parameters, $r = 1$ and $p = 0.5$:
$$E(n + r) = \frac{r}{p} = \frac{1}{0.5} = 2$$
Hence we can expect to make two tosses to get the first head, with the expected number of tails being $E(n+r) - r = 1$.
We can run a Monte Carlo simulation to check this:
set.seed(1)
p <- 1/2
reps <- 10000                    # Total number of simulations.
tosses_to_HEAD <- numeric(reps)  # Pre-allocate the output vector.

for (i in 1:reps) {
  head <- 0     # Reset 'head' at the start of every replication.
  counter <- 0  # Same for the local variable 'counter'.
  while (head == 0) {
    head <- head + rbinom(1, 1, p)  # Toss a coin and add to 'head'.
    counter <- counter + 1          # Add 1 to 'counter'.
  }
  tosses_to_HEAD[i] <- counter  # Record the number of tosses needed.
}
mean(tosses_to_HEAD)
[1] 2.0097
Best Answer
Add up the probabilities of the coin coming up heads for the first time on toss 1, 3, 5...
$p_o = 1/2 + 1/2^3 + 1/2^5 + ...$
The $1/2$ term is pretty obvious: it's the probability of the first toss being heads.
The $1/2^3$ term is the probability of getting heads for the first time on the third toss, or the sequence TTH. That sequence has a probability of $1/2 * 1/2 * 1/2$.
The $1/2^5$ term is the probability of getting heads for the first time on the fifth toss, or the sequence TTTTH. That sequence has a probability of $1/2 * 1/2 * 1/2 * 1/2 * 1/2$.
Now we can rewrite the series above as
$p_o = 1/2 + 1/8 + 1/32 + ...$
This is a geometric series that sums to $2/3$. The easiest way to show this is with a visual example. Start with the series
$p = 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + 1/64 + ...$
This is a geometric series that sums to $1$.
If we sum just the even terms of that series, we can see that they sum to $1/3$.
$1/4 + 1/16 + 1/64 + 1/256 + ... = 1/3$
If you eliminate the even terms from the full series, you're left with just the odd terms, which must sum to $2/3$.
$p_o = 1/2 + 1/8 + 1/32 + ... = 2/3$
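Both the series and the answer can be checked numerically. The sketch below (a Python illustration, not part of the original answer) sums the odd terms and also simulates the experiment directly:

```python
import random

# Closed form: the odd-term series is geometric with first term 1/2 and
# ratio 1/4, so it sums to (1/2) / (1 - 1/4) = 2/3.
partial = sum((1 / 2) ** k for k in range(1, 40, 2))  # exponents 1, 3, ..., 39
print(partial)  # ≈ 0.6667

# Simulation: probability that the first head appears on an odd toss.
random.seed(0)
reps = 200_000
odd = 0
for _ in range(reps):
    toss = 1
    while random.random() >= 0.5:  # tails: keep tossing
        toss += 1
    if toss % 2 == 1:
        odd += 1
print(odd / reps)  # ≈ 2/3
```

Both the truncated series and the simulated frequency agree with the answer $p_o = 2/3$.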