Probability of choosing an unfair coin

bayes-theorem, bayesian, probability

Problem statement: There are 1000 coins. 999 coins have heads and tails. 1 coin is unfair and both sides are heads. You choose a coin at random and toss it 10 times. You get 10 heads. What is the probability that the coin you chose is the unfair coin?

Okay, so first I'd like to say that yes, this "seems" like a Bayes' rule problem, and the solution does use Bayes' rule.

Solution using Bayes' rule:

Let $E_1$ denote the event of choosing the unfair coin. Clearly, $P(E_1) = \frac{1}{1000}$. We are given that we have 10 heads.
\begin{align}
P(E_1 \mid 10H) &= \frac{P(10H \mid E_1)\, P(E_1)}{P(10H)} \\
P(10H \mid E_1) &= 1 \\
P(E_1) &= \frac{1}{1000} \\
P(10H) &= P(10H \mid E_1)P(E_1) + P(10H \mid E_1^c)P(E_1^c) = 1 \cdot \frac{1}{1000} + \frac{1}{2^{10}} \cdot \frac{999}{1000} \\
\therefore P(E_1 \mid 10H) &= \frac{\frac{1}{1000}}{\frac{1}{1000} + \frac{1}{1024} \cdot \frac{999}{1000}} = \frac{1024}{2023} \approx 0.506
\end{align}
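Working the arithmetic through exactly, as a quick sketch using Python's standard `fractions` module (variable names are my own, not from the problem):

```python
from fractions import Fraction

# Quantities from the problem statement
prior_unfair = Fraction(1, 1000)        # P(E1): one two-headed coin out of 1000
likelihood_unfair = Fraction(1, 1)      # P(10H | E1): two-headed coin always shows heads
likelihood_fair = Fraction(1, 2) ** 10  # P(10H | fair coin) = 1/1024

# Law of total probability, then Bayes' rule
evidence = likelihood_unfair * prior_unfair + likelihood_fair * (1 - prior_unfair)
posterior = likelihood_unfair * prior_unfair / evidence

print(posterior)         # → 1024/2023
print(float(posterior))  # ≈ 0.506
```

So the posterior is exactly $\frac{1024}{2023}$, just over one half.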

But I don't know why, in this particular situation, knowing that we got 10 heads in a row is relevant information. The probability that we randomly select an unfair coin is $\frac{1}{1000}$, and I feel like this should be independent of knowing whether or not this random coin resulted in 10 heads in a row.

To me, getting 10 heads in a row is an after-effect of already choosing a coin at random. So it doesn't really make sense to me that we are using this information to compute the probability that we selected the random coin. Could someone explain? Maybe I just have a weak understanding of Bayesian inference?

Best Answer

That's essentially how Bayes' Theorem works: we calculate the probability of an event that happened in the past based on evidence we observed afterward. You can't say that the probability of having chosen the unfair coin is still $\frac{1}{1000}$, because a fair coin would be very unlikely to show $10$ heads in a row, so that piece of information really changes the probability.
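To see how the evidence overwhelms the $\frac{1}{1000}$ prior, here is a small sketch (the helper `posterior_unfair` is my own, not from the question) computing the posterior for different run lengths of heads:

```python
from fractions import Fraction

def posterior_unfair(n_heads, prior=Fraction(1, 1000)):
    """P(unfair coin | n_heads heads in a row), two-headed vs. fair coin."""
    # Two-headed coin yields heads with probability 1; fair coin with (1/2)^n
    evidence = prior + Fraction(1, 2) ** n_heads * (1 - prior)
    return prior / evidence

for n in (0, 5, 10, 15):
    print(n, float(posterior_unfair(n)))
```

With no tosses the posterior equals the prior, $0.001$; after 10 heads it has climbed to about $0.506$, and after 15 heads to about $0.97$. Each head roughly doubles the odds in favor of the unfair coin.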
