Modeling conditional probability

Problem:

We are given three coins: one has heads on both faces, the second has tails on both faces, and the third has a head on one face and a tail on the other. We choose a coin at random, toss it, and the result is heads. What is the probability that the opposite face is tails?

Solution:

If $p = P(\text{Two headed coin was chosen | Heads came up}) = \displaystyle{\frac{\frac 13}{\frac 12} = \frac 23}$, then the probability that the opposite face is tails is $1 − p = \displaystyle{\frac 13}$.
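(As an aside, not part of the quoted solution: a quick simulation can be used to check the $\displaystyle{\frac 13}$ figure numerically. The sketch below is my own; the coin and face encoding is an arbitrary choice.)

```python
import random

# Each coin is a pair of faces; the "H"/"T" string encoding is my own choice.
coins = [("H", "H"), ("T", "T"), ("H", "T")]

trials = 100_000
heads_up = 0         # tosses whose upturned face is heads
opposite_tails = 0   # of those, how often the hidden face is tails

for _ in range(trials):
    coin = random.choice(coins)   # pick one of the three coins uniformly
    up = random.randrange(2)      # pick which face lands up
    if coin[up] == "H":
        heads_up += 1
        if coin[1 - up] == "T":
            opposite_tails += 1

print(opposite_tails / heads_up)  # should be close to 1/3
```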

I'm new to probability and would like to see how this solution works in more detail.

My questions:

1. The sample space is $\{HH, HT, TT\}$. The condition that heads came up corresponds to the event $\{HH, HT\}$, and the event that the two-headed coin was chosen corresponds to $\{HH\}$. We want the probability of $\{HH, HT\} \cap \{HH\} = \{HH\}$. The probability of choosing $HH$ is $\displaystyle{\frac 13}$ by the uniform probability law. Is this correct?

2. When we calculate that $P(\text{Heads came up}) = \displaystyle{\frac 12}$, what's the sample space? Is it $\{H, T\}$ or $\{HH, HT, TT\}$? Thanks.

edit:

I think I finally understand how to model the sample space here.

Choosing a coin and then tossing it is a sequential experiment, and we keep track of which face lands up, so the order matters: for example, $HT$ (head up, tail down) is different from $TH$ (tail up, head down).

Our sample space is $\{H_1H_2, H_2H_1, T_1T_2, T_2T_1, TH, HT\}$, where the first symbol in each outcome is the face that lands up and the subscripts distinguish the two faces of the two-headed and two-tailed coins.

Thus

$P(\text{two-headed coin was selected} \cap \text{heads came up}) = P(\{H_1H_2, H_2H_1\}) = \frac 26 = \frac 13$

by the uniform probability law, and

$P(\text{heads came up}) = P(\{H_1H_2, H_2H_1, HT\}) = \frac 36 = \frac 12$, also by the uniform probability law.
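A short enumeration sketch of the same six equally likely outcomes (the labels below are my own, chosen to mirror the model above) reproduces these counts:

```python
from fractions import Fraction

# The six equally likely outcomes as (coin, upturned face), mirroring
# {H1H2, H2H1, T1T2, T2T1, TH, HT} above.
outcomes = [
    ("two-headed", "H"), ("two-headed", "H"),   # H1 up or H2 up
    ("two-tailed", "T"), ("two-tailed", "T"),   # T1 up or T2 up
    ("fair", "T"), ("fair", "H"),               # tail up (TH) or head up (HT)
]

heads_up = [o for o in outcomes if o[1] == "H"]
two_headed_and_heads = [o for o in heads_up if o[0] == "two-headed"]

print(Fraction(len(two_headed_and_heads), len(outcomes)))  # 1/3
print(Fraction(len(heads_up), len(outcomes)))              # 1/2
```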

Best Answer

For your question 1, you can apply Bayes' theorem by noting that the event you are after is the same as having chosen the coin with both an H and a T face, given that you tossed an H:

$$P(\text{coin is HT}|\text{toss H}) = \frac{P(\text{toss H} | \text{coin is HT}) \, P(\text{coin is HT})}{P(\text{toss H})}$$ and plugging in the appropriate values.
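For concreteness, the values from the problem are $P(\text{toss H} \mid \text{coin is HT}) = \frac 12$, $P(\text{coin is HT}) = \frac 13$, and $P(\text{toss H}) = \frac 12$ (computed below), which give

$$P(\text{coin is HT}\mid\text{toss H}) = \frac{\frac 12 \cdot \frac 13}{\frac 12} = \frac 13,$$

matching the $1 - p = \frac 13$ in the quoted solution.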

For your question 2 (i.e. the denominator in the above), you can use the Law of Total Probability (LOTP) to partition the sample space:

$$P(\text{toss H}) = P(\text{toss H} | \text{coin is HH})P(\text{coin is HH}) + P(\text{toss H} | \text{coin is HT})P(\text{coin is HT}) + P(\text{toss H} | \text{coin is TT})P(\text{coin is TT})$$

and evaluating the probabilities.
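Spelling that out: the two-headed coin always shows heads, the fair coin shows heads half the time, and the two-tailed coin never does, so

$$P(\text{toss H}) = 1 \cdot \frac 13 + \frac 12 \cdot \frac 13 + 0 \cdot \frac 13 = \frac 12.$$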
