Solved – How do we know that the probability of rolling 1 and 2 is 1/18


Since my first probability class I have been wondering about the following.

Calculating probabilities is usually introduced via the ratio of the "favored events" to the total number of possible events. In the case of rolling two 6-sided dice, the number of possible outcomes is $36$, as displayed in the table below.

\begin{array} {|c|c|c|c|c|c|c|}
\hline
&1 & 2 & 3 & 4 & 5 & 6 \\
\hline
1 & (1,1) & (1,2) & (1,3) & (1,4) & (1,5) & (1,6) \\
\hline
2 & (2,1) & (2,2) & (2,3) & (2,4) & (2,5) & (2,6) \\
\hline
3 & (3,1) & (3,2) & (3,3) & (3,4) & (3,5) & (3,6) \\
\hline
4 & (4,1) & (4,2) & (4,3) & (4,4) & (4,5) & (4,6) \\
\hline
5 & (5,1) & (5,2) & (5,3) & (5,4) & (5,5) & (5,6) \\
\hline
6 & (6,1) & (6,2) & (6,3) & (6,4) & (6,5) & (6,6) \\
\hline
\end{array}

If we were therefore interested in calculating the probability of the event A, "rolling a $1$ and a $2$", we would see that there are two "favored events" and compute the probability of the event as $\frac{2}{36}=\frac{1}{18}$.

Now, here is what has always made me wonder. Suppose it were impossible to distinguish between the two dice, and we only observed them after they were rolled; for example: somebody gives me a box, I open the box, and inside there is a $1$ and a $2$. In this hypothetical scenario we could not tell the dice apart, so we would not know that there are two possible outcomes leading to this observation. Our possible events would then look like this:

\begin{array} {|c|c|c|c|c|c|}
\hline
(1,1) & (1,2) & (1,3) & (1,4) & (1,5) & (1,6) \\
\hline
& (2,2) & (2,3) & (2,4) & (2,5) & (2,6) \\
\hline
& & (3,3) & (3,4) & (3,5) & (3,6) \\
\hline
& & & (4,4) & (4,5) & (4,6) \\
\hline
& & & & (5,5) & (5,6) \\
\hline
& & & & & (6,6) \\
\hline
\end{array}

and we would calculate the probability of event A as $\frac{1}{21}$.

Again, I am fully aware that the first approach leads to the correct answer. The question I keep asking myself is:

How do we know that $\frac{1}{18}$ is correct?

The two answers I have come up with are:

  • We can check it empirically. As much as this interests me, I must admit that I have not done it myself, but I believe the check would confirm it.
  • In reality we can distinguish between the dice (say, one is black and the other blue), or we can throw one before the other, or we simply know about the $36$ possible outcomes, and then all the standard theory works.
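The empirical check in the first point is easy to carry out. Here is a minimal Monte Carlo sketch, assuming two fair, independent dice:

```python
import random

random.seed(12345)  # fixed seed for a reproducible run

trials = 1_000_000
hits = 0
for _ in range(trials):
    a = random.randint(1, 6)  # first die
    b = random.randint(1, 6)  # second die
    if {a, b} == {1, 2}:      # event A: a 1 and a 2, in either order
        hits += 1

print(hits / trials)
```

The printed frequency comes out close to $1/18 \approx 0.0556$ rather than $1/21 \approx 0.0476$, which already singles out the first model.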

My questions to you are:

  • What other reasons are there for us to know that $\frac{1}{18}$ is correct? (I am fairly sure there must be a few, at least technical, reasons, which is why I posted this question.)
  • Is there some basic argument against assuming that we cannot distinguish between the dice at all?
  • If we assume that we cannot distinguish between the dice and have no way to check the probability empirically, is $P(A) = \frac{1}{21}$ even correct, or did I overlook something?

Thank you for taking the time to read my question; I hope it is specific enough.

Best Answer

Imagine that you threw your fair six-sided die and you got ⚀. The result was so fascinating that you called your friend Dave and told him about it. Since he was curious what he'd get when throwing his fair six-sided die, he threw it and got ⚁.

A standard die has six sides. If you are not cheating, it lands on each side with equal probability, i.e. $1$ time in $6$. The probability that you throw ⚀ is therefore $\tfrac{1}{6}$, the same as for any other side. The probability that you throw ⚀ and your friend throws ⚁ is $\tfrac{1}{6} \times \tfrac{1}{6} = \tfrac{1}{36}$, since the two throws are independent and we multiply probabilities of independent events. Put differently, there are $36$ equally likely ordered pairs, which can easily be listed (as you already did). The probability of the opposite event (you throw ⚁ and your friend throws ⚀) is also $\tfrac{1}{36}$. The events "you throw ⚀ and your friend throws ⚁" and "you throw ⚁ and your friend throws ⚀" are mutually exclusive, so we add their probabilities: $\tfrac{1}{36} + \tfrac{1}{36} = \tfrac{2}{36}$. Among all the possible arrangements, exactly two meet this condition.
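The counting above can also be verified mechanically. A short sketch that enumerates all ordered pairs:

```python
from itertools import product

# All ordered outcomes of two distinguishable dice
outcomes = list(product(range(1, 7), repeat=2))
print(len(outcomes))  # 36

# Outcomes showing a 1 and a 2, in either order
favorable = [pair for pair in outcomes if set(pair) == {1, 2}]
print(favorable)  # [(1, 2), (2, 1)]

print(len(favorable) / len(outcomes))  # 2/36 = 1/18
```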

How do we know all of this? Well, on the grounds of probability theory, combinatorics, and logic, but those three need some factual knowledge to rely on. On the basis of the experience of thousands of gamblers, and some physics, we know there is no reason to believe that a fair six-sided die lands on each side with anything other than equal probability. Similarly, we have no reason to suspect that two independent throws are somehow related or influence each other.

You could imagine a box with tickets labeled with all the $2$-combinations (with repetition) of the numbers from $1$ to $6$. That would limit the number of possible outcomes to $21$ and change the probabilities. However, if you try to interpret such a model in terms of dice, you would have to imagine two dice that are somehow glued together. That is something very different from two dice that function independently, each of which can be thrown alone and lands on each side with equal probability without affecting the other.
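To see concretely why the $21$ tickets cannot all be equiprobable when they are produced by two independent dice, one can group the $36$ ordered outcomes by their unordered pattern (a small sketch):

```python
from collections import Counter
from itertools import combinations_with_replacement, product

# The 21 "tickets": unordered pairs with repetition
tickets = list(combinations_with_replacement(range(1, 7), 2))
print(len(tickets))  # 21

# Count how many ordered outcomes of two fair dice map to each ticket
counts = Counter(tuple(sorted(pair)) for pair in product(range(1, 7), repeat=2))
print(counts[(1, 2)], counts[(1, 1)])  # 2 1

print(counts[(1, 2)] / 36)  # 2/36 = 1/18
```

A mixed ticket like $(1,2)$ absorbs two ordered outcomes while a double like $(1,1)$ absorbs only one, so under two independent dice the ticket-box model's uniform $\frac{1}{21}$ cannot hold.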

All that said, one needs to note that such models are possible, just not for things like dice. For example, in particle physics it emerged from empirical observations that the Bose–Einstein statistics of indistinguishable particles (see also the stars-and-bars problem) is more appropriate than the distinguishable-particle model. You can find some remarks about those models in Probability or Probability via Expectation by Peter Whittle, or in volume one of An Introduction to Probability Theory and Its Applications by William Feller.
