[Math] Deal or No Deal: Monty Hall

Tags: game-theory, monty-hall, probability

This question was inspired by another question posted today: Monty Hall Problem Extended.

The comments and answers there made a great point about increasing the number of doors to 100 or more, and using that as a way to visualize why switching is always the best choice when explaining the problem to others.

And then I was thinking about the game show Deal or No Deal. For those unfamiliar with it: there are 26 cases, each containing an amount of money ranging from \$0.01 to \$1,000,000. You choose one case, and it's "yours" and out of play (this is analogous to choosing the first door in the Monty Hall problem). Throughout the game you open 24 of the remaining cases, and you see how much money was in each one.

In the end, you are left with 2 cases: "your" case, that you chose in the beginning, and the only other case you didn't open. This is where it becomes Monty Hall: you can either choose to keep your case, or switch cases and get the other one.

So what I'm wondering is, does the Monty Hall logic of "always switch doors/cases" apply here? The differences:

1) It's not a case of there being simply 1 car and a bunch of goats. Every case contains a different amount of money. You aren't always going to end up with a choice between a million dollars and something small… The two remaining cases might end up being \$10,000 and \$250,000. Or it might be \$10 and a million dollars. Or \$10 and \$100.

2) I think part of what makes Monty Hall work is that the car always remains in play. Your first choice has a 1/26 probability of selecting the car/million-dollar case. But in Deal or No Deal, the million-dollar case can be eliminated partway through the game. So I'm thinking that probably changes things.

My first vague thoughts are… If you make it to the end and the million-dollar case is still in play, Monty Hall applies and you should switch cases. Because it's the same idea: I had a 1/26 shot at the million, 24 cases have been eliminated, so it's much more likely that the other case has the million.

But if the million is eliminated while you're playing, what then? Does Monty Hall no longer help us, since the probability that we hold the million-dollar case is now zero? I'm trying to think of a way to decide whether or not to switch, in an attempt to end up with the case containing the most money. We know that \$1,000,000 is no longer available. But is there anything we can do to decide which case is likely to be more valuable? Or is this outside Monty Hall's bounds?

Best Answer

The key is: Monty knows where the car is (and will never open that door). We don't know where the million dollars is, so we MIGHT open that case. As an illustration, we look at how the tree diagram differs between the two situations.

Suppose we have 3 doors, A, B and C, and our car/million is behind door A. We further assume we will always switch. (Once we understand this case, we can extend it to $n$ doors and see that the situation is similar.)

Case 1: Monty Hall problem *(tree diagram)*

If we switch, $P($Win$) = \frac{2}{3}$.
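To sanity-check the $\frac{2}{3}$ figure, here is a minimal Monte Carlo sketch of the classic game with an always-switch strategy. The function name and trial count are my own choices, not from the original answer:

```python
import random

def monty_hall_switch_wins(trials=100_000):
    """Estimate P(win) for an always-switch player in classic Monty Hall.

    Monty knows where the car is and always opens a goat door
    that is not the player's pick.
    """
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Monty opens a door that is neither our pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        # Switching means taking the one remaining unopened door.
        switched = next(d for d in range(3) if d != pick and d != opened)
        wins += (switched == car)
    return wins / trials

print(monty_hall_switch_wins())  # ≈ 0.667
```

Running this gives an estimate close to $\frac{2}{3}$, matching the tree diagram.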

Case 2: Deal or No Deal scenario *(tree diagram)*

Notice that the assumption in the question is that we only look at games in which the million has not been opened, so we are in essence calculating a conditional probability. If we switch,

$P($Win $|$ Million not opened$) = \displaystyle \frac{P(\textrm{Win}\cap \textrm{Million not opened})}{P(\textrm{Million not opened})} = \frac{\frac{1}{6}+\frac{1}{6}}{\frac{1}{6}+\frac{1}{6}+\frac{1}{6}+\frac{1}{6}}=\frac{1}{2}$.
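The same conditioning can be checked by simulation: play many random games, throw away the ones where the million gets opened, and count how often switching wins among the survivors. This sketch uses 3 cases to match the tree diagram above, but the `n` parameter (my own addition) lets you confirm the answer stays $\frac{1}{2}$ for 26 cases too:

```python
import random

def dond_switch_wins(n=3, trials=200_000):
    """Estimate P(win | million not opened) for an always-switch player
    in an n-case Deal or No Deal: pick one case, open n-2 of the rest
    uniformly at random, then switch to the last unopened case.
    """
    wins = survived = 0
    for _ in range(trials):
        million = random.randrange(n)
        pick = random.randrange(n)
        others = [c for c in range(n) if c != pick]
        random.shuffle(others)
        opened, remaining = others[:-1], others[-1]
        if million in opened:
            continue  # million eliminated mid-game; condition fails
        survived += 1
        wins += (remaining == million)  # switching wins iff the other case has it
    return wins / survived

print(dond_switch_wins(n=3))   # ≈ 0.5
print(dond_switch_wins(n=26))  # ≈ 0.5
```

This agrees with the conditional-probability calculation: because the cases are opened blindly rather than by a host who knows the contents, surviving to the end carries no information favoring either remaining case, and switching is a coin flip.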
