The Monty Hall problem

This problem, known as the Monty Hall problem, is famous for being bizarre and counter-intuitive. It is in fact best to switch doors, and this is not hard to prove. In my opinion, the reason it seems so strange the first time one encounters it (myself included) is that humans are simply bad at thinking about probability. What follows is essentially how I have justified switching doors to myself over the years.

At the start of the game, you are asked to pick a single door. There is a $1/3$ chance that you have picked correctly, and a $2/3$ chance that you are wrong. This does not change when one of the two doors you did not pick is opened. The second time you choose, you are really choosing between your first guess being right (probability $1/3$) and your first guess being wrong (probability $2/3$). Clearly it is more likely that your first guess was wrong, so you should switch doors.
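If the verbal argument feels slippery, it is easy to check by brute force. Here is a minimal simulation sketch (Python; the helper name `play` and the door labels are mine, not part of the problem) that plays the standard game many times and estimates the win rate for staying versus switching:

```python
import random

def play(switch):
    """One round of the standard three-door game; return True if the player gets the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    first_pick = random.choice(doors)
    # The host opens a door the player did not pick and that hides a goat.
    opened = random.choice([d for d in doors if d != first_pick and d != car])
    # Switching means taking the one remaining closed door.
    remaining = next(d for d in doors if d != first_pick and d != opened)
    final_pick = remaining if switch else first_pick
    return final_pick == car

trials = 100_000
for switch in (False, True):
    wins = sum(play(switch) for _ in range(trials))
    print(f"switch={switch}: win fraction ~ {wins / trials:.3f}")
```

With enough trials the two estimates settle near $1/3$ (stay) and $2/3$ (switch).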

This didn't sit well with me when I first heard it. To me, it seemed that the situation of picking between two doors has a certain kind of symmetry: the prize is behind one door or the other, with equal probability. Since that is not the case here, I was led to ask where the asymmetry comes from. What causes one door to be more likely to hold the prize than the other? The key is that the host knows which door has the prize, and opens a door that he knows does not have the prize behind it.

To clarify this, say you choose door $A$, and are then asked to choose between doors $A$ and $B$ (no doors have been opened yet). There is no advantage to switching in this situation. Say you are asked to choose between $A$ and $C$; again, there is no advantage in switching. However, what if you are asked to choose between (a) the prize behind door $A$ and (b) the better of the two prizes behind doors $B$ and $C$? Clearly, in this case it is to your advantage to switch. But this is exactly the same problem as the one you've been confronted with! Why? Precisely because the host always opens (hence gets rid of) the door that you did not pick which has the worse prize behind it. This is what I mean when I say that the asymmetry in the situation comes from the knowledge of the host.
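One way to see that the host's knowledge is doing the work is to simulate two hosts: one who always opens an unchosen goat door (the real game), and one who opens an unchosen door at random, where we simply discard the games in which the car is accidentally revealed. This "ignorant host" contrast case is my own addition, not part of the original problem; a sketch in Python:

```python
import random

def trial(host_knows):
    """Play one game; return (goat_was_revealed, switching_would_win)."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    unchosen = [d for d in doors if d != pick]
    if host_knows:
        # Knowledgeable host: always opens an unchosen goat door.
        opened = random.choice([d for d in unchosen if d != car])
    else:
        # Ignorant host: opens an unchosen door at random (may reveal the car).
        opened = random.choice(unchosen)
    revealed_goat = (opened != car)
    remaining = next(d for d in unchosen if d != opened)
    return revealed_goat, (remaining == car)

trials = 200_000
for host_knows in (True, False):
    results = [trial(host_knows) for _ in range(trials)]
    valid = [win for goat, win in results if goat]  # condition on seeing a goat
    print(f"host knows: {host_knows}, "
          f"P(switch wins | goat revealed) ~ {sum(valid) / len(valid):.3f}")
```

The knowledgeable host gives the familiar $2/3$ for switching; with the ignorant host the conditional probability drops to $1/2$, which is exactly the symmetric situation my intuition originally expected.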


To understand why your odds increase by changing door, let us take an extreme example first. Say there are $10000$ doors. Behind one of them is a car and behind the rest are goats. Now, the probability of choosing the car is $1\over10000$ and the probability of choosing a goat is $9999\over10000$. Say you pick a random door, which we call $X$ for now. According to the rules of the game, the game show host now opens all the doors except for two (your door and one other), never revealing the car. You now have the option to switch. Since the probability of not choosing the car initially was $9999\over10000$, it is very likely that you did not choose the car. So if door $X$ hides a goat and you switch, you get the car. In other words, as long as you pick a goat on your first try, switching will always give you the car.

If we return to the original problem where there are only 3 doors, we see that the exact same logic applies. The probability that you choose a goat on your first try is $2\over3$, while the probability of choosing the car is $1\over3$. If you choose a goat on your first try and switch, you will get the car; if you choose the car on your first try and switch, you will get a goat. Thus, the probability that you get the car if you switch is $2\over3$ (which is more than the initial $1\over3$).
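For reference, the same reasoning written once for a general number of doors $n$ (so $n = 3$ is the original game and $n = 10000$ is the extreme example above): switching wins exactly when the first pick hides a goat, so

$$P(\text{win by switching}) = P(\text{first pick hides a goat}) = \frac{n-1}{n},$$

which is $2\over3$ for $n = 3$ and $9999\over10000$ for $n = 10000$.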


Here is an equivalent formulation of the Monty Hall problem, which folks on Twitter seem to like.

If you solve this problem, you have solved the original. To see the connection, you can imagine boxers hidden behind the doors.

There are three boxers. Two of the boxers are evenly matched (i.e. 50-50, no draws!); the other boxer will beat either of them, always.

You blindly guess that Boxer A is the best and let the other two fight.

Boxer B beats Boxer C.

Do you want to stick with Boxer A in a match-up with Boxer B, or do you want to switch?

Intuitively, witnessing Boxer B beat Boxer C gives you information about Boxer B. The advantage of switching no longer feels "paradoxical."

Details: If Boxer B is best, then the probability of witnessing Boxer B beat Boxer C is equal to 1. By contrast, if the initially chosen Boxer A is best, then this probability is equal to 1/2 (B & C are evenly matched, with no draws). This means that the event you witnessed is twice as likely if Boxer B is best as it is if Boxer A is best (i.e. 1 vs. 1/2). This is the information you have received. Bayes' rule now tells us that the initial 1-to-1 odds in favor of Boxer B in a match-up against Boxer A must double, i.e. there are now 2-to-1 odds in favor of Boxer B being the best boxer. This means there are now 2 chances in favor of Boxer B for every 1 chance in favor of Boxer A, i.e. 2 chances out of 3 total chances. This is equivalent to saying that the probability is now 2/3 that Boxer B is the best boxer. You should switch.
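In symbols (notation mine), writing the update in odds form:

$$\frac{P(B \text{ best} \mid B \text{ beats } C)}{P(A \text{ best} \mid B \text{ beats } C)}
= \frac{P(B \text{ beats } C \mid B \text{ best})}{P(B \text{ beats } C \mid A \text{ best})}
\cdot \frac{P(B \text{ best})}{P(A \text{ best})}
= \frac{1}{1/2} \cdot \frac{1/3}{1/3} = 2,$$

and since $P(B \text{ beats } C \mid C \text{ best}) = 0$, only $A$ and $B$ remain in the running, giving $P(B \text{ best} \mid B \text{ beats } C) = 2/3$.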

Notes:

This formulation has the advantage of providing clear intuition for the solution to the problem as stated, which is equivalent to the famous version presented by Marilyn vos Savant.

By contrast, the current top response since 2012 on math.stackexchange correctly solves a different problem, namely the prior probability of winning if you commit to switching regardless of which door Monty opens. What that response is missing is an argument connecting the posterior (conditional) probability to the prior. For a justification, see my comment there, or you can read this paper, pages 151--153.

Update: I googled and it appears that the first to point out a framing like this were Burns & Wieth, Journal of Experimental Psychology: General, 2004.