The Monty Hall Problem
A space for learning, creating, and cutting tough topics down to size
Read the full article with interactive games and visualizations at: jameshorsley.me/articles/monty-hall-problem
TL;DR
The Monty Hall Problem is a math puzzle framed through a game show. In the game you are shown three doors and have to guess which door has a prize hidden behind it. After you make your first guess (e.g. "I choose Door 1") the game show host, named Monty, opens one of the other doors that does not contain the prize. Monty then asks if you want to keep your first door choice or switch to the only other remaining door. Should you switch?
When the question was first posed, there were two primary opinions: 1. It doesn't matter: there are only two closed doors left, so it's a 50/50 choice either way. 2. Always switch: by opening a losing door, the host gave you useful information that makes switching a 2/3 chance of winning.
The correct answer is (2) you should always switch as it gives you the best chance of winning.
The rest of this article covers my exploration into why this is true through intuitive, empirical, and mathematical explanations.
Background
I recently learned about the Monty Hall Problem1, which is a probability puzzle based on a long-running US game show called Let's Make a Deal2, hosted by Monty Hall3. Like many popular game shows, the problem taps into deep aspects of risk and probability that we can relate to intuitions from our everyday lives. I became captivated by the problem and, as is often the case, this resulted in me excitedly trying to explain it to my family. In doing so, I realized that while I could explain the standard setup of the problem, I didn't understand it deeply enough to explain it without getting "handwavy." Following Richard Feynman's guidance that "if you can't explain something in simple terms, you don't understand it", this post is my attempt to explore and explain the Monty Hall Problem, in public, in the hopes of internalizing what it really means. Notably, my family asked great questions about how the problem relates to other probabilistic choice situations, such as multiple-choice quiz questions, which are interesting to explore.
A Problem in Probability (Original Problem)
The Monty Hall problem was originally put forward in Letters to the Editor of The American Statistician4 with the following entertaining description that directly riffs off the actual TV game show:
Game Show Script
Monty: One of the three boxes labeled A, B, and C contains the keys to that new 1975 Lincoln Continental. The other two are empty. If you choose the box containing the keys, you win the car.
Contestant: Gasp!
Monty: Select one of these boxes
Contestant: I'll take box B
Monty: Now box A and box C are on the table and here is box B (contestant grips box B tightly). It is possible the car keys are in that box! I'll give you $100 for the box.
Contestant: No, thank you.
Monty: How about $200
Contestant: No!
Audience: No!!
Monty: Remember that the probability of your box containing the keys to the car is 1/3 and the probability of your box being empty is 2/3. I'll give you $500
Audience: No!!
Contestant: No, I think I'll keep this box.
Monty: I'll do you a favor and open one of the remaining boxes on the table (he opens box A). It's empty! (Audience applause). Now either box C or your box B contains the car keys. Since there are two boxes left, the probability of your box containing the keys is now 1/2. I'll give you $1000 cash for your box.
Contestant: I'll trade you my box B for the box C on the table
Monty: That's weird!!!
Source: Letters to the Editor. (1975). The American Statistician
The letter then asked, "But is Monty right?" Monty claims the probability changed from 1/3 to 1/2 after he opened Box A, which would mean the contestant's chance of winning is the same whether they keep Box B or switch to Box C.
Ask Marilyn (Simplified Problem)
A simpler statement of the game came in 1990 when a reader submitted a question to Marilyn vos Savant's Ask Marilyn column in Parade magazine5. It was this simpler version that made the problem famous as other readers reacted to Marilyn's answer, which was akin to going viral in the 1990s. We'll focus on the Ask Marilyn framing in this article as it's easier to analyze: it removes the game show "trading" aspects and focuses on the core question of probability. For a little background, the Ask Marilyn column has been running since 1986, and Marilyn held the highest recorded IQ (228) in the Guinness Book of Records until that category was removed from the book.
Simplified Monty Hall Problem
Suppose you're on a game show, and you're given the choice of three doors: - Behind one door is a car; behind the others, goats. - You pick a door, say #1, and the host, who knows what's behind the doors, opens another door, say #3, which has a goat. - He then says to you, "Do you want to pick door #2?" Is it to your advantage to switch your choice?
Marilyn's (Correct) Answer
Yes; you should switch. The first door has a 1/3 chance of winning, but the second door has a 2/3 chance.
Source: Ask Marilyn: Game Show Problem
Mathematical Controversy
Marilyn's answer turned out to be controversial among the mathematical community. After publishing, she received a deluge of mail from readers telling her how wrong she was (specifically snail mail, since this was the early 90's). The overwhelming consensus among the complaints was that, because there were only two doors left, it was a 50/50 decision between the two doors. The vehement, overconfident, and frankly offensive tone of some reader replies is worth noting from a psychological perspective, as it highlights how deeply the problem seemed to challenge people's intuitions and worldview.
Ask Marilyn: Reader Replies
"You blew it, and you blew it big! Since you seem to have difficulty grasping the basic principle at work here, I'll explain. After the host reveals a goat, you now have a one-in-two chance of being correct. Whether you change your selection or not, the odds are the same. There is enough mathematical illiteracy in this country, and we don't need the world's highest IQ propagating more. Shame!"
"Maybe women look at math problems differently than men."
"You are the goat!"
Source: Ask Marilyn: Game Show Problem
Why You Should Always Switch
Marilyn's answer made sense to me, but I didn't feel like I had an intuitive understanding of why it was correct, particularly in ways that would help me apply it in my everyday life. I've long found that working through different solutions and explanations for a single problem helps me build a mental model that sticks with me over time. For concepts that don't easily stick in my brain, I've often found there'll eventually be one specific explanation that finally makes the concept "click."
A memorable example of this for me was with Bloom filters6, a software engineering technique for quickly checking whether an item might already be present in a set (it can return false positives but never false negatives). I'd read up on Bloom filters enough to know how to use them but, no matter how many times I read about how and why they work, it was always "in one ear and out the other." That was until I finally had a eureka moment after reading one particular description that made Bloom filters stick. As such, to help understand the Monty Hall Problem better, I'll run through four different explanations of why you should always switch. Some explanations are intuition-based and others are more mathematical breakdowns.
Explanation #1 - A Million Doors
"Suppose there are a million doors, and you pick door #1. Then the host, who knows what's behind the doors and will always avoid the one with the prize, opens them all except door #777,777. You'd switch to that door pretty fast, wouldn't you?"
Source: Ask Marilyn: Game Show Problem
Monty's actions aren't random. Monty knows where the prize is and this affects his actions. When Monty acts he is leaking information out to you in useful ways. The million doors explanation emphasizes that, the more doors Monty opens, the more glaring it becomes that there's one remaining door that Monty decided not to open. At the start of the game you don't know anything about where the prize might be. This means that the first door you choose is literally a one-in-a-million guess. While this is nowhere near as bad as your odds of winning the Mega Millions jackpot (about 1 in 290,472,336), it's still not something you'd want to bet your life savings on. In other words, you shouldn't have much confidence in your first choice.
Monty then opens 999,998 of the remaining doors that don't contain a prize, guided by his knowledge of where the prize is. When he asks if you want to switch, you're faced with the possibility that either:
- you picked correctly the first time (a one-in-a-million chance), or
- the prize was behind one of the other 999,999 doors and Monty has condensed the odds of those 999,999 doors down into one single option.
In this context it seems obvious that you're more likely to win by switching. The same principle is at play in the three-door game; the million-door version just makes it intuitively obvious.
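This intuition is easy to check numerically. Below is a minimal sketch (my own code, not the article's interactive game): since Monty opens every non-prize door except one, switching wins exactly when your first pick was wrong, so the switch win rate should approach (n - 1)/n.

```python
import random

def play(n_doors, switch, rng):
    """One n-door Monty Hall round. Monty opens every non-prize door
    except one, so switching wins exactly when the first pick was wrong."""
    prize = rng.randrange(n_doors)
    pick = rng.randrange(n_doors)
    return pick != prize if switch else pick == prize

def win_rate(n_doors, switch, trials=100_000, seed=0):
    """Estimate the win rate for a strategy over many seeded rounds."""
    rng = random.Random(seed)
    return sum(play(n_doors, switch, rng) for _ in range(trials)) / trials
```

Running `win_rate(3, switch=True)` lands near 2/3, while `win_rate(1_000_000, switch=True)` is essentially 1.0, matching the (n - 1)/n odds of the one door Monty leaves closed.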
Try the interactive "Million Doors" game →
Explanation #2 - Shell Game
"The winning odds of 1/3 on the first choice can't go up to 1/2 just because the host opens a losing door. To illustrate this, let's say we play a shell game. You look away, and I put a pea under one of three shells. Then I ask you to put your finger on a shell. The odds that your choice contains a pea are 1/3, agreed? Then I simply lift up an empty shell from the remaining other two. As I can (and will) do this regardless of what you've chosen, we've learned nothing to allow us to revise the odds on the shell under your finger."
Source: Ask Marilyn: Game Show Problem
While the million doors explanation focuses on the increased odds of the one door that Monty leaves closed, the shell game explanation focuses on why the probability of your first choice doesn't change. The intuition, particularly the framing of placing your finger on the shell, focuses on how Monty's actions can't have changed the probability of your original choice. This makes intuitive sense, as you didn't learn anything that should revise your original probability assessment of the shell you chose. However, I didn't find this as powerful a general intuition, because it relies on knowing when the odds of your original choice have or haven't changed, and that isn't always obvious.
For example, imagine that Monty instead opens both of the other doors, showing that neither contains the prize, and then asks if you want to keep the door you originally chose or take a cash prize of $100. In this scenario, you have clearly learned something that changes the odds that your chosen door contains the prize. Assuming Monty's not cheating, you now have a 100% chance of winning by keeping your original choice.
This new scenario doesn't change Marilyn's core point with the shell game example. But, it does highlight that you can't blindly rely on the intuition that if nothing physically changes with your original choice, the surrounding odds remain unchanged.
Explanation #3 - Experimental Evidence
We can also take a scientific, experimental approach. Our hypothesis is that switching provides a 2/3 chance of winning, so we can run experiments to see if the resulting data supports that assumption. Marilyn took this very approach and suggested her readers crowdsource the experimental data:
I'm willing to put my thinking to the test with a nationwide experiment. This is a call to math classes all across the country.
(...)
One student plays the contestant, and another, the host. Label three paper cups #1, #2, and #3. While the contestant looks away, the host randomly hides a penny under a cup by throwing a die until a 1, 2, or 3 comes up. Next, the contestant randomly points to a cup by throwing a die the same way. Then the host purposely lifts up a losing cup from the two unchosen. Lastly, the contestant "stays" and lifts up his original cup to see if it covers the penny. Play "not switching" two hundred times and keep track of how often the contestant wins. (...) Play "switching" two hundred times, also.
(...)
Wow! What a response we received! (...) We've received thousands of letters, and of the people who performed the experiment by hand as described, the results are close to unanimous: you win twice as often when you change doors. Nearly 100% of those readers now believe it pays to switch.
Source: Ask Marilyn: Game Show Problem
Marilyn's readers also ran computer simulations that verified the 2/3 figure. But, both for the fun of it and to see the results myself, I created an alternate version of the Monty Hall game that you can play below, called Egg Hunt.
In Egg Hunt there are three cards, and you have to guess which one has an Easter egg on the other side. The other two losing cards have a shoechicken on the back, which is something I doodled years ago and thought was an interesting alternative to Monty's goats. You win if you choose the egg card; you lose if you choose a shoechicken card. Egg Hunt has an autoplay mode to help rack up the number of games and show that the win rate really does converge to around 66% when switching and 33% when keeping.
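The cup experiment can also be sketched in a few lines of code. This is a hypothetical mock-up of the same protocol (function names are mine), with the host deliberately revealing a losing card from the two the player didn't choose:

```python
import random

def egg_hunt(switch, rng):
    """One round: the host reveals a losing card from the two unchosen,
    then the player either keeps their card or switches to the last one."""
    cards = [0, 1, 2]
    egg = rng.choice(cards)          # where the prize (egg) is hidden
    pick = rng.choice(cards)         # the player's first guess
    # The host opens a card that is neither the pick nor the egg,
    # choosing randomly between the two losers when pick == egg.
    opened = rng.choice([c for c in cards if c != pick and c != egg])
    if switch:
        pick = next(c for c in cards if c != pick and c != opened)
    return pick == egg

def experiment(switch, rounds=10_000, seed=42):
    """Play many rounds with one strategy and return the win rate."""
    rng = random.Random(seed)
    return sum(egg_hunt(switch, rng) for _ in range(rounds)) / rounds
```

Over thousands of rounds the switching win rate settles near 2/3 and the keeping win rate near 1/3, matching what Marilyn's readers found by hand.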
Try the interactive "Egg Hunt" game →
Explanation #4 - Use Math
Another approach is to use math to prove that switching gives you a 2/3 chance of winning. I like working through formal math proofs as they help me build confidence in the intuitions I've formed. I've also found that proofs help me build a mental model about the problem in ways that make it easier to explain the problem to others.
Bayesian Inference
A great way to show why switching gives you a 2/3 chance of winning is through Bayesian inference12. Bayesian inference uses a math equation called Bayes' theorem13 to help you estimate how likely it is that something you believe is correct. Bayes' theorem is named after Thomas Bayes14, an 18th-century English minister who was also an active mathematician and philosopher. Bayes' most famous work was published after he died but has gone on to become one of the most influential theorems of the information age. Bayes' theorem provides deep intuitions about how to think about probability, particularly as it relates to making decisions in the uncertain world in which we live. Breaking Bayes' theorem down in depth is beyond the scope of this post, but thankfully there are great explainers already available such as An Intuitive (and Short) Explanation of Bayes' Theorem15 and Bayes theorem, the geometry of changing beliefs (3Blue1Brown)16.
Now, let's apply Bayes to the Monty Hall Game...
Since Bayes helps you figure out how likely something is, we first need to figure out what it is we're measuring the likelihood of. In the game, the player is trying to figure out how likely it is that each door contains the prize. In our case there are three possibilities, the prize is either behind Door 1, 2, or 3.
| Prize Location | Probability written as... |
|---|---|
| Door 1 | P(prize behind door₁) = P(Prize₁) |
| Door 2 | P(prize behind door₂) = P(Prize₂) |
| Door 3 | P(prize behind door₃) = P(Prize₃) |
Conditional Probabilities
We need to include the events of our specific game when we calculate Bayesian probabilities: events such as which door we choose and which door Monty opens. We use these events to ask specific questions that we need probabilistic answers to, such as "what is the chance that the prize is behind Door 1, given that I chose Door 1 and Monty opened Door 3?" These are called conditional probabilities17 and are written as P(A|B), which looks complicated but simply means "given that event B happened, what is the chance that A happened too?"
The intuition behind conditional probabilities is simple and something we do all the time without much thought. We all make decisions based on things we observe when we're not completely sure what the outcome will be. Probabilities are just about putting a mathematical number to our uncertainty, and conditional probabilities are how we incorporate any surrounding context to get better answers. Uncertainty is a very deep and important topic that we'll come back to in this article, but is also one I'll be writing a lot more about over time.
To make it more concrete, here are a few examples, including one about the Monty Hall Problem:
| Question | Observation | Conditional Probability |
|---|---|---|
| Should I wear a rain jacket? | It's raining outside | P(should wear jacket | it's raining) |
| Should I charge my phone? | Phone on 8% battery | P(should charge phone | 8% battery) |
| Is the prize behind Door 1? | Monty opened Door 3 | P(Prize₁ | Monty opened door 3) |
We then plug the values into Bayes' theorem, which is as follows:
Bayes' Theorem
P(Explanation | Event) = P(Event | Explanation) · P(Explanation) / P(Event)
The table below breaks down each term:
| Symbol | Technical Name | The probability that... | Example |
|---|---|---|---|
| P(Explanation | Event) | posterior probability | your explanation is correct, given the event happened | If Monty opens Door 3 and shows you it's empty, the probability that the prize is behind Door 3 is obviously zero i.e. P(Prize₃ | Open₃) = 0 |
| P(Event | Explanation) | likelihood | the event would've happened given your explanation | If you believe the prize is behind Door 1 (and you chose Door 1), Monty is equally likely to open Door 2 or Door 3, i.e. P(Open₂ | Prize₁) = P(Open₃ | Prize₁) = 1/2 |
| P(Explanation) | prior probability | your explanation is correct, regardless of the event | Before any doors are opened, each door is equally likely to hide the prize, so P(Prize₁) = P(Prize₂) = P(Prize₃) = 1/3 |
| P(Event) | marginal likelihood | the event would happen under any possible explanation | If you chose Door 1 and Monty then opens Door 3, the probability that Monty opened Door 3 combines that event's chances across the prize being behind Door 1, Door 2, or Door 3 (worked out in Step 3 below). |
Bayes Applied to Monty Hall
We'll now apply these Bayesian techniques to the Monty Hall problem.
Scenario: The specific game situation we'll use is that you choose Door 1, Monty opens Door 3, and then you have to choose between keeping Door 1 or switching to Door 2.
Without Loss of Generality
While we will only run through the calculation for this one specific game scenario to show switching is best, the math applies equally to all the other ways you can play the game, whether you pick Door 1, 2, or 3. In math this is called "without loss of generality" (WLOG) which applies here because of symmetries in the gameplay. We can assume that no one door in the game is preferred by Monty when he places the prize or when he opens a door. This is similar to fair dice being rolled, choosing a card from a "fair" and sufficiently shuffled deck of cards, or a balanced roulette wheel that gives equal chances of the ball landing in any pocket.
Step 1: You choose 'Door 1'
At the beginning of the game you have no clue where the prize is. This means that, from your perspective, all the doors are equally likely to contain the prize, making your choice equivalent to rolling a 3-sided die. This assumes you have no useful knowledge about the game, such as knowing that Monty prefers to put the prize behind one door over another. For this analysis we'll assume it's a fair game and the prize placement is fully random. Since there are three doors, each door has a one-in-three chance of containing the prize. In our scenario, you choose Door 1.
Step 2: Monty opens 'Door 3'
Your choosing Door 1 affects what Monty does next. In our scenario Monty opens Door 3, which becomes a Bayesian event we can use to help figure out what we should do next. To use this event in our Bayesian inference, we have to figure out what the probabilities were for each of Monty's possible actions:
- Prize behind Door 1: if the prize is behind your chosen door (Door 1), Monty randomly chooses between opening Door 2 or Door 3, a 50/50 decision, so P(Open₃ | Prize₁) = 1/2.
- Prize behind Door 2: if the prize is behind Door 2, Monty must open Door 3, because he can't open the door you chose (Door 1) and he can't open Door 2 without revealing the prize, so P(Open₃ | Prize₂) = 1.
- Prize behind Door 3: Monty never opens the prize door, so the probability that he would open Door 3 if the prize were behind it is zero: P(Open₃ | Prize₃) = 0.
Step 3: Switch or Keep?
By opening Door 3, Monty has given us useful information that we can use to decide whether to switch to Door 2 or keep Door 1. To use this new information, we plug it into Bayes' theorem, which tells us the probability of winning for each choice. However, there's still one value we haven't calculated: the marginal likelihood. The marginal likelihood is the combined probability that Monty would have opened Door 3, wherever the prize is, and is written as P(Open₃). Thankfully, it just combines the values we already calculated in steps 1 & 2 with some addition and multiplication:

P(Open₃) = P(Open₃ | Prize₁) · P(Prize₁) + P(Open₃ | Prize₂) · P(Prize₂) + P(Open₃ | Prize₃) · P(Prize₃) = (1/2 · 1/3) + (1 · 1/3) + (0 · 1/3) = 1/2
As shown in the table below, Bayesian inference says that switching to Door 2 gives us a 2/3 chance of winning, versus a 1/3 chance if we stick with Door 1.
| Prize behind... | P(Win) | Equations |
|---|---|---|
| Door 1 | 1/3 | P(Prize₁ | Open₃) = P(Open₃ | Prize₁) · P(Prize₁) / P(Open₃) = (1/2 · 1/3) / (1/2) = 1/3 |
| Door 2 | 2/3 | P(Prize₂ | Open₃) = P(Open₃ | Prize₂) · P(Prize₂) / P(Open₃) = (1 · 1/3) / (1/2) = 2/3 |
| Door 3 | 0 | P(Prize₃ | Open₃) = 0 |
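The table's posteriors can be reproduced with exact fractions. Here's a minimal sketch (the dictionary encoding of the scenario is my own) for the case where you chose Door 1 and Monty opened Door 3:

```python
from fractions import Fraction

# Priors: before any doors open, each door is equally likely.
prior = {d: Fraction(1, 3) for d in (1, 2, 3)}

# Likelihoods P(Open₃ | prize behind d), from steps 1 & 2.
likelihood = {1: Fraction(1, 2),  # Monty picks randomly between doors 2 and 3
              2: Fraction(1),     # Monty is forced to open door 3
              3: Fraction(0)}     # Monty never reveals the prize

# Marginal likelihood P(Open₃): combine over every prize location.
marginal = sum(likelihood[d] * prior[d] for d in prior)

# Posteriors P(prize behind d | Open₃) via Bayes' theorem.
posterior = {d: likelihood[d] * prior[d] / marginal for d in prior}

print(posterior)  # {1: Fraction(1, 3), 2: Fraction(2, 3), 3: Fraction(0, 1)}
```

Using `fractions.Fraction` instead of floats keeps the 1/3 and 2/3 results exact rather than approximate.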
Further Analysis - How much do we learn from Monty?
The million doors explanation got me thinking about how much useful information the contestant learns when Monty opens the non-prize doors, where useful information means knowledge that actually helps the contestant win. Intuitively, it feels like the contestant learns more when Monty opens 999,998 doors than when he opens just one. Thankfully, there's a mathematical way to model the amount of information the contestant gains, using a branch of math called information theory7.
Information theory is a fascinating area of math that was created in 1948 by Claude Shannon8 with his seminal paper A Mathematical Theory of Communication9. While he isn't a household name, Shannon is one of the most important and influential figures in 20th-century science and mathematics. Information theory lets you measure both the amount of information available (i.e. knowledge) when making a decision and the amount of uncertainty you're dealing with. The ability to measure information and uncertainty has become a foundational pillar of modern Artificial Intelligence, particularly for large language models (LLMs) like ChatGPT and Gemini. In information theory, uncertainty has the fancy name entropy, so if you see the word entropy here just read it as uncertainty. It's called entropy mostly because the math works very similarly to something called entropy in physics10.
Information theory helps us measure how much knowledge we gain when Monty opens the non-prize doors, through something called information gain11. To measure it, we first calculate the uncertainty before Monty opens the doors, then calculate how much uncertainty remains afterwards, and then find the difference.
Uncertainty (entropy) formula
There is a standard formula for calculating the uncertainty of a situation. It uses a few fancy symbols but, in essence, it has you add up, for each possible outcome, that outcome's probability multiplied by how surprising it would be for that outcome to happen:

H = −∑ P(outcome) · log₂(P(outcome))

The table below describes the three core parts of the formula:
| Term | Symbol | Explanation |
|---|---|---|
| Sum | ∑ | This is just a quicker way to say you're adding lots of things together. For example, if you want to add together all the numbers between 1 and 100 you can either write out 1+2+3+4+5+...+100 or ∑(n=1 to 100) n, they mean the same thing. |
| Probability | P(outcome) | As described earlier, the probability of an outcome is written as P(outcome) and is always a number between zero and one. P(outcome)=0 means it'll never happen and P(outcome)=1 means it will definitely happen. |
| Logarithm | log(x) | Surprise is measured using something called a logarithm. The details of logarithms aren't important here; the main interesting parts are that they grow slowly and are negative when the input is less than one. |
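As a quick sanity check on the formula, here's a minimal sketch of it in code (the helper name is mine). A fair coin flip carries exactly 1 bit of uncertainty, and a fair three-way choice carries log₂(3) ≈ 1.585 bits:

```python
import math

def entropy(probs):
    """Shannon uncertainty in bits: add up P(outcome) times its
    surprise, -log2(P(outcome)), skipping impossible outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))        # fair coin: 1.0 bit
print(entropy([1/3, 1/3, 1/3]))   # fair three-way choice: ~1.585 bits
```

Note that a certain outcome, `entropy([1.0])`, gives 0 bits: no uncertainty at all.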
Calculating information gain
Let's start off by working out the information gained for the regular, three-door version of the game. We'll take a situation where the contestant has chosen door 1 and then Monty opens door 3. This leaves the contestant with the choice to keep door 1 or switch to door 2.
Uncertainty before
The first step is to calculate the uncertainty before the doors are opened. To do that we just need the probability that each door contains the prize. Before Monty opens a door, each door is equally likely to contain the prize, so each has the same 1-in-3 chance:

P(Prize₁) = P(Prize₂) = P(Prize₃) = 1/3
We can then just plug those values into the uncertainty formula. A couple of notes on this: first, we'll replace door with d for brevity, and second, the unit of measurement we get at the end is bits. We get bits because we're using log₂, which is the standard base for measuring information. Putting this together it looks as follows:

H_before = −∑ P(d) · log₂(P(d)) = −3 · (1/3) · log₂(1/3) = log₂(3) ≈ 1.585 bits
Uncertainty After
The probabilities after Monty opens Door 3 come from the Bayesian analysis above:

P(Prize₁ | Open₃) = 1/3, P(Prize₂ | Open₃) = 2/3, P(Prize₃ | Open₃) = 0

Those values can be plugged straight into the uncertainty formula to calculate your uncertainty when choosing to switch or keep:

H_after = −(1/3) · log₂(1/3) − (2/3) · log₂(2/3) ≈ 0.918 bits
Information gain when Monty opens doors
Calculating the information gain is then simply finding the difference between the before and after values:

information gain = H_before − H_after ≈ 1.585 − 0.918 ≈ 0.667 bits

This difference gives us a concrete value for how much we learned when Monty opened one of the doors in the three-door game. It's important to note that we used the probability values from the Bayesian analysis above (e.g. 2/3 for switching), so this information gain doesn't prove those probabilities are correct; it just measures how much information we gained.
Is there more certainty with a million doors?
We can now circle back to run math on the intuition from Explanation #1: A Million Doors. The intuition was that we gain more information from Monty's actions in the million doors version of the game, such that we can switch with more certainty. We'll use the situation where the contestant chooses door 1 and then Monty opens all the other doors except for door 2, leaving a choice between door 1 (keep) and door 2 (switch). If our earlier intuition was correct, we should see the information gain being much larger than with three doors, with the uncertainty trending to zero as more doors are opened.
I worked through the equations for a million doors, then generalized the calculations for any number of doors. I turned the data into a graph and data table that you can toggle between below. The data end up being quite clear that our intuition was correct: you can be far more certain about switching in the million-door case. After Monty opens 999,998 of the doors, your uncertainty about where the prize is drops effectively to zero.
Information Gain When Monty Opens 999,998 Doors
Setup: n = 1,000,000
Uncertainty Before:
P(Prize₁) = P(Prize₂) = ... = P(Prizeₙ) = 1/n, so uncertainty_before = log₂(1,000,000) ≈ 19.93 bits
Uncertainty After:
P(Prize₁) = 1/1,000,000, P(Prize₂) = 999,999/1,000,000, P(Prize₃) = ... = P(Prizeₙ) = 0, so uncertainty_after ≈ 0.0000214 bits
Information Gain:
information_gain ≈ 19.93 - 0.0000214 ≈ 19.93 bits
Formulas:
uncertainty_after = -(1/n) · log₂(1/n) - ((n-1)/n) · log₂((n-1)/n)
information_gain = -log₂(1/n) - uncertainty_after
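The formulas above translate directly into code. Here's a sketch (function names mine) that reproduces the behavior for any number of doors:

```python
import math

def uncertainty_after(n):
    """Entropy (bits) after Monty opens n-2 doors: the picked door keeps
    probability 1/n and the one unopened door carries (n-1)/n."""
    return -(1/n) * math.log2(1/n) - ((n - 1)/n) * math.log2((n - 1)/n)

def information_gain(n):
    # uncertainty_before = -log2(1/n) = log2(n), since all n doors start equal
    return math.log2(n) - uncertainty_after(n)
```

For n = 3 the gain works out to exactly 2/3 of a bit; for n = 1,000,000 it's about 19.93 bits, with roughly 0.00002 bits of uncertainty left over, i.e. effectively zero.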
Parting Thoughts
Working through all the different explanations and intuitions for the Monty Hall problem sent me down some fun and interesting paths. The net result, after all the digging, was that I not only ended up with a deeper intuition about the problem, but also expanded my knowledge of and confidence in related areas. The core takeaway of "always switch" is simple. But a key takeaway for me was that Monty is using his own, private knowledge in ways that create signals to you (the player) which work to your advantage. This is an important intuition to bring into decision making in our everyday lives.
FAQs (Please suggest new ones)
Q. How does this relate to multiple choice questions?
The fact that you should always switch in the Monty Hall Problem seems like it might apply to picking answers on a multiple-choice question. However, on further reflection they're not equivalent.
Let's frame the difference through another game show. Imagine a game show like "Who wants to be a Millionaire?" where you face a multiple-choice question and have various "lifelines" that help you figure out the answer. A first key difference is that with multiple-choice questions you're often not just guessing: your own internal knowledge (priors) affects the probability you assign to each answer. Even if you're not well-versed in the subject domain of a question, you can apply other general knowledge that might be useful. For example, if the question were "who was the 32nd president of the USA?" with the answers a) Washington, b) JFK, c) FDR, d) Clinton, you might not know if it was JFK or FDR, but most Americans would know it's definitely not Washington (given he was the first) and unlikely to be Clinton (since Clinton was fairly recent and the current president is the 47th). The Monty Hall game, by contrast, is constructed so that the contestant has no clue where the prize is, and the only signal they get is from Monty's actions.
Now, if you're playing "Who wants to be a Millionaire?" and are truly guessing because you have no clue what the answer is, then the game starts to become more like the Monty Hall problem. For example, consider the situation where:
- you understand the rules of the game: it's multiple choice, you have lifelines, etc.
- you're faced with a question but the answers are in a foreign/alien language that you can't read
- you randomly picked answer b
- before committing to b you decide to use your 50/50 lifeline, which causes two of the wrong answers to be removed (e.g. a and d)
- you now have to decide between keeping b or switching to c, should you switch?
This situation is effectively the same as the Monty Hall Problem, and you should switch, but only if the lifeline is guaranteed not to remove the answer you're leaning toward. If instead the 50/50 removes two wrong answers at random, with no knowledge of your uncommitted pick, then whenever your answer happens to survive the two remaining answers are genuinely 50/50, and switching gains you nothing. Either way, you have to conjure up extreme, unrealistic examples of the Millionaire game to make it similar to Monty Hall. In Millionaire the questions and answers themselves are designed to provide enough information to pick the right answer, as long as you have sufficient background knowledge (priors in the Bayesian sense). If you had knowledge/priors about where the prize was in Monty Hall, you'd effectively be 'cheating' in a way that violates the spirit of the game.
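A simulation can probe this lifeline scenario. This hypothetical sketch (the two lifeline models and all function names are my own framing) compares an "informed" 50/50 that is guaranteed to avoid the answer you're leaning toward (the Monty Hall analogue) with a "blind" one that removes two wrong answers at random, discarding the rounds where your pick gets eliminated:

```python
import random

def lifeline_round(switch, blind, rng):
    """One 4-answer round. Returns True/False for a win, or None if the
    blind lifeline removed your (uncommitted) pick and the scenario
    never arises."""
    answers = [0, 1, 2, 3]
    correct = rng.choice(answers)
    pick = rng.choice(answers)
    wrong = [a for a in answers if a != correct]
    if blind:
        removed = rng.sample(wrong, 2)      # blind to your pick
        if pick in removed:
            return None
    else:
        removed = rng.sample([a for a in wrong if a != pick], 2)
    if switch:
        pick = next(a for a in answers if a != pick and a not in removed)
    return pick == correct

def switch_win_rate(blind, rounds=100_000, seed=1):
    rng = random.Random(seed)
    results = (lifeline_round(True, blind, rng) for _ in range(rounds))
    valid = [r for r in results if r is not None]
    return sum(valid) / len(valid)
```

Under the informed model switching wins about 3/4 of the time; under the blind model, conditioned on your pick surviving, switching wins only about half the time. The equivalence to Monty Hall really does hinge on the host (or lifeline) knowing your choice and avoiding it.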