Monty Hall Dilemma
Probability puzzles require you to weigh all the possibilities and pick the most likely outcome.
Suppose you are on a game show. The host puts two envelopes in front of you, telling you that one contains twice as much as the other. You select envelope A, reach inside, and there is $200. Now, the host offers you a chance to switch whatever you picked for the other envelope. Do you?
First, think it through: the other envelope contains either $400 or $100, so you have a fifty/fifty chance of either gaining $200 or losing $100. The odds seem to be in your favor. Here is the strange part, though: you would have arrived at the same conclusion if you had picked the other envelope. Personally, I would stick with what I have, because it is still a 50/50 chance of increasing your total. What you have run into is the famous two-envelope paradox. If you have any better reason for switching, leave a comment.
Apr 25, 2002
|I would say leave it because you chose A a.k.a. no.1. He said that ONE contains more than the other.|
Apr 26, 2002
|I think I would switch as I am greedy!|
But it's a 50/50 chance whether the other one contains more or less; really it's a choice, at 50/50 odds, between winning $200 and losing $100. I think the PAYOUT works in your favour, 2:1. If it was $100 or $300, I'd probably stick; I think that's an even chance.
Or if it was in the game show situation, I think you would have to consider the type of show, its track record for giving out money, and how much was at stake. For instance if it was $200 000 I'd probably stick because there is too big a drop for me!
Apr 26, 2002
|cath, did you not see the part about one having twice as much as the other?|
Apr 27, 2002
|Although it's a 50/50 chance, you have to look inside the minds of the show's organizers. My opinion is that they like nice round numbers. Since the one you opened contained $200, I would think that they started at $100 and you opened the envelope containing twice the amount. $400 is just too odd a figure to be in the next envelope, since they would have to start with a figure of $200. I think most prizes come in the form of $50, $100, $500 and $1000... but then again the organizer might try to out-think me and...|
Apr 30, 2002
|Yes, I did see the bit about twice as much, but I'll put it like this: one contains 100 times as much. You pick out $100. Should you swap?|
You have the potential to only get $1, or $100 000.
What would you do then?
This problem is about the fact that your chances of winning more or less are 50/50. But you stand to win more money than you would lose, even though it's a 50/50 chance.
If you were playing the odds, you should swap. The average payout if you swap (when you already have $200) is (100 + 400)/2 = 250, I think. So if you were picking the best odds, you would go for it.
The funny part is that the odds are the same if you pick the higher one. This is still because the odds of winning or losing are 50/50, but you stand to win more than you might lose.
Apr 30, 2002
|By the way, sorry for adding the bit about what game show it is. That's not what this puzzle is about, but if you were in the same real-life situation, those are the real things you might also consider to help improve your odds!|
May 08, 2002
|Personally, I would swap - I assume I would get to keep the money found in the envelope and whether it's $100 or $400, it's more than I started with!|
May 08, 2002
|no. I'm not asking the odds.|
May 13, 2002
|If you're not asking about the odds, I don't know what you are on about.|
In relation to the odds: the paradox is that the odds are the same....
If you are trying to guess a random 4-digit PIN, how many guesses will it take you to find it? There are certain set odds. You may get it on the first try, which is 1/10000, but it's still a possibility. Say you just start guessing 0000, 0001, 0002, and so on all the way to 9999. It doesn't matter whether the number you are trying to guess is 0001 or 9998; the odds are still the same. That it might happen sooner is just "luck". Applying the same line of thinking to this puzzle: should you start at 9999 and work backwards, or at 0000 and count up? Even if the number is, say, 0001, it doesn't matter, because the odds are the same. You don't know what the number is, so you can't say one way is better than the other. Probability-wise they are the same.
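The sweep-direction claim above can be checked with a quick simulation; a minimal Python sketch (scaled down to 2-digit codes so it runs fast, with an arbitrary trial count):

```python
import random

def guesses_needed(pin, order):
    """Count how many tries an exhaustive sweep takes to hit the pin."""
    for i, guess in enumerate(order, start=1):
        if guess == pin:
            return i

N = 100                    # 2-digit codes 00..99 (scaled down from 4 digits for speed)
up = list(range(N))        # count up: 00, 01, ..., 99
down = list(reversed(up))  # count down: 99, 98, ..., 00

trials = 5000
avg_up = sum(guesses_needed(random.randrange(N), up) for _ in range(trials)) / trials
avg_down = sum(guesses_needed(random.randrange(N), down) for _ in range(trials)) / trials
# Both averages land near (1 + N) / 2 = 50.5: neither sweep direction is better.
print(avg_up, avg_down)
```

Both directions need about 50.5 guesses on average, matching the claim that for a uniformly random target, no sweep order beats another.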
May 17, 2002
|The comments are more fun than the teaser|
May 21, 2002
|This gets more interesting if there are 3 envelopes: one with the 'jackpot' prize, and two with nothing in them. Once you select your envelope, the host opens one of the remaining 2, which is empty (there will always be at least 1 remaining empty envelope after your selection, and the host knows which contains what). Should you now stick, or swap to the third envelope? Surely you have a 50/50 chance of the jackpot either way, right? No: if you swap, you have a 2/3 chance of the jackpot (since 2 out of 3 times your original pick would be wrong, and the other wrong answer has been eliminated); if you stick, only a 1/3 chance. Sorry if this is an old one, though.|
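The 1/3-versus-2/3 claim in this three-envelope variant is easy to check with a short simulation (a minimal Python sketch; the trial count is arbitrary):

```python
import random

def play(switch):
    """One round of the 3-envelope game: 1 jackpot, 2 empty, host opens an empty one."""
    jackpot = random.randrange(3)
    pick = random.randrange(3)
    # The host opens an envelope that is neither your pick nor the jackpot.
    opened = next(i for i in range(3) if i != pick and i != jackpot)
    if switch:
        # Swap to the one remaining unopened envelope.
        pick = next(i for i in range(3) if i != pick and i != opened)
    return pick == jackpot

trials = 20000
stick = sum(play(switch=False) for _ in range(trials)) / trials
swap = sum(play(switch=True) for _ in range(trials)) / trials
print(stick, swap)  # stick ≈ 1/3, swap ≈ 2/3
```

Switching wins exactly when the original pick was wrong, which happens 2 out of 3 times.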
May 21, 2002
|That's already on this site.|
Jul 01, 2002
|Hmmm, ever heard of Schrödinger's cat? This is the same paradox: you simply cannot know until you have opened both envelopes, and until you do, the other contains BOTH half as much and twice as much.|
Jul 05, 2002
|Yeah, that's true, but you have a 50/50 chance of winning $200 or losing $100.|
Because the payout is twice as much for winning, I think if you are playing the odds you have to go for the other envelope. If it were win $100 or lose $100, then I think you would be equally justified to either stick or gamble.
Jul 06, 2002
|true, but if you had picked the other envelope to start, you would come to the same conclusion|
Jul 10, 2002
|OK, let me try to explain the problem as simply as possible. There are two envelopes; one contains twice as much as the other. You pick an envelope, and it contains $200. You could have the one that is twice as much as the other, making the other one $100; or you could have the smaller one, half as much as the bigger one, making the other envelope $400. Either way, you have a 50/50 chance of winning more or getting less.|
Jul 19, 2002
|I know, but if you gamble you take two things into consideration: what is your chance of winning, and how much will you win?|
Put the problem like this: if the other envelope contains either £100,000,000 more or £1 less, your odds of losing are still 50/50 and the odds of winning are still 50/50, but I think EVERYONE would gamble.
All I'm saying is that I think the payout is worth the risk: win 100 or lose 50, with 50/50 odds either way.
Aug 01, 2002
|The (potential) fallacy in this problem is the assumption that it's 50-50 whether you gain or lose by switching. That's not necessarily so.|
Pusandave has the right approach: what you really need to figure out is whether the producers were equally likely to have put $300 or $600 in the two envelopes.
Just because there are two unknown possibilities doesn't mean that they are equally likely; that's the "Life on Mars" fallacy. (Either there is life on Mars or there isn't, and we don't know which. So it's got to be 50-50, right? Wrong.) You win $200 when the producers chose to put $600 in the two envelopes, and you lose $100 when they chose to put $300 in the two envelopes. What are the relative probabilities of those two alternatives? That's what you have to figure out if you are ever faced with such a situation for real.
Dec 18, 2002
|The answer is that you should switch envelopes, based on expected value, that is:|
The other envelope contains either $100 or $400, and there is an equal chance of either possibility.
Therefore the expected value of keeping your current envelope is $200 (given), and the expected value of switching envelopes = 50%*($100) + 50%*($400) = $50 + $200 = $250.
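This expected-value calculation, written out in Python (assuming, as the comment does, that $100 and $400 are equally likely):

```python
# Naive model from the comment: the other envelope holds $100 or $400,
# each with probability 1/2.
keep_ev = 200.0
switch_ev = 0.5 * 100 + 0.5 * 400
print(keep_ev, switch_ev)  # 200.0 250.0
```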
Dec 20, 2002
|Yes, but that logic should be immediately disproved, because you would have come to the same conclusion if you had picked the other envelope.|
Jan 03, 2003
|The solution is not to give a better chance of picking the correct answer about switching, but the expected value you gain or lose by switching.|
There are a few ways to look at this:
1) Run the simulation 100 times: with a 50/50 shot you will be right 50% of the time and wrong 50% of the time.
50 times you would end up with $100 (50*$100 = $5,000), and 50 times you would end up with $400 (50*$400 = $20,000).
Now the average over these 100 simulations is ($5,000 + $20,000)/100 = $250. So if you switched 100 times, being wrong 50 times and right 50 times, then over the long haul you would average $250 per game, which is greater than the $200 you are guaranteed.
However, the answer would be different if the options were $200 to keep the envelope and either $250 or $50 for switching.
2) The second way to look at this is that you are gaining more than you are risking by switching. Taking that to the extreme say you were still guaranteed the same $200 for keeping your envelope, but to switch you would either win $100 or $10,000. As you can see you are gaining much more than you are losing with switching envelopes (providing a 50/50 shot).
Trust me, I have a bachelor's in Probability & Statistics and a master's in Applied Mathematics, and in my coursework we did this type of problem many times.
It all has to do with what you can gain vs. what you can lose. In this case you gain more than you lose.
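The 100-run thought experiment in point 1 can be sketched as a simulation, under the commenter's assumption that the other envelope is $100 or $400 with equal probability (the trial count is raised to smooth out noise):

```python
import random

trials = 100000
# Each play: you hold $200 and switch; the other envelope is $100 or $400,
# each with probability 1/2 (the commenter's model).
winnings = [random.choice([100, 400]) for _ in range(trials)]
average = sum(winnings) / trials
print(average)  # ≈ 250, above the guaranteed 200
```

Under that 50/50 assumption, the long-run average of switching is indeed about $250. The assumption itself is exactly what later commenters dispute.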
Jan 03, 2003
|YOU ARE MISSING THE WHOLE POINT!!!!! Your thinking is logical, but it is all disproved when you consider that if you had chosen the other envelope to start with, you would reach the same answer!|
Jan 03, 2003
|The answer I gave is correct if the question is how to win the most money over multiple plays of the same game: as I have shown, it is in your best interest to switch envelopes (not for a single instance, but over many plays).|
The answer you give on the site gives your personal opinion as to why not to switch, such as "no matter what the gain, I don't want to risk losing any portion of the $200".
There is no mathematical or probability reasoning for your answer.
You are correct that I would have arrived at the same answer had I picked the other envelope, but over multiple random simulations more money would be won by switching envelopes.
Mar 14, 2003
|Dewtell's comment is the most pertinent here (and to many other of the doubtful probability solutions posted in this category). Mathematical probability counts equally likely outcomes. I see no mention in this teaser that there ever needs to be an amount of $400. There are 2 envelopes: one has $100 and the other has $200. The condition is satisfied; the host has told the truth. Unless you know something else, there is no mathematical way you can work out a probability, because you know nothing about whether $400 (maybe never) and $100 are equally likely. In the solution provided you have stated that the $100 and $400 are a 50/50 chance. Did you leave that information out of the teaser? There is nothing that suggests it in the wording of the teaser. If that is the case (50/50), then Predovich's comment is correct and you would opt for the change. This is a simple problem. On the other hand, if you take it at face value, you would have to suppose that the $400 might appear 1 in a million times (almost never), since you don't know anything about how likely it is. In that case you keep the $200 envelope.|
Jul 20, 2004
|I would switch. Either way you're not losing any money, you're just gaining $100 or $400.|
Nov 29, 2005
|Sorry if I'm sparking more, but... Mogmatt16, you say that since you would have the same argument regardless of which you chose, the argument is flawed. How exactly does this fact make the argument flawed? As for your comment, Jimbo, it's true that you do not know anything about which is more likely. So you'll have to estimate. If you estimate 100 to appear more than twice as much as 400, you would stay. If you estimate 100 to appear less than twice as much as 400, you would switch. If you estimate 100 to appear exactly twice as much as 400, you would be indifferent.|
Mar 02, 2006
|You guys are trying to apply probability-based decisions to a one-time deal. It does not work that way. Probabilities work for large numbers only. So, if you are able to repeat the envelope experiment more than 30 times, you're better off switching, since your payoff is guaranteed better (that's why casinos thrive). But if you are offered the switch only once, the problem becomes purely theoretical. If you switch, you have a 50% chance to get $200 richer, and a 50% chance to get $100 poorer. Payoff calculation makes little practical sense on a one-time deal. So I guess what you have to ask yourself is: do I feel lucky? Well, do you, punk?|
Mar 26, 2006
|I doubt that is true, YVAU. I give you a one-time chance: either take a ticket in a lottery with 100,000 tickets and one winner ($100,000), or flip a coin and win $100,000 if it comes down heads. You say that probability would not influence your decision because it is a one-off? Anyway, Mogmatt, you said to give you a reason to change. Expected value is probability times return. If I keep the $200, I expect to keep $200. If I change, I expect 50% x $400 + 50% x $100 = $250. Since my expected return is higher if I change, then I should change (provided we are guaranteed that the organisers put in $600 as often as they put in $300). The fact that changing is better whichever amount I open does not necessarily flaw the argument.|
Jul 27, 2006
|Personally, I would swap, because the most obvious odds are in favor of the swap. I could think things through and decide after some deliberation, but most game shows have about a 1-hour time limit, and if I'm at the point of choosing my prize, I'm probably in the bonus round, which is usually near the end of the show. If I took time to think things through, I'd probably run out of time and have to come back next time, which leaves way too much to chance as far as something preventing me from reappearing.|
Mar 03, 2007
|Read "The Curious Incident of the Dog in the Night-Time". It is a book about an autistic math savant. It explains this problem simply (yet understanding it fully is still difficult). In a nutshell, the odds are mathematically better if you switch.|
Sep 29, 2007
|Uh, the Monty Hall problem deals with switching in a 3-choice game, after one choice is eliminated.|
Here the probabilities are still 50 percent for each, though the reward is better on one.
If, for example, the better envelope contained $10 more but you had a chance of losing $100, would you still switch?
Jul 26, 2008
|Consider this. You open one, while a friend opens the other. You reason that you will, on average, win by switching. SO DOES YOUR FRIEND. So, if the reasoning is sound (you are correct), you will both, on average, win. Contradiction.|
Jul 29, 2008
|Those of you who think you should switch should consider this argument: |
There are two possible strategies. One: open the first envelope and keep the result. Two: open the first envelope and switch to the second.
Whatever amount x you find in the first envelope, the 50/50 reasoning says the expected amount in the second envelope is 5/4*x. Therefore you should always switch after opening the first envelope.
But since you are going to switch envelopes regardless of what is in the first envelope, you don't even have to look in the first envelope before deciding to switch.
Thus your expected value of the game must be equal whether you keep the first envelope or switch to the second.
Where is the contradiction?
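One way to see where the 5/4*x reasoning slips is to simulate the game as it is actually set up: fix the pair of amounts first, then pick at random (a sketch assuming a $100/$200 pair; any pair behaves the same way):

```python
import random

def play(always_switch):
    # Fix the pair first, THEN pick one at random: this is the actual game.
    envelopes = [100, 200]
    random.shuffle(envelopes)
    pick, other = envelopes
    return other if always_switch else pick

trials = 100000
keep = sum(play(False) for _ in range(trials)) / trials
swap = sum(play(True) for _ in range(trials)) / trials
print(keep, swap)  # both ≈ 150: blind switching gains nothing
```

Once the pair is fixed before the choice, always-switch and never-switch have identical expected value; the 5/4*x figure comes from treating the observed amount as fixed and the pair as random, which is a different experiment.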
Aug 05, 2008
|bob45424, you hit the nail on the head. This is like the wallet thing, where you bet a buddy whoever has the least amount of money in their wallets wins the contents of both.|
This is the two-envelope paradox and NOT the Monty Hall problem. The Monty Hall problem has an easily understood solution: switching after the goat is revealed increases your chance of winning the car to 2/3, provided Monty knows where the car is and always reveals a goat. No paradox. No contradictions.
In the two-envelope problem there is a true paradox. If you and a friend each have an envelope, and one contains twice as much as the other, it seems you should always switch, because you have a 50% chance of gaining twice as much and a 50% chance of losing half as much. Assume any amount is in your envelope, let's say $100. You can gain another $100 or lose $50 of it. So the expected value of the bet is greater than 1, and you should switch. The paradox becomes evident when you consider that your friend can use the exact same logic and conclude he should also switch. A bet cannot be advantageous for both players. Therein lies the paradox.
In the case of this teaser, what's really happening is the same thing. Ultimately you have to determine whether there exists some "non-uniform prior distribution" of sums from which the amount in the envelope was selected. In other words, does an upper limit exist? And if so, can you estimate what it is, and therefore use this information in your decision to switch? If you conclude, for whatever reason, that the distribution of sums from which the amount in the envelope was drawn likely runs from $1 to $100, and you open it and find $98, then you should NOT switch. However, if you conclude that the prior distribution of sums is infinite, then you should always switch, regardless of the amount in the envelope.
To add one more wrinkle to all this consider this quote on the topic:
"The logician Raymond Smullyan questioned if the paradox has anything to do with probabilities at all. He did this by expressing the problem in a way which doesn't involve probabilities. The following plainly logical arguments lead to conflicting conclusions:
1. Let the amount in the envelope chosen by the player be A. By swapping, the player may gain A or lose A/2. So the potential gain is strictly greater than the potential loss.
2. Let the amounts in the envelopes be Y and 2Y. Now by swapping, the player may gain Y or lose Y. So the potential gain is equal to the potential loss."
So tell me now there is no paradox here. There is certainly a logical contradiction to the problem.
Aug 05, 2008
|I forgot to go back and touch base on the "wallet bet".|
Suppose you and another man are sitting at a bar. The bartender comes up and suggests the two of you engage in a wager where you compare the contents of your wallets and whoever has the LEAST cash wins the contents of both wallets.
It's the same as the two-envelope problem if neither of you has a good idea exactly how much is in your own wallet. Then you use the logic:
I have a 50-50 shot of winning. If I lose, I can only lose what's in my wallet, but if I win, I will certainly win more than that. Maybe much more! So I should take the bet.
The other guy uses the same logic and also takes the bet. Again, both players cannot possibly have an advantage on the same bet at the same time, yet in this case the plainly simple logic says they do! A paradox.
But now let's say you know you have between $200 and $300 in your own wallet. Then you may consider not taking the bet, especially if the guy looks like he may not have much money. But let's say you noticed the guy wearing a Rolex and a nice suit, his wallet looked kind of fat when he pulled it out earlier, and he mentioned he had just left a casino where he did "pretty well". Then you should consider taking the bet, because based on all this information you can reasonably assume he has more money than you, even if you have about $300 on you.
You see, it's all about the information you can obtain to help you make the decision to switch.
Let's take an example closer to the teaser.
You are on a game show and they offer you two envelopes. You pick one at random, open it up, and find $10,000. The host tells you one of the envelopes contains twice as much as the other. Do you want to switch?
Normally you should, because you have an equal chance of winning $10,000 or losing $5,000. But you have regularly watched this show on TV and have noticed that no contestant has ever won more than $10,000 in any game played on the show. See, now you have information you can use. You decide not to switch, because it is unlikely that you will win twice as much as anyone has ever won on that show before. You established a prior distribution with $10,000 as an upper limit.
Again, the only time you always switch, regardless of the amount in your envelope, is when you can assume there is an infinite, uniform prior distribution of sums from which the contents of the envelopes were drawn. In real life that would never truly be the case.
What is more interesting to me is when you have two envelopes and are not to look in either of them until you make your final pick. Once you pick one, you can use the same logic to conclude it's to your advantage to switch: "It's 50-50 and I can win twice as much as I can possibly lose."
But as soon as you switch, you can use the same logic to conclude you should switch back to your first pick. Then, when you make that switch, you should switch again, until you are just swapping back and forth between envelopes, never deciding on either one. There's absolutely nothing wrong with your logic, and yet it's insane to carry it out. So wouldn't it be more logical to just randomly choose one? Yes and no. Again, a paradox. It's just common sense that you should pick one and take the money.
Aug 11, 2008
|WRONG! WRONG! WRONG! THE ANSWER IS VERY WRONG! The amounts in the envelopes can't be completely random and unlimited between zero and infinity, because that would mean the average amount in the envelopes would be infinite.|
This means there must either be an upper limit on the amounts, or the amounts must be weighted so that a low amount is more likely than a high one.
So it is more likely that you'll get a smaller amount than a higher one when selecting the other envelope. The best strategy is to switch if yours is a low amount and keep the envelope if it's a high amount.
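Pixit's threshold idea can be illustrated with a toy bounded prior (a hypothetical setup: the smaller amount is uniform on $1..$100 and the cutoff is $100; neither number comes from the teaser):

```python
import random

def play(should_switch, cap=100):
    # The smaller amount x is drawn from a bounded prior; the pair is (x, 2x).
    x = random.randrange(1, cap + 1)
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    pick, other = envelopes
    return other if should_switch(pick) else pick

trials = 100000
always = sum(play(lambda amt: True) for _ in range(trials)) / trials
cutoff = sum(play(lambda amt: amt <= 100) for _ in range(trials)) / trials
# Always switching averages the same as never switching (~75.75 here),
# but "switch only on low amounts, keep high amounts" does strictly better.
print(always, cutoff)
```

With any bounded prior, blind switching gains nothing, while a threshold strategy that keeps amounts the prior could not have produced as the smaller value comes out ahead.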
Oct 10, 2008
|Pixit, how can you say "Best strategy is to switch if it's a low amount and keep the envelope if it's a high amount"? How do you define "low amount" and "high amount"? Since you have no way of knowing what's in the other envelope, you have no way of knowing whether what you have in your hand is "low" or "high". If you knew that the envelopes held $100 and $200, then of course if you picked the $200 one you wouldn't switch; likewise if you knew it was a choice between $200 and $400. The point is you DON'T know whether you are playing a $100 + $200 game or a $200 + $400 game.|
OK here is why it is better to switch:
You are either playing a game with $100 in one envelope and $200 in the other (let's call it Game Low), OR a game with $200 in one envelope and $400 in the other (let's call it Game High).
The thing to remember here is that you don't know which game you are playing.
The ideal situation here is to WIN Game High, that is how you make the most money.
Now think about this: by switching, your outcome MUST be that you either WIN Game High or LOSE Game Low.
The difference between winning and losing Game Low is less than the difference between winning and losing Game High! So you are basically wagering that you are playing Game High and have chosen the lower value, in which case switching brings you a larger reward than you will lose if you guessed wrong and are actually playing Game Low.
Jan 05, 2009
|You can switch if you want to, but, I have to tell you that it doesn't increase your "expected winnings".|
It doesn't decrease it either.
Think about it this way. This is a win, win game, you can't lose, you can only win. And you are just as likely to win the larger amount, or the smaller amount, whether you switch or not. It simply doesn't matter.
We trick ourselves when we start weighing the amount we lose by switching from the large prize to the small prize against the amount we gain by switching from the small prize to the large one. Again, no loss is possible: only a 1/2 chance to win big and a 1/2 chance to win small, whether we switch or not.
May 12, 2011
|Call it roulette (without the 0's) - would you bet $100 on red (or black) if you would either lose the $100 if wrong or win $300 if right? If someone gave you that $100, I don't see why you wouldn't. |