Let's Go Out to the Movies
Probability puzzles require you to weigh all the possibilities and pick the most likely outcome.
Gretchen and Henry were sent to their rooms for fighting in the house. They each separately voiced their protest to their father, insisting that the fight was nothing more than healthy sibling competition, and they each wanted to go out that afternoon to see a movie. He was moved by their stories, but wouldn't simply set them free. Instead, he devised a system. He went to each child's room with a penny, and told them that they would have to show up in the den in 10 minutes, and could choose to bring the penny with them or leave it in their respective rooms. Dad would then flip the one or two pennies brought to the den, and if every penny he flipped came up heads, the kids could go to the movies. If neither brought a penny, or if he flipped at least one tail, they would stay in their rooms until supper time.
The problem facing Gretchen and Henry was that neither knew what the other would do. It would be easy if they could collude -- one would bring a penny, and the other would not, giving them a 50% chance of going free -- but they did not have this luxury.
If they both act optimally, what is the probability that they will be free in time to see the movie?
Hint: If they both bring a penny 100% of the time, they will never be sent back without a flip, but they will only have a 25% chance of going free. Can they do better? Acting optimally means they may have to pursue each course of action some percentage of the time.
They should each bring the penny with probability 2/3, and they will go free 1/3 of the time.
There are a couple of ways of tackling this. The first is to say that what is optimal for Henry must also be optimal for Gretchen, so whatever probability one has of bringing the penny should equal the probability that the other brings the penny. If we set that probability equal to 'p', then the probability that they go free ('f') is:
p^2*(1/4) + p*(1-p)*(1/2) + (1-p)*p*(1/2) + (1-p)*(1-p)*0
Since the last term equals 0, this simplifies to:
p - (3/4)*(p^2) = f
From here, you could either plug in values of p from 0 to 1, finding that f is at its greatest (1/3) when p is equal to 2/3, or you could use calculus.
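For readers who want to try the plug-in-values approach, here is a minimal Python sketch (my addition, not part of the original solution) that sweeps p over a fine grid and picks the best value:

```python
# Evaluate f(p) = p - (3/4)*p^2, the probability the kids go free
# when each child independently brings a penny with probability p.
def freedom_prob(p):
    return p - 0.75 * p * p

# Sweep p from 0 to 1 in steps of 0.001 and keep the best value.
best_p = max((i / 1000 for i in range(1001)), key=freedom_prob)
print(best_p, freedom_prob(best_p))  # both land very close to 2/3 and 1/3
```

The grid search confirms the answer without any calculus: the maximum sits at p = 2/3, where f = 1/3.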
Warning: CALCULUS FOLLOWS!
Since this is a quadratic equation, the value will be at its maximum or minimum when its first derivative equals 0. The first derivative of f with respect to p is:
df/dp = (-3/2)p + 1
Setting this equal to 0, we get p = 2/3.
To determine if this is a maximum or a minimum, we need to take the second derivative:
d2f/dp2 = (-3/2)
Since the second derivative is negative, the value we found by setting the first derivative equal to 0 is a maximum, and we have our answer.
CALCULUS IS OVER
Another way we can solve this is to say that each needs to make a decision that will equalize their chance of going free REGARDLESS of the action taken by the other.
This point, where you get the same result regardless of the other person's actions, is known as a "Nash equilibrium". The brilliant mathematician John Nash (who was made famous in the book and movie "A Beautiful Mind") showed that such an equilibrium always exists, even when, as here, the correct strategy is a mixed one. In a cooperative game like this one, where the siblings' interests are perfectly aligned, the equilibrium also happens to give the best joint result; in general, a Nash equilibrium need not maximize the overall outcome.
So if Henry assumes that Gretchen will bring her penny with her none of the time, all of the time, or somewhere in between, he wants to pick a probable course of action (in game theory, a "mixed strategy") that will maximize his chances of going free. Let's just look at the two extremes (i.e., Gretchen will either bring her penny 100% of the time, or 0% of the time).
If Gretchen brings the penny 100% of the time, and Henry brings his penny with probability p, they will go free with probability:
p*1*(1/4) + (1-p)*1*(1/2)
If we assume Gretchen never brings her penny, then they will go free with probability:
p*(1/2) + (1-p)*0
Since Henry wants to be indifferent to Gretchen's actions, we can set these two probabilities of going free equal to each other. We then get:
(p/4) + (1-p)/2 = p/2 [setting equations equal]
(p/2) + (1-p) = p [multiplying through by 2]
1-p = (p/2) [subtracting p/2 from each side]
1 = (3/2)p [adding p to each side]
p = 2/3 [solving for p]
Gretchen will determine the same probability of bringing a penny by using the same logic.
Thus, they will each bring a penny 2/3 of the time, maximizing their joint chances of going free at 1/3.
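As a sanity check on the answer above, here is a short Monte Carlo sketch (my addition, not the author's) that simulates the mixed strategy directly:

```python
import random

def simulate(p, trials=200_000):
    """Estimate the probability the kids go free when each child
    independently brings a penny with probability p."""
    free = 0
    for _ in range(trials):
        # Count how many pennies arrive in the den (0, 1, or 2).
        pennies = (random.random() < p) + (random.random() < p)
        # They go free only if at least one penny is brought
        # and every flipped penny comes up heads.
        if pennies > 0 and all(random.random() < 0.5 for _ in range(pennies)):
            free += 1
    return free / trials

# p = 2/3 should land near 1/3; p = 1 (always bring) near 1/4.
print(simulate(2/3), simulate(1.0))
```

Running this a few times shows the 2/3 strategy consistently beating the always-bring strategy, matching the algebra.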
Jun 25, 2006
|The answer was confusing!|
Jun 27, 2006
|The probability of me trying to do another probability teaser after this one is 0...(hee hee)|
Jul 03, 2006
|i didn't understand a thing from the solution given. really confusing. i don't understand what "act optimally" means, so i just tried to find the probability that they'll have free time. i got 5/16 for the answer. really difficult teaser |
Jul 06, 2006
|This wasn't as hard as you made it out to be. I got it right, but I think I went cross-eyed after reading your answer! |
Jul 07, 2006
|They have to make a choice of either bringing a penny or not bringing a penny. They cannot bring a penny on 2 out of 3 occasions because there is only one occasion. To make sense out of this answer you would have to say that they had to report to the den every 10 minutes for the next hour, or some such construct which would allow them to make repeated choices of bringing or not bringing the penny.|
Jul 07, 2006
|I disagree, Jimbo. While the end result is that they either bring a penny or not, they can do this probabilistically; for example, they could glance at a clock, and decide to bring a penny if the second hand is in the first 40 seconds, and not if the second hand is in the last 20 seconds. This would cause them each to bring a penny with probability 2/3.|
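Brainjuice's clock trick can be sketched as code. Here `random.randrange` stands in for glancing at the second hand (an illustrative assumption; any source of randomness works):

```python
import random

def bring_penny():
    """Decide whether to bring the penny with probability 2/3,
    mimicking the 'first 40 of 60 seconds' clock trick."""
    second_hand = random.randrange(60)  # 0..59, like a second hand
    return second_hand < 40             # 40/60 == 2/3

# Rough frequency check: the fraction of True results approaches 2/3.
trials = 100_000
freq = sum(bring_penny() for _ in range(trials)) / trials
print(freq)
```

The point is that a one-shot decision can still be made probabilistically, which is exactly what the mixed strategy requires.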
Jul 08, 2006
|Completely agree with Brainjuice. |
Didn't understand the question (act optimally???).
Hadn't a clue what the answer meant.
Worked out that their chances of being free were 5/16.
Aug 09, 2006
|You don't need calculus. A little bit of algebra with parabolas is all you need to find the maximum for f.|
Sep 01, 2006
|Very good teaser, thank you.|
Sep 13, 2006
|THIS WAS SO CONFUSING!!!!!!!!!!!!!!!! I DIDN'T UNDERSTAND THE ANSWER AT ALL. IS THIS SOME KIND OF PRACTICAL JOKE OR SOMETHING? |
Oct 21, 2006
|Maybe they should just flip their penny to decide whether they should bring their penny.|
Oct 21, 2006
|Gretchen would probably just bring her penny no matter what because she wants to make sure Dad will have a penny to flip because that's how little baby sisters think. It's how big sisters think too. (I shouldn't assume she's the baby, but all girls are babies anyways) So I would leave my penny.|
Nov 10, 2006
|Psychology really has nothing to do with this, but whatever|
Nov 22, 2006
|Nolo contendere. Here's what I did:|
I separated penny-bringing from penny-flipping.
Penny-bringing has three outcomes: both bring a penny; only one brings penny; both don't bring. Therefore, there is a 1/3 chance of those things happening.
Penny-flipping has three outcomes:
two heads; two tails; head and tail. Therefore, there is a 1/3 chance of those things happening.
|I figured their best action would be for both to bring pennies. That would guarantee a chance to go see the new OJ movie.|
Apr 09, 2007
|dunno.. seems to me that flipping 2 coins gives you 1/4, not 1/3... the chances are HH, TT, HT, TH|
Sep 08, 2007
|I'm on the 5/16 team. (0.3125)|
2 people, 2 choices, 4 possibilities for coins to be brought.
Using chance of occurrence * chance of success we get:
1 of 4, no coins: 25% * 0% = 0
2 of 4, 1 coin: 50% * 50% = 0.25
1 of 4, 2 coins: 25% * 25% = 0.0625
Sep 10, 2007
|But AKMark, you are missing that they can change the probabilities of bringing those coins.|
Sep 27, 2007
|now for anyone lost in the answer: the chances of them bringing a coin or not (and thus the strategy that they should use) are represented by p, which is chosen by them. |
the different possibilities for bringing the coins are:
p^2 - they both bring the coin
p*(1-p) - only one brings a coin
(1-p)*p - the same as the former but the other way around
(1-p)*(1-p) - neither brings a coin
chances of winning the coin toss:
for 1 coin - 1 out of 2 possibilities to win both equally probable
for 2 coins - 1 out of 4 possibilities to win (as only 2 heads means a win)
for 0 coins - 0 chance
the calculation for them going free is therefore: each possibility of bringing coins multiplied by the chance to win the coin toss
y = p^2*(1/4) + p*(1-p)*(1/2) + (1-p)*p*(1/2) + (1-p)^2*0
y = (p^2)/4 + p*(1-p) + 0
y = (p^2)/4 + p - p^2
y = p - (3/4)*(p^2)
now you get a parabola that crosses the x-axis at two points and opens downward, so there is one maximal value, which we can find in one of two ways:
a) find the two roots where y equals 0
and take their average; that will be the tip of the parabola
using the quadratic formula: (-b +/- sqrt(b^2 - 4*a*c))/(2*a) (which i hope you know)
the roots you get are 0 and 4/3, whose average is 2/3; putting that in for p, you find that y(2/3) is 1/3, which is the best chance
b) using calculus, you look for the value of p at which the derivative (the slope of the tangent to the curve) equals zero (the maximal point of the parabola)
y' = -1.5p + 1
-1.5p + 1 = 0
1.5p = 1
p = 2/3
which of course gives a 1/3 chance
i hope this clarifies it a bit
Nov 03, 2007
|EXCELLENT. I just joined and this is the best one I've seen.|
Dec 31, 2007
|I got 1/3 but I did it a much easier way. In order to decide if they should bring a coin or not, just flip a coin!!!|
There are 9 possible outcomes.
no bring-no bring = lose.
no bring-bring = win or lose
bring no-bring = win or lose
bring bring = winwin, winlose, losewin, or loselose. Only 3 of those possibilities would set them free. 3/9 = 1/3. No calculus.
Jun 17, 2008
|The question needs to be reworded to make your answer correct. You said they would both act optimally. So, the first question they would answer is which option of bringing a coin yields the best results. Then they will choose this option and the probability will be computed from that choice. The complex answer you gave is great, but doesn't apply since they only get to do this once and they are going to choose the optimal choice every time.|
Optimally speaking: if you don't bring a coin, you're going to have either a 50% chance or a 0% chance. If you do bring a coin, you will have either a 50% chance or a 25% chance.
So, they would both choose to bring a coin every time. This easily yields a 25% chance to go.
You have to add the logical side to this probability question to get the real answer.
Jun 17, 2008
|jcbamb -- This is indeed a difficult concept, but here "acting optimally" really does mean applying some random chance to make sure they will bring the coin with probability 2/3. There are not a lot of situations in our lives where we are best off by choosing a mixed strategy, so it is difficult to pick another example as a comparison, but if they each create some random number generator to make sure they bring the coin with probability 2/3, they will leave with probability 1/3, which is better than the solution you have proposed.|
Nov 18, 2008
|Judging by the above comments, this puzzle seems to be way too complicated for a layperson to understand; the concept of the siblings *randomly* choosing whether to bring a coin or not can seem nonsensical at first. However, I am no layperson, and I enjoyed seeing a more difficult problem on this site for a change! |
Feb 07, 2009
|This is a great puzzle. And you explained it without even a MENTION of John Nash. Well done. |
For you naysayers, let me just say that I know this subject and the author is completely correct. Just because you are only going to do something once doesn't mean that it is wrong to choose the action probabilistically.
Sep 09, 2009
|did it like spockinasock... |
Sep 16, 2010
|A simple game theory problem, but I can see why it would be hard for those who haven't taken a course on game theory.|
Jun 01, 2011
|But Zag24, he DID mention John Nash. Not a bad thing, just pointing out that he was mentioned. ;)|