Back by popular demand (okay, no demand whatsoever):
The Monty Hall Problem
Three Doors. One of them is good (it leads to underwear models, funnel cake, etc.) and two are bad (alligators, doctoral theses). You pick one, but before we show you whether the door you picked is good or bad, we generously eliminate one of the bad doors from the other two. Now it's time to make your final decision: Switch to the only remaining door, or stay with your original pick. Should you switch or stay? Does it matter?
Coin Flip Game
You and I flip a coin at the same time, and continue flipping at the same pace until the game is over. You win as soon as you get two heads on consecutive flips. I win as soon as I get a head and a tail, in that order, on consecutive flips. (We tie if both things happen on the same flip.) Who, if anyone, has better odds?
Unlike last week, intuition initially suggests simple and uninteresting answers to both puzzles - "it doesn't matter which door" and "no one has better odds", respectively. Of course, those intuitive answers are not correct.
The Monty Hall Problem
SOLUTION
The intuitive (but wrong) answer, "it doesn't matter", appears to make sense because (1) there are now just as many good doors as bad doors, and (2) you still don't know if your original pick was good or bad. (1) and (2) are both correct assumptions, but the actual answer is that if you switch doors, you'll pick the good door twice as often.
Think of it this way: When you originally picked your door, you had a 1/3 chance of being right. Nothing has changed that fact. When I eliminated a bad door, I could do so regardless of whether your original pick was good or bad, so the elimination told you nothing new about your own door. There's still a 1/3 chance that your original door is good, and a 2/3 chance that the good door is somewhere else. But "somewhere else" now consists of one option instead of two.
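If the argument still feels slippery, it's easy to check empirically. Here's a quick Monte Carlo sketch (door labels and trial count are arbitrary choices, not part of the original puzzle):

```python
import random

def monty_hall_trial(switch):
    """Play one round; return True if the final pick is the good door."""
    doors = [0, 1, 2]
    good = random.choice(doors)   # where the underwear models and funnel cake are
    pick = random.choice(doors)   # your original pick
    # The host eliminates a bad door from among the two you didn't pick.
    eliminated = random.choice([d for d in doors if d != pick and d != good])
    if switch:
        pick = next(d for d in doors if d != pick and d != eliminated)
    return pick == good

trials = 100_000
stay = sum(monty_hall_trial(switch=False) for _ in range(trials)) / trials
swap = sum(monty_hall_trial(switch=True) for _ in range(trials)) / trials
print(f"stay wins:   {stay:.3f}")   # hovers around 1/3
print(f"switch wins: {swap:.3f}")   # hovers around 2/3
```

Run it a few times: staying wins about a third of the time, switching about two thirds.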
Coin Flip Game
SOLUTION
The intuitive answer is that the game is fair, because in any two consecutive flips, we both have a 25% chance of success, right? In reality, my odds of winning on a given flip eventually tend toward 50%, and your odds of winning on a given flip tend toward just over 19%. The mathematics turns out to be surprisingly complicated, involving patterns related to the Fibonacci sequence and the golden ratio. Figure that out on your own if you're so inclined. If you're not, here's a partial explanation of why the game isn't fair.
What are the chances of success on a given round? If your last flip was heads (good news for you), it's 50%. If your last flip was tails (bad news for you), it's 0%. Same for me. This logic holds as far as the second round. But if it's a later round and we're still playing, that means that nobody won on the previous round, and we can deduce a little more:
You didn't win on the previous round, and you needed HH, so your previous two flips could have been HT, TH, or TT. That's: one scenario in which you might win on your next flip, and two in which you can't.
I didn't win on the previous round, and I needed TH, so my previous two flips could have been HH, TH, or TT. That's two scenarios in which I might win on my next flip, and one in which I can't.
When you consider more and more previous flips, it gets increasingly complicated and the odds skew more and more in my (HT) favor. Here is a partial table of the odds of getting the second of two desired flips on a given round:

Round | You (HH) | Me (HT)
  2   |  25.0%   |  25.0%
  3   |  16.7%   |  33.3%
  4   |  20.0%   |  37.5%
  5   |  18.8%   |  40.0%
  6   |  19.2%   |  41.7%
For example, if we make it to the 6th round of flipping (meaning you haven't yet made two consecutive heads) then there is a 38.4% chance your most recent (5th) flip was heads, and thus a 19.2% chance you'll get your second consecutive head on that 6th flip.
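Those per-round odds can be computed exactly by enumerating every sequence of flips that keeps the game alive, along the lines of the counting above. A brute-force sketch (fine for small round numbers; the pattern strings "HH" and "HT" stand for your goal and mine):

```python
from itertools import product
from fractions import Fraction

def round_odds(pattern, round_no):
    """Exact chance of completing `pattern` ("HH" or "HT") on this round,
    given it hasn't appeared in any earlier round."""
    n = round_no - 1  # flips already made
    survivors = ["".join(s) for s in product("HT", repeat=n)
                 if pattern not in "".join(s)]
    ready = sum(1 for s in survivors if s.endswith(pattern[0]))
    # You win this round only if your last flip matched the pattern's first
    # letter and the next flip (probability 1/2) supplies the second.
    return Fraction(ready, len(survivors)) / 2

for r in range(2, 8):
    hh, ht = round_odds("HH", r), round_odds("HT", r)
    print(f"round {r}:  HH {float(hh):.3f}   HT {float(ht):.3f}")
```

For round 6 this gives 5/26 ≈ 19.2% for HH (matching the 38.4% × 1/2 above) and 5/12 ≈ 41.7% for HT, whose odds keep climbing toward 50%.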
4 comments:
Thanks for that, Aaron. I used the Monty Hall example in my college algebra class today.
Mike W.
The Monty Hall problem has consistently troubled me. I've read the explanation before and I still think it doesn't hold up. Here's why I don't buy it.
Point In Time Ichiban: Three choices exist, one of which is correct. The odds of choosing the correct choice are one in three; the odds of choosing the incorrect choice are two in three.
Point in Time Niban: Two choices exist, one of which is correct. The odds of choosing the correct choice are one in two; the odds of choosing incorrectly are one in two.
I have two counterarguments, then, to the Monty Hall Theory.
Counterargument Ichiban: At point in time Number Two, the odds of choosing the right door and the odds of choosing the wrong door must add up to one, since those are the only possible outcomes. The Monty Hall Theory holds that choosing the other door has a 1/2 probability of being correct, while the original door retains a 1/3 probability of being correct. These probabilities add up to 5/6.
Counterargument Niban: At point in time Number Two, the new probabilities calculated—1/2 and 1/2, which I think we can all agree are accurate quantifications—represent two choices. Monty Hall argues that choosing the new door allows you to take advantage of this new, higher probability of rightness, but to stick with your original door still constitutes a choice. You have, in effect, made a new 1/2 decision; you just decided on the same door you chose in another, probabilistically-unrelated choice. In the fifty-fifty situation presented at point in time Number Two, the choice is between Door 1 and Door 2, not New Choice and No Choice as the Monty Hall Theory suggests.
Does this make sense? Am I like my friend Pete who, our freshman year in college, insisted that if you've flipped a coin heads nine times in a row, the odds of it coming up heads again are 1 in 2^10? Help me.
Dan, I will kindly refute your counterarguments:
You say "Two choices exist, one of which is correct. The odds of choosing the correct choice are one in two; the odds of choosing incorrectly are one in two."
Just because there are two choices doesn't mean they have equal probability. E.g., my brother and his wife live in a house. One person in the house is pregnant...
This assumption is exactly the logical flaw that keeps Monty Hall disbelievers disbelieving. At "Point in Time Niban", your original door has a 1/3 probability of "rightness" and the other door has a 2/3 probability of rightness. The denominator of your probability at every point in the problem, regardless of how many doors remain, must always be 3, representing the original 3 possible locations of the good door, before anybody started picking or eliminating doors.
Similarly, if I asked you to roll a die and perform actions based on the number showing, the probability of each action would have a 6 in the denominator. You start your calculations with the number of equally possible original scenarios.
Call the doors A, B and C, and let's say you picked A. We'll examine 3 cases, representing the possible locations of the good door, which have equal probability.
Case 1: A is the good door.
The all-knowing entity eliminates one of the bad doors (in this case the entity could pick either one; it doesn't matter). The door you originally picked is good and the other remaining one is bad. You should NOT switch.
Case 2: B is the good door.
The all-knowing entity has promised to eliminate a bad door from among the two you did not pick. Its only choice is C, since you picked A and it can't eliminate B, the good door. You are left with A and B and you should switch.
Case 3: C is the good door.
Same story as Case 2, with B and C now trading roles. You should switch.
Of the 3 cases, 2 say you should switch doors, so the other door has a 2/3 probability of rightness.
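The three cases above can be checked mechanically. A minimal sketch (door names as in the cases; when the entity has a free choice of bad door, which one it eliminates doesn't affect the outcome, so we just take the first):

```python
doors = ["A", "B", "C"]
pick = "A"  # as in the case analysis above

wins_by_switching = 0
for good in doors:  # the three equally likely cases
    # The entity eliminates a bad door from the two you didn't pick.
    eliminated = next(d for d in doors if d != pick and d != good)
    remaining = next(d for d in doors if d != pick and d != eliminated)
    if remaining == good:
        wins_by_switching += 1

print(f"switching wins in {wins_by_switching} of {len(doors)} cases")
# prints "switching wins in 2 of 3 cases"
```

No probability machinery needed: switching loses only in Case 1, where your original pick was already the good door.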
Notice, the wrong logic would proceed like so:
Case 1: You stay with your original choice
Case 2: You switch to the other door
That's where most people get hung up and calculate probabilities with 2 in the denominator. But that very assumption amounts to concluding that the doors are equal. The logic is circular: "They're equal because they're equal."
The whole trick of the Monty Hall Problem is that you have some information about the doors, but not all of it, which is hard to grasp since there's so little information to be had in the first place. The information you have is that the remaining door is probably good because the all-knowing entity chose to eliminate the other one.
If all that doesn't convince you, just grab a friend and keep data on a few trial runs. Unlike your friend Pete, you won't have to try it some 2^10 times to convince yourself of the results.
Let me see if this helps....
You have 3 choices: X, X, and O. You want an O.
On your first choice, the probability of picking the O is 1 in 3, whereas the probability of picking an X is 2 in 3.
So assume you picked an X, since the likelihood that you did is 66%. When the other X is revealed, it is in your best interest to switch doors, statistically speaking. If you keep your original door, there is only a 1 in 3 chance you picked the right door in the first place. So if you stay with your choice you have a 33% chance of winning; if you switch, you have a 66% chance of winning. It might seem like 50/50, but it isn't, because you never had two choices, and you still do not have two choices: there are always three.
The crucial point to understand and absorb is this: after you pick your first door, there will always be at least one X left to remove, regardless of the choice you made. So they are not eliminating an incorrect choice; they are just narrowing the field for you to choose from. But there are still two X's and only one O among the doors you originally chose from.