### Smart Loose Change Post of the Day

That's right, a smart post (actually several of them) at the Loose Change forum. Needless to say, they're made by debunkers in the Skeptics Forum. It starts out with the famed Monty Hall problem:

A game show host hides a prize behind one of three doors. The contestant has to guess which door hides the prize. The rules of the game are as follows.

Firstly, the contestant chooses a door and tells the host this is the one she thinks the prize is behind. The host must then open one of the other doors. Of course, the host does not want to reveal the whereabouts of the prize so he always opens a losing door.

The host then asks the contestant if she would like to stick with the door she originally chose or switch to the other unopened one.

Should she switch doors?

Of course, the answer, counterintuitively, is that she should indeed switch doors. Calcas and A Very Sly Denial do a terrific job of schooling PDoh:

For instance, let's take just one of your "statistically significant" events. You say that it's statistically significant that they managed to hit 75% of their targets, but you are assigning importance as if that was the intention. What if they had hit 100%? Then you'd say "what are the odds of hitting all of the targets!" What if they were completely thwarted? Then you'd say "what are the odds that the gov't could have stopped every attack?" You are assigning importance after the fact.

Exactly! For example, they say, "What were the odds that all 19 hijackers would get into the country?" But of course we know they tried and failed to get at least one more hijacker into the country.

## 34 Comments:

That is a famous example, my stats professor used it in class last year. In fact it is in his textbook.

Conspiracy theorists and logic, the twain will never meet.

For those who haven't already seen it, I strongly recommend Mike King's superb paper, "Good Science and 9-11 Demolition Theories." An outstanding job, one the liars will certainly ignore.

http://www.jnani.org/mrking/writings/911/king911.htm

"That is a famous example"

And I still think the Monty Hall problem is a bunch of crap. I know you're "supposed" to change doors, but I am of the opinion that it is of no relevance whether you do or not.

Yes, the door you chose had a 33% chance of being right when you picked it, and the other door has a 50% chance of being right now. But this ignores the fact that the door you initially chose ALSO has a 50% chance of being right, now that one door has been eliminated. It's a dependent event; other events in the game caused its probability to change. Comparing the probability of Door #1 now vs. Door #2 before the game started is an invalid comparison.

If you were playing seven-card stud (with no jokers), and you had three aces, your probability of being dealt a fourth ace would be one divided by the number of unknown cards. If that fourth ace is dealt face-up to another player, is your probability of making four of a kind the same? Of course not. But that's the way this problem wants you to think.

Put another way: suppose you didn't pick one of the three doors until after one of the doors was revealed not to be the winning prize. Would your choice matter? Of course not. So why does it matter when you made an arbitrary choice previously with less information?

I welcome anyone who wants to set me straight on this. But I warn you that some highly-educated people have tried and failed. :)

Hehe. I've argued the exact same point in the exact same manner before, and nobody seems to be able to explain just why the odds for one door should change, while the odds for the other door remain the same.

In other words, when I chose my initial door, the chances of ANY door being the right one are 1/3. Now, when door #3 is eliminated by the host, the "correct answer" wants me to assume that the odds of door #2 being the correct door have jumped to 2/3 while the odds of door #1 being the correct door have stayed the same. Why? It seems very arbitrary. You could just as easily argue that your original door now has a 2/3 chance of being the correct one, while door #2 remains at 1/3. Neither argument would make any sense. You only have two doors now, therefore the odds of either one being correct are 50%. The other explanation just doesn't make any sense.

This is from Wikipedia:

Once the host has opened a door, the car must be behind one of the two remaining doors. Since there is no way for the player to know which of these doors is the winning door, many people assume that each door has an equal probability and conclude that switching does not matter. However, as long as the host knows what is behind each door, always opens a door revealing a goat, and always makes the offer to switch, opening a losing door does not affect the probability of 1/3 that the car is behind the player's initially chosen door. As there remains only one other door, the probability that this door conceals the car must be 2/3 (1 - 1/3). The "equal probability" assumption, whilst being intuitively seductive, is actually incorrect; switching increases the chances of winning the car from 1/3 to 2/3.

"Yes, the door you chose had a 33% chance of being right when you picked it, and the other door has a 50% chance of being right now. But this ignores the fact that the door you initially chose ALSO has a 50% chance of being right, now that one door has been eliminated. It's a dependent event; other events in the game caused its probability to change."

No, his choosing a door does not affect the odds of your door; your door is 1/3 regardless. How can he change the odds of your door by simply revealing other information? That is really all this problem is: by revealing a door that doesn't have the prize, he is giving you more information than you had to begin with, so by using this information you increase your odds.

"Why? It seems very arbitrary. You could just as easily argue that your original door now has a 2/3 chance of being the correct one, while door #2 remains at 1/3."

The secret lies in the host's knowledge. He's going to open a non-car-granting door 100% of the time, according to the definition of the problem. AND he's going to open your door 0% of the time. It's that second part of the equation which tips the odds in the direction of the non-chosen door. Never ever will Mr. Hall say, "You chose door number 2, let's look behind that first. Oh, I'm sorry, but enjoy your turtle wax."

I had a tough time accepting this myself, even after the math had been demonstrated to my satisfaction. I'd encourage you to gin up a simulation and see how it works out. The thing to remember when doing it is that anytime a "random" choice indicates that your first-chosen door will be revealed first, that trial has to be thrown out.
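For anyone who wants to gin up that simulation, a minimal Python sketch might look like the following. It is an illustrative version, not the thread's original script; note that instead of throwing out trials where a random reveal would hit the player's door, the host here simply never picks that door, which is equivalent.

```python
import random

def play(switch, rng=random):
    """One round of the Monty Hall game; returns True if the player wins."""
    car = rng.randrange(3)
    pick = rng.randrange(3)
    # The host opens a door that is neither the player's pick nor the car.
    opened = rng.choice([d for d in range(3) if d != pick and d != car])
    if switch:
        # Switch to the single remaining unopened, unchosen door.
        pick = next(d for d in range(3) if d not in (pick, opened))
    return pick == car

trials = 100_000
stay_wins = sum(play(switch=False) for _ in range(trials))
switch_wins = sum(play(switch=True) for _ in range(trials))
print(f"stay:   {stay_wins / trials:.3f}")    # hovers around 1/3
print(f"switch: {switch_wins / trials:.3f}")  # hovers around 2/3
```

Run it a few times: the stay rate stays near one third and the switch rate near two thirds, never near 50/50.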

Triterope, I think I can clarify the Monty Hall problem. The key is that the host knows the correct door. If he doesn't, then your initial choice is no worse than any other.

What makes the problem so tricky is that it involves only three doors. Suppose it involved ten, or better still, a hundred doors. Now, after you've chosen, Monty opens ninety-eight doors and asks if you want to stick with the one you picked or switch. Obviously he is saying, in effect, you can keep your original door or EXCHANGE IT FOR THE OTHER NINETY-NINE. His opening the doors makes it confusing. If he left them closed and offered all 99 in return for the one you have, you'd jump to do it. Nevertheless, that's exactly what he's doing.

You see? You're getting the opportunity to take a 99% chance instead of a 1% chance. Somehow, this logic seems less clear when only three doors are involved, but the principle remains: you can keep your door for a 33% chance or switch for a 67% chance.

I hope that helps.

Pomeroo - I understood the Monty Hall problem quite well before your explanation, but I really liked your representation of it. I never thought about it that way before. Thanks!

"No, his choosing a door does not affect the odds of your door; your door is 1/3 regardless."

You're right. That is exactly what the Wikipedia entry said. What is changing is the odds for the other door. Because the host always knows what is behind each door and you know he is not going to open the winning door, his opening one door changes the odds for the other unchosen door to 2/3.

My pleasure, laviosier. I think that many people are misled by their intuitive sense that the odds for their original choice are fixed--which they are--and can't be improved by switching--which is wrong! If, as we both understand, the host doesn't know where the correct door is, no choice he offers is any better than your original selection. If he has knowledge, however, HIS knowledge helps YOU.

It's very counter-intuitive. I went and wrote a quick PHP script to test it out...and the funny thing is, as I was writing it, it dawned on me exactly why it works the way it does. I guess breaking it down into a programming language makes it easier to understand. Like I said, very counter-intuitive, but you guys are right. Anyone interested can check it here:

Script doing 500 permutations.

and the source code so you know I'm not cheating :)

Hit the refresh button as many times as you want, it's always the same (roughly) result. Switching would have given you the right result twice as many times as not switching.

Oh, and a hat tip to Manny for suggesting that we "gin up a simulation and see how it works out". Good advice.

Pomeroo,

Yes, thanks for the explanation.

It does work for the three doors too. If Monty doesn't open a door, but instead tells you, "OK, you picked door A, but you can right now trade for BOTH Doors B&C" I think anyone would trade. That is in effect what he is doing.

It doesn't matter if Monty knows where the car is or not.

I've never seen it so well explained. Thanks again.

triterope wrote:

"Yes, the door you chose had a 33% chance of being right when you picked it, and the other door has a 50% chance of being right now. But this ignores the fact that the door you initially chose ALSO has a 50% chance of being right, now that one door has been eliminated. It's a dependent event; other events in the game caused its probability to change."

Here's my take on adjusting your analysis. When the host opened the bogus door, he didn't give you any information about your door. From that point of view, all Monty did was confirm that at least one of the other doors held a bogus prize, but you already knew that.

I also recommend a simulation, but no one should let the computer programming scare them away. All it takes is pencil and paper, and not very much of that, to see why it works. I guarantee that anyone will be convinced by only as many trials as you could fit on the back of a business card.
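The back-of-a-business-card check really is just three cases. Written out in Python for completeness (an illustrative sketch of the pencil-and-paper tally):

```python
# Exhaustive check: fix the player's pick at door 0 and enumerate where the car is.
stay_total = switch_total = 0
for car in range(3):
    stay_wins = car == 0      # staying wins only when the first pick was right
    switch_wins = car != 0    # switching wins exactly when the first pick was wrong
    stay_total += stay_wins
    switch_total += switch_wins
    print(f"car behind door {car}: stay {'wins' if stay_wins else 'loses'}, "
          f"switch {'wins' if switch_wins else 'loses'}")
print(f"stay wins {stay_total}/3, switch wins {switch_total}/3")
```

Staying wins in one case out of three; switching wins in the other two.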

I've gotten to this late, and an excellent job has already been done.

This is very tough to explain to someone, especially if they haven't taken a lot of statistics. I usually use pomeroo's approach of expanding the question to many more doors because it makes it easier to understand, but I never thought of eliminating the "opening doors" part of it for simplification. That is a great suggestion, and one that I'll use in the future.

By the way, how many discussions of actual math like this do you see over at Loose Change? I don't mean the rhetorical "what are the chances?" which aims to slyly suggest a coincidence is a conspiracy, but someone who actually knows statistics dealing with questions of correlations/causations, data mining, compound probabilities, etc. In a sea of strong candidates, I always find their math to be some of the most obviously bad parts of their work.

Curt cameron slips into the same error that undid so many math professors who wrote Marilyn vos Savant to assure her that she was wrong. When Monty SELECTS (no, jujigatami, it makes a huge difference whether Monty knows the correct door or not) an incorrect door from a group of three, he gives you considerable information: he's telling you that your original pick is only half as likely to be the right one as the remaining door.

Again, think of a hundred doors. There are two sets: a larger one consisting of ninety-nine doors, all the wrong choices, and a smaller one, consisting of the single correct choice. Initially, you chose a random door, giving you a one-in-a-hundred chance of being right. Monty is allowing you to exchange your door for the other ninety-nine. As I said, opening ninety-eight of them just makes it a bit more confusing. He could just as easily leave them closed and give you the straightforward choice of one or ninety-nine.

If, after seeing ninety-eight wrong doors opened, you now imagine you have an even chance of holding the right door, well, try betting on the outcome. You'll soon notice that you're losing ninety-nine times for every time you win. Your original choice, clearly, had a much smaller chance of belonging to that set consisting of the one right door than the remaining door of a group of all the others.
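Rather than betting money on it, the hundred-door version can be bet on in software. A hypothetical Python sketch, generalizing to any number of doors where the host opens all but one of the unchosen ones:

```python
import random

def monty(n_doors, switch, trials=50_000, rng=random):
    """Win rate when the host opens every unchosen door except one."""
    wins = 0
    for _ in range(trials):
        car = rng.randrange(n_doors)
        pick = rng.randrange(n_doors)
        if switch:
            if pick != car:
                # The only unchosen door the host leaves closed is the car.
                pick = car
            else:
                # First pick was the car, so the leftover door is a goat.
                pick = rng.choice([d for d in range(n_doors) if d != car])
        wins += pick == car
    return wins / trials

print(monty(100, switch=False))  # close to 0.01
print(monty(100, switch=True))   # close to 0.99
```

With a hundred doors, sticking loses roughly ninety-nine times for every win, exactly as described above.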

Here's a cute little game that some of you will find easy, while others might enjoy.

Imagine that I have three two-sided cards. One is white on both sides; another is red on both sides; the third is white on one side and red on the other. I drop them into a hat and pull one out, concealing one side from both of us. I show you a red side, which is the only side I can see as well.

Now, I offer to lay you odds of 7-5 that the other side is also red. You think to yourself: Hmmm, there are only three cards. It can't possibly be the white-white card, so it must be either the red-red or the red-white. He's laying odds on a proposition that appears to be even money. Hell, yeah, I'll take that bet.

Did you act wisely in snapping up my offer?

This comment has been removed by a blog administrator.

I made an error in attempting to correct jujigatami. I had a sneaking suspicion that I read his post hastily and it turns out that I did. He is saying, correctly, that if Monty offers to let you take all the other doors in exchange for your original choice, it doesn't matter whether he knows the right door or not. I misunderstood him. He gets the idea.

Pomeroo, what you're describing with the red and white cards sounds like a Bayes Theorem problem. Is that correct? Is that how I should be looking at this whole Monty Hall thing?

This comment has been removed by a blog administrator.

Triterope, my hint for the three-card problem is, Don't overthink it--just count up the possibilities.

The Monty Hall problem can be summarized by stating that you should switch because you're giving up one door in return for several.

Well, the catch is that there are two red cards, but three red sides.

If we label the card sides as follows:

R1/R2 - red/red card

R3/W1 - red/white card

W2/W3 - white/white card

When you are shown a red side, it could be R1, R2, or R3. Both R1 and R2 have a red back, while R3 has a white back. So, given that a red side is showing, there is a 2/3 chance that its back is also red, making your 7-5 offer a bad bet for the player.
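That count can be checked mechanically. The sketch below (my own illustration, not triterope's working) enumerates the six equally likely (visible side, hidden side) outcomes:

```python
# The three cards, as (side A, side B) colour pairs.
cards = [("R", "R"), ("R", "W"), ("W", "W")]

# Each card can land either way up, giving six equally likely outcomes.
outcomes = [(shown, hidden)
            for a, b in cards
            for shown, hidden in ((a, b), (b, a))]

# Condition on seeing a red side, and look at what's on the back.
backs_given_red = [hidden for shown, hidden in outcomes if shown == "R"]
print(len(backs_given_red))                # 3 equally likely red sides
print(backs_given_red.count("R") / len(backs_given_red))  # 2/3 have a red back
```

The point of the enumeration is that you are counting red *sides*, not red cards: two of the three red sides belong to the red-red card.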

Well done, triterope!

OK, but how does that apply to the Monty Hall mystery? Is there some choice you're presented with that looks like 2 but is actually 3?

Because I still can't figure out why one probability changes, but not the other. Everyone's answers were very clear and well-written, and I understood them, I just can't seem to make the leap to apply it to the problem.

Let me ask this question: if you ALWAYS choose one of three doors, and Monty ALWAYS reveals one of the three doors as being the loser, why does your original decision matter? The way I see it, you're always going to be left with a 1/2 choice.

I've always been fascinated by this problem, but it's just soooo counterintuitive to me. And I'm the kind of guy who likes to walk up to video poker machines in Las Vegas and calculate the house edge in my head.

But I just can't fathom how you can be presented with two doors and have a 2/3 probability of succeeding. That does not compute. I acknowledge that I may simply be wrong, but the Monty Hall problem does not square with everything else I've learned about gambling and probabilities.

Triterope, it's clear enough that you're a smart guy. This counterintuitive problem tripped a number of math professors. Try this:

Instead of three doors, let's use ten. You pick a door. Monty opens eight doors, shows you eight goats, and asks if you want to switch to the door he didn't open. Now, think about it. Before he opened any doors, you KNEW there were at least eight goats behind the nine doors you didn't pick. So, showing you the goats tells you nothing you didn't already know. What he's doing, however, is GIVING YOU THOSE EIGHT DOORS IN ADDITION TO THE ONE HE DIDN'T OPEN. You see?

When you started, you had a one-in-ten chance of guessing the right door. He's allowing you to exchange it for a nine-in-ten chance. Again, pretend that he doesn't open any doors. He is simply offering all nine remaining doors in exchange for the one you chose.

Yep. As has been explained, imagine for a minute that the host isn't opening ANY doors. Instead it goes like this:

Host: Ok, pick a door.

You: I pick door #1.

Host: Ok, now, would you rather have door #1 OR door #2 AND #3?

Obviously if you're given the opportunity to pick two doors instead of just one, you'll jump at that chance, right? Well, that's all the question is really saying.

To put it another way, by making your initial choice, you're accepting the fact that there's a 2 in 3 chance that the prize is behind one of the doors which you DIDN'T pick. So consider them as two separate sets:

Set A is the doors you didn't pick, which have a 2/3 chance of containing the prize.

Set B is the door you've already picked, which has only a 1/3 chance of being the right one.

Now, the host knows which doors don't have the prize, so he can remove one of the invalid doors from Set A. Therefore, both Set A and Set B now have an equal number of doors...but the odds of either set being the winner haven't changed. Set A still has a 2/3 chance of being correct, while Set B still has only a 1/3 chance of being correct.

The part that makes it counter intuitive is that if a second player were to come in at this point and chose between the two remaining doors, HIS odds would actually be 50/50. That's why people get confused. Just to illustrate the point, I modified my original script, and you can see the new version here. I've got player 1 picking a door, then the host opens a door, and then player 2 picks one of the remaining doors. As you can tell, player two will win 50% of the time, while player 1 wins only 33% of the time. But if player 1 switched his door every time, he'd win 66% of the time. The secret lies in the fact that player 1 actually has the opportunity to pick two doors out of 3, while player 2 only gets to pick one door out of two.
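Since the modified script itself isn't reproduced in the thread, here is a hypothetical Python version of the same experiment: player 1 picks, the host opens a losing door, and player 2 then picks blindly between the two remaining doors.

```python
import random

def trial(rng):
    """One round; returns (p1 stays wins, p1 switches wins, blind p2 wins)."""
    car = rng.randrange(3)
    p1 = rng.randrange(3)
    # Host opens a door that is neither player 1's pick nor the car.
    opened = rng.choice([d for d in range(3) if d not in (p1, car)])
    remaining = [d for d in range(3) if d != opened]
    p1_switch = next(d for d in remaining if d != p1)
    p2 = rng.choice(remaining)  # player 2 chooses blind between the two doors
    return p1 == car, p1_switch == car, p2 == car

rng = random.Random()
n = 100_000
stay = switch = blind = 0
for _ in range(n):
    a, b, c = trial(rng)
    stay += a
    switch += b
    blind += c
print(f"player 1 stays:    {stay / n:.2f}")    # near 1/3
print(f"player 1 switches: {switch / n:.2f}")  # near 2/3
print(f"player 2 (blind):  {blind / n:.2f}")   # near 1/2
```

All three rates come out as described: the blind second player really does win half the time, while the switching first player wins two thirds of the time, because only the first player knows which of the two doors the host was forbidden to open.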

I hope that made things clearer, although somehow I get the feeling that I'm confusing the issue more than I'm helping you :)

By jove, I think I've got it. The missing link in my head was that MONTY REVEALING A LOSING DOOR DOESN'T TELL YOU ANYTHING ABOUT YOUR DOOR. (Sorry to use all caps, but finally getting through this impenetrable problem is a serious Eureka for me.)

He's not going to reveal your door, whether it is correct or incorrect. So you learn nothing about the rightness or wrongness of your choice when he reveals one of the losers. But you do learn that the non-picked door he doesn't reveal has a 1/2 chance of being the winner... while the chance of the original door stays at 1/3 because the revelation is not meaningful to the door you picked. Wow, I think I actually get it now.

But Monty's prior knowledge really is the key. If the first door is RANDOMLY chosen, and JUST HAPPENS not to be the winner, then the two remaining doors WOULD be 50/50. But because Monty makes an informed choice to reveal a losing door, it changes the odds of the non-selected non-revealed door. Quite the mindbender. I need to go drink a beer.

If any of you are still up to it, Cecil Adams has a discussion of the problem here, though he does venture off into some new territory.

(The Straight Dope rules.)

This comment has been removed by a blog administrator.

Triterope, I'm not trying to ruin your day, but you haven't quite got it yet. 1/3 added to 1/2 does not equal 1. The probabilities must add up to certainty. When Monty offers to let you switch to the door he didn't open, you're getting a 2/3 chance, not 1/2.

Again, the door you originally picked has a 1/3 chance of being the right one. The other two have a 2/3 chance. Whether he shows you a goat or not, those two doors have a 2/3 chance of containing the car. You already know that at least one goat is behind those two doors. You learned nothing when he displayed it. The choice is still, stick with your 1/3 chance or exchange it for a 2/3 chance.

"1/3 added to 1/2 does not equal 1. The probabilities must add up to certainty."

Understood.

"When Monty offers to let you switch to the door he didn't open, you're getting a 2/3 chance, not 1/2."

2/3 because he's giving you the door he revealed, plus the one he didn't reveal. Check.

By George, I think you've got it!

I think I do. Wow, I never understood the problem before, but I do now. Thanks to everyone who commented. If you're ever in Nebraska, I'll buy you a beer.
