Bugger
This doesn't happen often, so sit up straight and listen up. I was wrong.
There, done. Admitted. I was as wrong as a Michael Jackson-sponsored crèche. The answer to yesterday's problem (thanks to Wikipedia) is basically:
Do the player's odds of getting the car increase by switching?
I am happy to accept it - but it still feels wrong for some reason. As for my coin-throwing analogy, Wikipedia puts the boot into me there as well:
The solution
The answer to the problem is yes; the chance of winning the car is doubled when the player switches to another door rather than sticking with the original choice.
There are three possible scenarios, each with equal probability (1/3):
* The player picks goat number 1. The game host picks the other goat. Switching will win the car.
* The player picks goat number 2. The game host picks the other goat. Switching will win the car.
* The player picks the car. The game host picks either of the two goats. Switching will lose.
In the first two scenarios, the player wins by switching. The third scenario is the only one where the player wins by staying. Since two out of three scenarios win by switching and each scenario is equally likely, the odds of winning by switching are 2/3. In other words, a player who always switches will win the car on average two times out of three.
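A quick way to convince yourself is to simulate it. Here's a minimal Python sketch (the function name and trial count are my own, purely illustrative) that plays the game many times with a host who always reveals a goat; the stay and switch win rates come out near 1/3 and 2/3:

```python
import random

def play(switch):
    """Play one round of Monty Hall; return True if the player ends up with the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
print("stay:  ", sum(play(False) for _ in range(trials)) / trials)  # roughly 0.33
print("switch:", sum(play(True) for _ in range(trials)) / trials)   # roughly 0.67
```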
The problem would be different if there were no initial choice, or if the game host picked a door to open at random, or if the game host were permitted to make the offer to switch more often (or only) depending on knowledge of the player's original choice. Some statements of the problem, notably the one in Parade Magazine, do not explicitly exclude these possibilities. For example, if the game host only offers the opportunity to switch if the contestant originally chooses the car, the odds of winning by switching are 0%. In the problem as stated above, it is because the host must make the offer to switch and must reveal a goat that the player has a 2/3 chance of winning by switching.
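The biased-host variant mentioned above can be checked the same way. This sketch (again, the names and trial count are illustrative assumptions, not anything from the original problem) has the host offer a switch only when the player has already picked the car, and switching on an offer then never wins:

```python
import random

def biased_round():
    """One round where the host offers a switch ONLY if the first pick is the car.
    Returns None when no offer is made, else whether switching would win."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    if pick != car:
        return None                                                 # host stays silent
    opened = random.choice([d for d in doors if d != pick])          # host shows a goat
    switched = next(d for d in doors if d != pick and d != opened)   # the remaining door
    return switched == car

trials = 100_000
results = [r for r in (biased_round() for _ in range(trials)) if r is not None]
print("win rate when switching on an offer:", sum(results) / len(results))  # always 0.0
```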
Another way to see the solution: assuming you will switch, the only way to lose is to have originally picked the winning door (i.e. you initially bet that you'll find the prize; if you did pick the winning door, switching makes you lose). By switching, you essentially invert your chances from 1/3 to 2/3 (i.e. by switching you are actually betting that you did not choose the winning door in the first pick).
The most common objection to the solution is the idea that, for various reasons, the past can be ignored when assessing the probability. Thus, the first door choice and the host's reasoning about which door he opens are ignored. Because there are two doors to choose from, there is then a fifty-fifty chance of choosing the right one.

Apologies, KiwiGirl (damn that hurts!).
Although ignoring the past works fine for some games, like coin flipping, it doesn't work for all of them. The most notable counterexample is card counting in some card games, which lets players use information about past events to their advantage. Past information also helps in the Monty Hall problem.