Lol you guys are still going at it. The way I see it, once the game "stops", you need to create a completely new probability tree. Let's say the odds of dying from cancer this year are 1 in 1 million. Now let's say you get an early form of cancer. Are your chances of dying from cancer this year still 1 in 1 million? No. A new situation has arisen and you need to create a new set of probabilities.
You're hopeless. Do yourself a favor: go beg for change, then go to Walmart, buy two cups, find a penny somewhere, put it under one of them, and shuffle. Keep doing this and come back and tell us the probabilities. Maybe then a ray of hope will dawn on you and you may be able to illustrate your correct thinking without coming off as a complete A$$hole & bigot. The funny part is that even though I conceded to your initial A$$hole post, you itch to belittle me and validate yourself even further. The even funnier part is that you're a perfect example of why quants failed in August. Yet the funniest part is that you are too dense to even understand that the Monty Hall PARADOX is counterintuitive, and why. You may be able to plug in the numbers or memorize them, but you are obviously unable to understand the concept to save your life.
The condition you allude to, the host turning over a cup, in your view therefore changes the probabilities? As shown, I have adjusted for this by eliminating the probability associated with it from consideration and narrowing the universe of possibilities by a corresponding amount. The only criticism I can think of that you might be making is that I am wrong to carry over the relationship established between case A and case D in the previous decision tree into the reformulation of the problem. Is that what you are saying? That the previous decision tree has no useful info? But that's where the 1/3 vs. 2/3 relationship was established, even if it was subsequently modified.

And this despite your previous post where you said you would show me where I was wrong? This is disappointing. I went through the trouble of drawing the decision tree I was using, despite the prejudgmental tone of your earlier post, because I earnestly do not wish to walk around in ignorance on this matter if that is indeed the case. I did so at your request (more like a dare, really) because I remembered your handle from a previous thread where you seemed to be conversing with some aptitude, and thought that perhaps your claims to authority on this matter, unlike some of the others here who seem to merely be reciting from Wikipedia, might be justified. Now you refer me to the Wikipedia article? As I said, disappointing. By the way, on closer inspection, the Wiki article shows the disputed 1/6 used in one of the calculations. My point all along has been that using that ratio is justified in solving this problem.
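For what it's worth, here is a minimal sketch (Python, assuming the player has picked door 1 and that the host, when he has a choice, opens either remaining goat door with probability 1/2) that enumerates the joint probabilities. The disputed 1/6 shows up exactly for the two cases where the car is behind the player's own door, and conditioning on the door the host actually opens still leaves 1/3 for staying versus 2/3 for switching:

from fractions import Fraction

# Player picks door 1. Enumerate (car location, door the host opens) with exact
# fractions. The host never opens door 1 and never reveals the car; when the car
# is behind door 1 he opens door 2 or door 3 with probability 1/2 each.
joint = {}
for car in (1, 2, 3):
    p_car = Fraction(1, 3)
    if car == 1:
        joint[(1, 2)] = p_car * Fraction(1, 2)   # 1/6
        joint[(1, 3)] = p_car * Fraction(1, 2)   # 1/6
    elif car == 2:
        joint[(2, 3)] = p_car                    # 1/3, host is forced to open door 3
    else:
        joint[(3, 2)] = p_car                    # 1/3, host is forced to open door 2

# Condition on the host actually opening door 3.
opened = 3
total = sum(p for (c, door), p in joint.items() if door == opened)
for c in (1, 2):
    p = joint.get((c, opened), Fraction(0)) / total
    print(f"P(car behind door {c} | host opens door {opened}) = {p}")
# Prints 1/3 for staying with door 1 and 2/3 for switching to door 2.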
Wow, bigot, A$$hole... those are BIG words for someone like you who truly fails to understand ANYTHING about the problem at all. Not only do you fail to understand the correct solution, which has already been given to you... no, you don't even comprehend the question. Yes, the Monty Hall problem is counterintuitive to most, including you, BECAUSE you are unwilling/unable to update your views once new information arrives. This is why most would never make it as traders: it is inherent in most of us that we are unwilling to admit we were wrong once new information arrives that invalidates some of our earlier trades, and get out. And the same applies here. You are unwilling to update your earlier view despite the fact that you were wrong, and that enough information was given to you to prove you are wrong. You just fell prey to the Monty Hall paradox, lol. But you probably don't see that either...... ;-) Am I now also a bigot because I pointed out what you really are?
Here's the problem laid out for those having difficulty in understanding it:

Game 1: You choose Door 1.
Door 1: Goat (1/3 prob)   Door 2: Goat (1/3 prob)   Door 3: Car (1/3 prob)
Host turns over Door 2, leaving these doors:
Door 1: Goat (1/3 prob)   Door 3: Car (2/3 prob)
You switch and win.

Game 2: You choose Door 1.
Door 1: Goat (1/3 prob)   Door 2: Car (1/3 prob)   Door 3: Goat (1/3 prob)
Host turns over Door 3, leaving these doors:
Door 1: Goat (1/3 prob)   Door 2: Car (2/3 prob)
You switch and win.

Game 3: You choose Door 1.
Door 1: Car (1/3 prob)   Door 2: Goat (1/3 prob)   Door 3: Goat (1/3 prob)
Host turns over Door 2, leaving these doors:
Door 1: Car (1/3 prob)   Door 3: Goat (2/3 prob)
You switch and lose.

You won 2 out of 3 times by switching, for a 2/3 probability of winning if you switch.

Notice that your choice still only accounts for 1/3. Removing one door doesn't increase your odds to 50/50, because the probability that the car was behind Door 2 or Door 3 was 2/3. By removing Door 2, you know that Door 3 now has a 2/3 probability. You don't "reassign" the probability just because one door has been removed. Secondly, whether you know for a fact or not that the host "knew" the goat would be behind the door he opened, if the car isn't behind the door opened, you have a better probability of winning by switching.
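For anyone who would rather not hunt for cups and a penny, here is a minimal simulation sketch in Python (assuming the standard rules: the host always opens an unchosen door hiding a goat). Over enough trials, staying wins about 1/3 of the time and switching about 2/3:

import random

def monty(switch, trials=100_000):
    # Standard rules: the host always opens a door that the player did not pick
    # and that does not hide the car.
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay  :", monty(switch=False))   # ~0.33
print("switch:", monty(switch=True))    # ~0.67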
If the host knows the goat will be behind the door he opens, then switching is the best strategy (2/3 vs. 1/3); if he opens an unchosen door at random and it just happens to show a goat, it makes no difference (1/2 either way).
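And a sketch of that second case, assuming the ignorant host picks one of the two unchosen doors at random and we only count the games where he happens to reveal a goat. Conditioned on that, switching and staying each win about half the time:

import random

def ignorant_host(switch, trials=200_000):
    # The host opens one of the two unchosen doors at random; games where he
    # accidentally reveals the car are discarded, since the question assumes
    # a goat was shown.
    wins = played = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            continue  # car revealed; this game doesn't count
        played += 1
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / played

print("stay  :", ignorant_host(switch=False))   # ~0.50
print("switch:", ignorant_host(switch=True))    # ~0.50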
I think we have all discussed this question and its solution enough, and it should be clear to most (for those who still disagree, I don't think any further discussion will help). For those who are interested, here is another question, this time much more related to finance (because some claimed the previous question has nothing to do with finance, although I still think it does...). Anyway, here we go. I have a roulette game and I offer you a game for 55 dollars. However, the roulette is biased. Black comes up 60% of the time and red 40% of the time. Also, there is no zero, so 18 black and 18 red. You win 100 dollars if you play and the roulette comes up black, and you get nothing if it comes up red. a) Should you play, and if yes, which side? Do you want to pay 55 and play, or do you want me to pay you 55 and play myself, and why? b) If there were any arbitrage, how would you go about profiting from it? c) If you disagree with this being a fair game, what would be the price for a fair game and why? Any further questions? Let the ball roll.........
Yeah... I knew I was going to eat those words about 5 minutes after I posted them. But yes, you're correct. If you don't know what the host knows, then the possibility of the host opening the door with the car throws a wrench into the game.
I'll take a stab. Perhaps I'm thinking of this from the wrong perspective, but here's my take. Black comes up 60% of the time. Figuring a 99% confidence interval over 1,000 games, black should come up in roughly 55%-65% of those spins, which means red has a worst-case scenario (for me) of coming up about 45% of the time. So if I pay $55 per spin, the losing spins cost me at worst about 450 * $55 = $24,750, while the winning spins net me about 550 * ($100 - $55) = $24,750, so even in that worst case I roughly break even. At the expected 60%, I should walk away with about $5,000 net profit over the 1,000 spins. Of course I'm sure I'm missing something.
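For anyone who wants to sanity-check that confidence-interval reasoning, here is a rough sketch (Python, normal approximation to the binomial, using the $55 stake and $100 payout from the question; 2.576 is just the two-sided 99% normal quantile):

import math

n, p = 1000, 0.60          # spins, probability of black
stake, payout = 55, 100    # dollars paid per spin, dollars won on black

# Normal approximation: 99% interval for the number of blacks in n spins.
mean = n * p
sd = math.sqrt(n * p * (1 - p))
lo_blacks = mean - 2.576 * sd
hi_blacks = mean + 2.576 * sd
print(f"99% range for blacks: about {lo_blacks:.0f} to {hi_blacks:.0f} of {n}")

def net(blacks):
    # Total P&L for the player: pay the stake on every spin, collect the payout on blacks.
    return blacks * payout - n * stake

print("worst case :", round(net(lo_blacks)))   # ~ +$1,000 (close to break-even)
print("expected   :", round(net(mean)))        # +$5,000
print("best case  :", round(net(hi_blacks)))   # ~ +$9,000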
a) I want to pay 55 and play. I will win 60 out of 100 times for 60 * (100 - 55) = $2,700. I will lose 40 out of 100 times for 40 * 55 = $2,200. So I win $500 per 100 spins, or on average $5 per spin.
b) ?
c) The fair price to play is $60: solve 60 * (100 - x) = 40 * x, which gives x = 60. Win: 60 * (100 - 60) = $2,400. Lose: 40 * 60 = $2,400.
Joe.
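A tiny check of those numbers by simulation, for anyone who wants it (Python, 100,000 spins at the quoted terms):

import random

p_black, payout, price = 0.60, 100, 55
spins = 100_000

# Pay the price every spin, collect the payout whenever black comes up.
net = sum((payout if random.random() < p_black else 0) - price for _ in range(spins))
print("average P&L per spin:", net / spins)       # ~ +$5
print("fair price (EV = 0) :", p_black * payout)  # $60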