galanter wrote: Arson Smith wrote: This reminds me of a coin-flipping problem. The odds of flipping 'heads' 100 times in a row are verrry slim. Let's say you flip 99 heads in a row, and now you're freaking out, about to take that 100th coinflip.
You think and then say to yourself "WTF? This is a damn coin, and it either does 'heads' or 'tails', and I have a 50/50 shot at either on this 100th flip, regardless of my flippin' history!"
But that is incorrect - although it's true that you have defied some wicked odds to get 99 'heads' in a row, the probability that your 100th coinflip will be 'heads' is still pretty fucking slim, because then we're still talking about you satisfying the one case out of 1,267,650,600,228,229,401,496,703,205,376 (that's 2^100) where all 'heads' come up.
Uh Oh...
Assuming you are using an unbiased coin, after flipping 99 heads in a row the probability of flipping heads again is 1/2.
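To see galanter's point empirically, here's a quick Monte Carlo sketch (my own illustration, not from the thread). It uses a streak of 4 heads instead of 99 so that enough qualifying trials actually occur, but the independence argument is identical: among trials whose first flips are all heads, the next flip still comes up heads about half the time.

```python
import random

random.seed(42)

# Among trials whose first 4 flips are all heads,
# the 5th flip is still heads roughly half the time.
# (4 instead of 99 so qualifying trials actually occur.)
trials = 200_000
qualifying = 0
fifth_heads = 0
for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(5)]
    if all(flips[:4]):          # first 4 flips were heads
        qualifying += 1
        if flips[4]:            # ...and so was the 5th
            fifth_heads += 1

print(qualifying, fifth_heads / qualifying)  # ratio lands near 0.5
```

Run it with any seed: the conditional frequency hovers around 0.5, not around 1/2^5.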
No, and this is where I think clocker bob got off on the wrong foot with the Monty problem. He was saying that once Monty opens the goat door, the probability is "reset" to 50/50. So you're saying the coin, after 99 flips of 'heads', is still "reset" - whereas I still say that the odds of that 100th flip landing 'heads' have to be 1/2^100
Part of it is in the phrasing of the original set, I think.
Like in the Monty problem - pretend you didn't know how many doors there were... if someone said "Well, Monty may or may not have opened some unknown number of doors, and there are two left, now pick the non-goat," then you have no framework to operate from. But we know there are 3 doors, so we can talk about odds in terms of 1/3 and 2/3 probabilities.
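Since the Monty problem keeps coming up, here's a small simulation of it (again my own sketch, assuming the standard rules: the host always opens a goat door you didn't pick). It shows that switching wins about 2/3 of the time, which is exactly the 1/3 vs 2/3 framing above.

```python
import random

random.seed(0)

# Monty Hall: 3 doors, one car; host always opens a goat door
# that isn't your pick. Switching should win ~2/3 of the time.
trials = 100_000
switch_wins = 0
for _ in range(trials):
    car = random.randrange(3)
    pick = random.randrange(3)
    # Host opens a door that is neither the pick nor the car.
    opened = next(d for d in range(3) if d != pick and d != car)
    # Switching means taking the remaining unopened door.
    switched = next(d for d in range(3) if d != pick and d != opened)
    if switched == car:
        switch_wins += 1

print(switch_wins / trials)  # close to 2/3
```

Note the host's choice is deterministic here when he has two goat doors to pick from; that doesn't change the switching odds, since in that case switching loses either way.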
With the coins - if I just said I'd flipped a bunch of coins an unknown number of times, without tracking how many came up 'heads' or 'tails', and then told you I was about to flip again - then you would be correct to say the chance is 1/2.
But given that I set up the framework of 100 flips, and the past 99 were 'heads', it becomes way less likely that I should expect 'heads' again... because even if getting 99 out of 100 is astronomical luck, getting 100 out of 100 is even more astronomical luck, by a factor of 2.
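The "factor of 2" above is actually what reconciles the two positions, and exact arithmetic makes it plain (my own worked example, using Python's `fractions` for exactness). Before any flips, all 100 heads is a 1-in-2^100 event; but the conditional probability of the 100th flip being heads, given the first 99 already were, is the ratio of those two tiny numbers, and it collapses to 1/2:

```python
from fractions import Fraction

half = Fraction(1, 2)
p_100_heads = half ** 100   # before any flips: all 100 come up heads
p_99_heads = half ** 99     # before any flips: first 99 come up heads

# P(100th is heads | first 99 were heads) = P(all 100) / P(first 99)
conditional = p_100_heads / p_99_heads

print(p_100_heads.denominator)  # 1267650600228229401496703205376, i.e. 2**100
print(conditional)              # 1/2
```

So both numbers in the thread are right; they just answer different questions. 1/2^100 is the probability *before the run starts*; 1/2 is the probability *after 99 heads are already in the bank*.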
So... yeah - I think it just has to do with the given 'frame of reference' when you discuss probability.
I'll say it again: probability is weird.