I am posting this answer here because I have only just joined the Philosophy site on Stack Exchange and cannot yet answer there, even though I have posting and comment privileges on other Stack Exchange sites. The question is quoted below:
I have always been perplexed by a seeming paradox in probability that I'm sure has some simple, well-known explanation. We say that a "fair coin" or whatever has "no memory."
At each toss the odds are once again reset at 50:50. Hence the "gambler's fallacy." After 10 heads, the odds of another head are still said to be 50:50. The same after 20, 40, 80... heads.
Yet we also know that the series will converge toward an equilibrium of heads to tails. And indeed this can be observed by counting in fairly short order; the convergence appears pretty quickly.
How can both be true? Isn't there something in the physical series of tosses that "remembers"? Isn't there necessarily some slightly better chance of a tails after 10 heads?
How does logic resolve this absolute randomness in the particular events with a general law of convergence? I imagine this must be a well-known issue. I suppose it raises the larger issue of what sort of "causality" probability is.
Note that I do not know symbolic logic so, embarrassingly, formal demonstrations are beyond my ken.
There is a very simple answer, one that Marilyn vos Savant gave in her Parade magazine column years ago: each individual toss of a fair coin is 50/50, but those odds say nothing about the aggregate. The proportion of heads converges toward one half not because the coin "remembers" a run of heads and compensates with extra tails, but because any surplus of heads is diluted by the ever-growing number of subsequent tosses. If heads lead by 10 after 100 tosses, the coin never has to pay that lead back; dividing a lead of that size by 1,000 or 1,000,000 later tosses drives the overall proportion toward one half all on its own.
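A small simulation makes both halves of the answer concrete. The following is a minimal sketch in Python; the toss count, streak length, and variable names are my own illustration, not anything from the original question or column. It checks the probability of heads immediately after a run of heads (still about 50/50), and it prints the running proportion of heads, which drifts toward 0.5 without any compensating mechanism.

import random

random.seed(0)

N = 1_000_000
tosses = [random.randint(0, 1) for _ in range(N)]  # 1 = heads, 0 = tails

# (1) Memorylessness: after 5 consecutive heads, how often is the
#     next toss also heads? It stays close to 0.5.
streak_len = 5
after_streak = [
    tosses[i + streak_len]
    for i in range(N - streak_len)
    if all(tosses[i + j] == 1 for j in range(streak_len))
]
print("P(heads | 5 heads in a row) ~", sum(after_streak) / len(after_streak))

# (2) Convergence: the absolute surplus of heads over tails can wander,
#     but dividing by the growing number of tosses pulls the proportion
#     toward 0.5 (the law of large numbers), with no "memory" required.
heads = 0
for n, t in enumerate(tosses, start=1):
    heads += t
    if n in (100, 10_000, 1_000_000):
        print(f"after {n:>9} tosses: proportion of heads = {heads / n:.4f}")

Running it shows the two facts coexisting: the conditional probability after a streak stays near 0.5, while the running proportion settles near 0.5 simply because the denominator keeps growing.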