[Math] Does the likelihood of an event increase with the number of times it does not occur

probability

It would seem logical that the more times an event does not happen, the more likely it is to happen. For example, if a coin is flipped and lands on tails 10 times in a row, it would seem more likely that the next flip will result in heads.

One idea that seems to support this is the Infinite Monkey Theorem:
http://en.wikipedia.org/wiki/Infinite_monkey_theorem

It states that if some number of monkeys are left in a room with typewriters for an infinite amount of time, then they will eventually compose every written text ever produced. This seems to suggest that, since the chance of the monkeys writing a given work, say Shakespeare's Romeo and Juliet, is very low, the more times they fail to write it, the more likely they are to write it, until the chance becomes significant and the writing of the play finally happens.

However, another idea, the Gambler's Fallacy, states quite the opposite:
http://en.wikipedia.org/wiki/Gambler%27s_fallacy

It states that the chance of an event does not increase with the number of times it does not occur.

So what is the answer? Does the likelihood of an event go up the more times it does not happen, or does it stay the same? And if it does stay the same, then how does one explain the Infinite Monkey Theorem?

Best Answer

The Infinite Monkey Theorem does not suggest that the "more times they do not write it, the more likely they are to write it, until the chance becomes significant and it, the writing of the play, happens." Rather, what it says informally is that the longer they have been typing, the more likely they are to have written a given string. The monkey is just as likely to start the complete works of Shakespeare at keystroke 1 as at keystroke $10^{400,000}$. However, the longer the string of successive keystrokes, the more likely it is that any given substring can be found somewhere in it. Thus, for example, the complete works of Shakespeare are much more likely to be found in the string of the first $10^{400,000}$ keystrokes than in the string of the first $10^{300,000}$ keystrokes. That's because the former is $10^{100,000}$ times as long.
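
Here is a minimal sketch of the bound behind this, under the simplifying assumption (not part of the original answer) that each keystroke is drawn uniformly and independently from an alphabet of $k$ symbols and that the target text has length $L$. Split the first $n$ keystrokes into $\lfloor n/L \rfloor$ disjoint blocks of length $L$; each block matches the target with probability $k^{-L}$, independently of all the others, so

$$P(\text{target appears in the first } n \text{ keystrokes}) \;\ge\; 1 - \left(1 - k^{-L}\right)^{\lfloor n/L \rfloor} \;\longrightarrow\; 1 \quad \text{as } n \to \infty.$$

Note that the per-attempt probability $k^{-L}$ never changes, no matter how many earlier attempts have failed; only the number of independent attempts grows with $n$. That is exactly the distinction between the Infinite Monkey Theorem and the Gambler's Fallacy.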
