[Math] Why does {0,1,0,0,1,0,0,0,1,0,0,0,0,1…} diverge

sequences-and-series

My book says that this sequence diverges because it "takes on only two values, 0 and 1, and never stays arbitrarily close to either one for n sufficiently large." However, I don't understand why it cannot converge. My intuition says that as n grows, we reach points where there are more and more 0's before the next 1, since each repetition adds another zero. Eventually, wouldn't we have infinitely many zeros, so that the sequence converges to 0?

Best Answer

Eventually we will have infinite zeros

No, you will not. The number of zeros between each pair of ones gets bigger and bigger, but it is always finite. Note that each term of the sequence has only finitely many terms before it. So if some 1 came after infinitely many zeros, how many terms would have to precede that 1?
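To make this concrete, here is a small sketch (not from the answer itself) that builds the sequence block by block, with the k-th block being k zeros followed by a single 1. The positions of the 1's grow, but each 1 still sits at a finite index, separated from the previous one by a finite run of zeros:

```python
# Build the sequence 0,1,0,0,1,0,0,0,1,... as blocks of k zeros plus a 1.
def sequence(num_blocks):
    """Return the first num_blocks blocks of the sequence as a list."""
    terms = []
    for k in range(1, num_blocks + 1):
        terms.extend([0] * k)  # k zeros
        terms.append(1)        # followed by a single 1
    return terms

terms = sequence(5)
print(terms)   # [0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1]

# 1-indexed positions of the 1's: the gaps widen, but the 1's never stop.
ones = [n for n, t in enumerate(terms, start=1) if t == 1]
print(ones)    # [2, 5, 9, 14, 20]
```

However far out you go, another 1 eventually appears, so the sequence never "ends in zeros."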

Moreover, recall the definition of convergence for a real sequence:

A sequence $a_n$ converges to a point $a$ if for every $\epsilon>0$, there exists $N\in\mathbb N$ such that if $n\geq N$, then $|a_n-a|<\epsilon$.

So if $\epsilon=1/2$, how big must $N$ be?
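As a numeric sanity check (my own sketch, not part of the answer): beyond any index, the sequence still takes both values 0 and 1, and no single candidate limit $a$ can be within $1/2$ of both, since $|1-0|=1$. So for $\epsilon=1/2$, no $N$ works:

```python
# Build plenty of terms: for k = 1..100, k zeros followed by a 1.
terms = []
for k in range(1, 101):
    terms += [0] * k + [1]

def bad_indices(a, N):
    """1-indexed positions n >= N where |a_n - a| >= 1/2."""
    return [n for n, t in enumerate(terms, start=1)
            if n >= N and abs(t - a) >= 0.5]

# Whatever limit a we propose, and however large N is (here N = 1000),
# there remain terms at distance >= 1/2 from a, so convergence fails.
for a in (0, 1, 0.5):
    assert bad_indices(a, N=1000)  # nonempty list of violations
```

The same check would succeed for any larger $N$ as well, which is exactly why the definition of convergence cannot be satisfied.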
