[Math] Slick proof related to choosing points from an interval in order

alternative-proof, co.combinatorics, nt.number-theory

Choose a point anywhere in the unit interval $[0, 1]$. Now choose a second point from the same interval so that there is one point in each half, $[0, \frac12]$ and $[\frac12, 1]$. Now choose a third point so that there is one point in each third of the interval, and so on. How long can you keep choosing points like this?

In other words, what is the largest $n$ for which there exists a (finite) sequence $(a_i)_{i=1}^n$ such that for every $k \le n$, among the first $k$ points $(a_i)_{i=1}^k$ there is at least one in each of the intervals $[0, \frac1k], [\frac1k, \frac2k], \ldots, [\frac{k-1}k,1]$?
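Since each condition involves only a finite prefix, the property is easy to test directly. As a minimal sketch (my own illustration, not part of the original statement), here is a Python check of the defining property using exact rationals, so that boundary cases such as $a_i = \frac{j}{k}$ are decided exactly; the closed-interval convention matches the statement above.

```python
from fractions import Fraction

def is_valid(points):
    """Check that for every prefix length k, each of the k closed
    intervals [j/k, (j+1)/k] contains at least one of the first k points."""
    for k in range(1, len(points) + 1):
        prefix = points[:k]
        for j in range(k):
            lo, hi = Fraction(j, k), Fraction(j + 1, k)
            if not any(lo <= p <= hi for p in prefix):
                return False
    return True

print(is_valid([Fraction(2, 5), Fraction(4, 5), Fraction(1, 5)]))  # True
print(is_valid([Fraction(2, 5), Fraction(1, 5), Fraction(4, 5)]))  # False: at step 2, [1/2, 1] is empty
```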

My question

Is there a slick proof that $n$ is bounded? (Note that by compactness, the existence of such a sequence for all $n$ is equivalent to the existence of an infinite sequence with the same property.)

Backstory

I was recently re-reading old columns of Martin Gardner's, and he describes this problem in "Bulgarian solitaire and other seemingly endless tasks". (Original column 1983, available in his collections "The last recreations: …" and "The colossal book of mathematics".) He says that the problem first appeared in "One hundred problems in elementary mathematics" by Hugo Steinhaus, who shows there that a sequence of length 14 exists but that there is none of length 75. The first published proof that the longest sequence has length 17 was by Elwyn Berlekamp and Ron Graham in their paper "Irregularities in the distributions of finite sequences" in the Journal of Number Theory. This was followed by a shorter proof by Mieczysław Warmus in "A supplementary note on the irregularities of distributions" in the same journal.

Now, these proofs all use case analysis to one degree or another, and they vary in complexity; oddly, Warmus's proof of the optimal $n$ is the shortest. It is also not too hard to write a computer check oneself that finds the optimal $n$. However, because of the elegant nature of the problem, I feel there should be some "nice" proof, not of optimality, but simply of the fact that such a sequence cannot continue forever. Can anyone find one?
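To make the computer check concrete, here is one possible brute-force search, sketched in Python under an assumption I am adding: it suffices to try candidate points at the fractions $\frac{j}{k}$ with $k \le n$ and at the midpoints between consecutive such fractions, because whether a point meets a given test interval depends only on which of these cells it lies in. The function names are mine, and no attempt at efficiency is made.

```python
from fractions import Fraction

def candidates(n):
    """Candidate points for sequences of length n: the fractions j/k with
    k <= n, plus the midpoints between consecutive ones."""
    cuts = sorted({Fraction(j, k) for k in range(1, n + 1) for j in range(k + 1)})
    mids = [(a + b) / 2 for a, b in zip(cuts, cuts[1:])]
    return sorted(cuts + mids)

def step_ok(prefix, k):
    """Step-k condition with closed intervals, as in the question; for the
    half-open variant, change the right-hand `<=` to `<`."""
    for j in range(k):
        lo, hi = Fraction(j, k), Fraction(j + 1, k)
        if not any(lo <= p <= hi for p in prefix):
            return False
    return True

def extend(prefix, target, cands):
    """Depth-first search: grow the prefix to the target length.  Earlier
    step conditions stay satisfied, so only the newest one is checked."""
    k = len(prefix)
    if k == target:
        return list(prefix)
    for c in cands:
        prefix.append(c)
        if step_ok(prefix, k + 1):
            found = extend(prefix, target, cands)
            if found is not None:
                return found
        prefix.pop()
    return None

# Example: look for a sequence of length 6 (larger targets get slow).
print(extend([], 6, candidates(6)))
```

To look for the optimal $n$, one would increase the target until the search comes back empty; the interval convention only enters through the comparisons in `step_ok`.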

Technical note: The problem is usually stated with half-open intervals. I made them closed so that the compactness argument works. (Edit: The possibility that this changes the answer did occur to me. I assume it doesn't, and I'll check by computer soon. I am fine with answers for any kind of intervals: open, closed, or half-open.)

Best Answer

The proof given by Steinhaus (following A. Schinzel) is slick indeed. I'll paraphrase what he does:

At step $n$, the interval $(0,1)$ is divided into the $n$ intervals $I_0^{(n)}=(0,\frac{1}{n}), \ldots, I_{n-1}^{(n)}=(\frac{n-1}{n},1)$, and $\lfloor nP \rfloor$ is the index of the interval to which a point $P$ of the sequence belongs at step $n$.

He proceeds by proving the following claim:

If $P_1 \in (\frac{7}{35},\frac{8}{35})$ and $P_2 \in (\frac{9}{35}, \frac{10}{35})$, then there are arbitrarily large $n \in \mathbb{N}$ such that $\lfloor (n-1)P_1 \rfloor < \lfloor nP_1 \rfloor$, but $\lfloor (n-1)P_2 \rfloor = \lfloor nP_2 \rfloor$.
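As a quick sanity check of the claim (again my own illustration, not part of the answer), one can take, say, the midpoints $P_1 = \frac{15}{70}$ and $P_2 = \frac{19}{70}$ of the two intervals and search for such an $n$ beyond any given starting point:

```python
from fractions import Fraction
from math import floor

def witness_n(p1, p2, start=2):
    """Smallest n >= start at which p1's interval index increases while
    p2's stays the same, i.e. floor((n-1)*p1) < floor(n*p1) and
    floor((n-1)*p2) == floor(n*p2)."""
    n = start
    while True:
        if floor((n - 1) * p1) < floor(n * p1) and floor((n - 1) * p2) == floor(n * p2):
            return n
        n += 1

p1 = Fraction(15, 70)   # midpoint of (7/35, 8/35)
p2 = Fraction(19, 70)   # midpoint of (9/35, 10/35)
print(witness_n(p1, p2))             # a small witness
print(witness_n(p1, p2, start=100))  # witnesses also exist beyond any bound
```

For these particular rational values the jump pattern of both floors is periodic in $n$ with period 70, so the loop finds a witness in every period and always terminates.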

Let's see how this claim implies that there is no infinite such sequence. If the sequence were infinite, then at the latest at step $k=35$ there would be two points $P_1 \in (\frac{7}{35},\frac{8}{35})$ and $P_2 \in (\frac{9}{35},\frac{10}{35})$ among the first $k$ points, since these are two of the $35$ intervals of that step and each must contain a point. Now choose $n$ as in the claim with $n > k$.

Let $k_1:=\lfloor (n-1)P_1\rfloor$ and $k_2:=\lfloor (n-1)P_2\rfloor$. Then $P_i$ lies in $I_{k_i}^{(n-1)}$ at step $n-1$, and each of the $k_2-k_1-1$ intervals strictly between them contains at least one of the points $a_1, \ldots, a_{n-1}$, so there are at least $k_2-k_1-1$ further points in between. When passing from step $n-1$ to step $n$, the interval index of any point either stays the same or increases by one, since $\lfloor nP \rfloor - \lfloor (n-1)P \rfloor \in \{0,1\}$ for $P \in (0,1)$. By the choice of $n$, $P_1$ moves up to $I_{k_1+1}^{(n)}$ while $P_2$ stays in $I_{k_2}^{(n)}$, and the in-between points stay in intervals with indices between $k_1+1$ and $k_2$. So at least $k_2-k_1+1$ of the first $n$ points are crowded into the $k_2-k_1$ intervals $I_{k_1+1}^{(n)}, \ldots, I_{k_2}^{(n)}$; two of them "collide" in one interval, and since there are only $n$ points for the $n$ intervals of step $n$, some interval is then left without a point, contradicting the defining property.
