Probability – Likelihood of Random Number Being Larger Than Previous Numbers


Suppose we pick a number uniformly at random from the range $[a,b]$. Then we continue to pick more numbers from the same range. Let $n(t)$ be the number of times we have found a number bigger than any previously found, after sampling $t$ total numbers. The initial number picked is not counted.

Solve:
$$\lim_{t\to\infty}\sqrt[n(t)]{t}$$

For reference, the quantity I'm trying to compute is the geometric mean of the ratio between the time it takes to find each new bigger number and the time it took to find the previous one. So if new best values are found at the 4th, 11th, and 29th tries, I get:
$$\sqrt[3]{4\cdot\dfrac{11}{4}\cdot\dfrac{29}{11}}$$
The product telescopes to $\sqrt[3]{29}$, which has the same form as the expression in the limit above.

Experiments seem to indicate that the amount of time it takes to find the next number multiplies by about 3 for each number found, but I'm curious if there might be tools to solve this analytically. Interestingly, the value seems to be the same even if I take the randomly chosen number and plug it into a function (i.e. I'm checking whether $f(x)$ is greater than any previously seen $f(x)$ for random values of $x$).
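
For concreteness, here is the kind of experiment I mean, as a minimal Monte Carlo sketch in Python (the sample count, trial count, and names are arbitrary choices, not part of any particular method):

```python
import math
import random

def record_count(total_samples, rng):
    """Count draws that beat every previous draw; the first draw is not counted."""
    best = rng.random()                      # the initial number
    n = 0
    for _ in range(total_samples - 1):
        x = rng.random()
        if x > best:
            best = x
            n += 1
    return n

def estimate(total_samples=200_000, trials=50, seed=0):
    """Geometric mean, across trials, of total_samples ** (1 / n(total_samples))."""
    rng = random.Random(seed)
    logs = [math.log(total_samples) / record_count(total_samples, rng)
            for _ in range(trials)]
    return math.exp(sum(logs) / len(logs))

print(estimate())
# Typically prints a value noticeably above e (around 3 for this total_samples);
# the drift down toward e as total_samples grows is very slow, since n(t) is
# only about log(t), so 1/n(t) is biased upward at moderate t.
```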

Related questions:

  1. Is there any way to guess at the probability of finding a new biggest number?
  2. Can the geometric mean be shown to be a constant across all (or some) functions? Intuitively I feel like it should be.

I don't have too extensive a background in math, so I'm hoping this isn't an unsolved problem.

Best Answer

First let's consider a fixed $t$.

After we have picked $t$ numbers, we will almost surely have picked $t$ different real numbers, so we can assume that this is always the case without disturbing any probabilities or expectations.

Now, instead of picking $t$ numbers one by one, we could start by picking an unordered set of $t$ numbers, and then afterwards decide a random order to present them in.

Since the $t$ picked numbers are being presented in a random order (and it ought to be clear that no order is more probable than any other), the $t$th number is a new maximum exactly when the largest of the $t$ numbers happens to come last, which occurs with probability $\frac1{t}$. On the other hand, whatever the $t$th number turns out to be, the first $t-1$ numbers are still presented among themselves in a uniformly random order, so the $(t-1)$th number was a new maximum with probability $\frac 1{t-1}$, independently of whether the $t$th one is.

What this argument shows is that the $k$th number is a new maximum with probability $\frac 1k$, and that the events "new maximum at position $k$" for different positions are all independent.
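
If you want to see the $\frac1k$ claim numerically, here is a small sketch (assuming uniform draws on $[0,1)$; the sample sizes are arbitrary) that estimates the probability that the $k$th draw is a new maximum:

```python
import random

def record_frequencies(t=20, trials=200_000, seed=1):
    """Fraction of trials in which the k-th draw is a new maximum, for each k."""
    rng = random.Random(seed)
    hits = [0] * (t + 1)                 # hits[k] counts "k-th draw is a record"
    for _ in range(trials):
        best = float("-inf")
        for k in range(1, t + 1):
            x = rng.random()
            if x > best:
                best = x
                hits[k] += 1
    return {k: hits[k] / trials for k in (1, 2, 5, 10, 20) if k <= t}

print(record_frequencies())
# Expect roughly {1: 1.0, 2: 0.5, 5: 0.2, 10: 0.1, 20: 0.05}, i.e. 1/k for each k.
```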

So the expectation of $n(t)$ is simply $\sum_{k=1}^t \frac 1k$ (or $\sum_{k=2}^t \frac 1k$ if, as in the question, the first number is not counted; the constant difference does not affect the limit). For large $t$ this sum is approximately $\gamma+\log t$, where $\gamma \approx 0.5772$ is a constant known as the Euler–Mascheroni constant.
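
A quick numeric comparison of the partial harmonic sum with $\gamma+\log t$ (illustration only):

```python
import math

GAMMA = 0.5772156649015329   # Euler–Mascheroni constant

for t in (10, 1_000, 100_000):
    harmonic = sum(1.0 / k for k in range(1, t + 1))
    print(t, round(harmonic, 6), round(GAMMA + math.log(t), 6))
# The two values agree to within about 1/(2t), so E[n(t)] grows like log t.
```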

Finally, un-fixing $t$, your limit ought to be (insert handwavy appeal to the law of large numbers here): $$\lim_{t\to \infty} \sqrt[\gamma+\log t]{t} = \lim_{t\to\infty} t^{\frac1{\gamma+\log t}} = \lim_{t\to\infty} e^{\frac{\log t}{\gamma+\log t}} = e^1 = e$$
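
And a plain numeric check of that last expression (illustration only):

```python
import math

GAMMA = 0.5772156649015329   # Euler–Mascheroni constant

for t in (10**3, 10**6, 10**12, 10**24):
    print(t, t ** (1.0 / (GAMMA + math.log(t))))
# The printed values climb slowly toward math.e ≈ 2.71828, approaching from below.
```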
