Will these geometric means always converge to $1/e$?

analysis, complex-analysis, limits, number-theory, real-analysis

Let $p_n$ be the $n$-th prime and $F_n$ be the $n$-th Fibonacci number. We have

$$
\lim_{n \to \infty}\frac{(p_1 p_2 \ldots p_n)^{1/n}}{p_n}
= \lim_{n \to \infty}\frac{\{\log(F_3)\log(F_4)\ldots \log(F_n)\}^{1/n}}{\log(F_n)}
= \frac{1}{e}
$$

The first limit was proved by Sándor and Verroken using the prime number theorem and the Chebyshev function. The second limit was proved by Farhadian and Jakimczuk using Binet's formula for the Fibonacci numbers and Stirling's approximation for the factorial.
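Both limits are easy to check numerically. Below is a sketch in Python (the helper names `prime_ratio` and `fib_ratio` are my own); everything is computed in log space to avoid overflow, and the Fibonacci ratio uses the exponent $1/n$ exactly as in the statement, even though that product has only $n-2$ factors.

```python
import math

def prime_ratio(n):
    """(p_1 p_2 ... p_n)^(1/n) / p_n, computed in log space."""
    ps = []
    k = 2
    while len(ps) < n:  # simple trial division; fine for small n
        if all(k % p for p in ps if p * p <= k):
            ps.append(k)
        k += 1
    logs = [math.log(p) for p in ps]
    return math.exp(sum(logs) / n - logs[-1])

def fib_ratio(n):
    """{log(F_3) log(F_4) ... log(F_n)}^(1/n) / log(F_n), as in the question."""
    a, b = 1, 1  # F_1, F_2
    logs = []
    for _ in range(3, n + 1):
        a, b = b, a + b  # b runs through F_3, ..., F_n
        logs.append(math.log(math.log(b)))
    return math.exp(sum(logs) / n - logs[-1])
```

Both ratios drift toward $1/e \approx 0.3679$, though the prime ratio converges quite slowly.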

Although these two results were proved with different ingredients, their structure is strikingly similar, which led me to investigate whether a stronger phenomenon governs such results. My analysis led me to the following.

Claim: If $a_n= n^{1+o(1)}$ is increasing then, $ \lim_{n \to \infty}\dfrac{(a_1 a_2\ldots a_n)^{1/n}}{a_n} = \dfrac{1}{e}.$

I believe I have a proof of this using Weyl's equidistribution theorem, but I am looking for a simpler or more elementary proof. Also, have I got the conditions right for this claim to hold?
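As a numerical sanity check of the claim (a sketch; `geo_ratio` is my own helper), one can compare a fast case with a slow one:

```python
import math

def geo_ratio(a, n):
    """(a_1 a_2 ... a_n)^(1/n) / a_n, computed in log space."""
    logs = [math.log(a(k)) for k in range(1, n + 1)]
    return math.exp(sum(logs) / n - logs[-1])

# a_n = n satisfies a_n = n^{1+o(1)}; here the ratio is (n!)^{1/n}/n -> 1/e
r1 = geo_ratio(lambda k: k, 2000)

# a_n = n*log(n+1) is also n^{1+o(1)}, but the convergence is much slower
r2 = geo_ratio(lambda k: k * math.log(k + 1), 10**5)
```

For $a_n=n$ Stirling's formula gives convergence with error $O(\log(n)/n)$; for $a_n=n\log(n+1)$ the error decays only roughly like $1/\log(n)$, so even at $n=10^5$ the ratio is visibly off from $1/e$.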

Best Answer

The logarithm of the ratio is $\overline{\log(a_n)}-\log(a_n)$, where the overline denotes the arithmetic mean over the indices $1$ to $n$.

For $a_n=n^{1+\epsilon}$, it tends to

$$-(1+\epsilon),$$ so that a condition of the form $a_n=O(n^{1+\epsilon})$ is not the right one.

Generally speaking, there is no reason that $\overline{x_n}-x_n$ should tend to $-1$ for slowly growing $x_n$. Take $x_n=1-\dfrac1n$, for example: the difference tends to $0$, and the claim fails.
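Both observations can be checked numerically (a sketch; `avg_minus_last` is my own name):

```python
import math

def avg_minus_last(x, n):
    """(x_1 + x_2 + ... + x_n)/n - x_n."""
    vals = [x(k) for k in range(1, n + 1)]
    return sum(vals) / n - vals[-1]

eps = 0.5
# x_n = log(a_n) with a_n = n^{1+eps}: tends to -(1+eps), not -1
d1 = avg_minus_last(lambda k: (1 + eps) * math.log(k), 10**5)

# slowly growing x_n = 1 - 1/n: tends to 0, not -1
d2 = avg_minus_last(lambda k: 1 - 1 / k, 10**5)
```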


If I am right, a correct and simpler claim could be

$$\log(a_n)\sim\log(n)\implies\frac{(a_1\cdot a_2\cdots a_n)^{1/n}}{a_n}\to\frac 1e$$ or in other terms

$$\frac{\log(a_1)+\log(a_2)+\cdots+\log(a_n)}{n}-\log(a_n)\to-1.$$

This is obviously true for $a_n=n$, but we have to prove that lower-order additive terms in $\log(a_n)$ cannot introduce a finite bias.
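Indeed, for $a_n=n$ the limit is immediate from Stirling's approximation $\log(n!)=n\log(n)-n+O(\log(n))$:

$$\frac{\log(1)+\log(2)+\cdots+\log(n)}{n}-\log(n)=\frac{\log(n!)}{n}-\log(n)=-1+O\!\left(\frac{\log(n)}{n}\right)\to-1.$$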


Let us assume that

$$\log(a_n)=\log n+o(\log (n)).$$

Then

$$\overline{\log(a_n)}-\log(a_n)=\frac{(n\log n-n)+o(n)+n\,o(\log(n))}{n}-\log(n)-o(\log(n)) \\=-1+o(\log(n)),$$ so the error term is only $o(\log(n))$, which need not tend to $0$: with this estimate alone, the sequence could still diverge.


Anyway, I failed to find an additive term that actually produces such a divergence. For instance, with

$$\log(a_n)=\log(n)+\log(\log(n)),$$ we have, by the Euler-Maclaurin summation formula,

$$\sum_{k=2}^n\log(\log(k))\sim\int_2^n\log(\log(x))\,dx+\frac12\log(\log(n))\sim n\log(\log(n))-\frac n{\log(n)}+\frac12\log(\log(n)),$$

so that the residue is

$$\frac{-\dfrac n{\log(n)}+\dfrac12\log(\log(n))+\cdots}{n},$$ which tends to zero.
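This can be confirmed numerically (a sketch; `residue` is my own name): for $\log(a_k)=\log(k)+\log(\log(k))$ with $k\ge2$, the quantity $\overline{\log(a_n)}-\log(a_n)+1$ behaves like $-1/\log(n)$ and slowly tends to zero.

```python
import math

def residue(n):
    """Average of log(a_k) over k = 2..n (dividing by n, as in the question)
    minus log(a_n), plus 1, where log(a_k) = log(k) + log(log(k))."""
    vals = [math.log(k) + math.log(math.log(k)) for k in range(2, n + 1)]
    return sum(vals) / n - vals[-1] + 1
```

The residue is still about $-0.1$ at $n=10^5$, consistent with the predicted $-1/\log(n)$ rate of decay.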