[Math] Why believe the Elliott-Halberstam conjecture

analytic-number-theory

I have seen various conjectures in analytic number theory, e.g. the (generalized) Riemann hypothesis, the Chowla conjectures, etc., justified by heuristics in which the relevant arithmetic functions are modeled by random variables.

In the setting of primes in arithmetic progressions, these heuristics suggest that the error term $E(x;q,a)$ for primes in a particular primitive residue class $a \pmod{q}$ should be of size roughly $\sqrt{x}$ up to logarithmic factors, which is essentially the bound that GRH gives. If one sums this over $q$ up to $\sqrt{x}$, then the accumulated error is roughly of order $x$, and the Bombieri-Vinogradov theorem unconditionally establishes a bound of this nature.
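Concretely, writing $E(x;q,a)=\psi(x;q,a)-\frac{x}{\varphi(q)}$, say, GRH gives $E(x;q,a)\ll x^{1/2}(\log x)^2$ for each individual modulus, so summing trivially yields

$\displaystyle\sum_{q\leq \sqrt{x}}\max_{\gcd(a,q)=1}|E(x;q,a)|\ll x(\log x)^2,$

and Bombieri-Vinogradov unconditionally gives the comparable bound $x/(\log x)^A$ for moduli up to $\sqrt{x}/(\log x)^B$ (with $B$ depending on $A$).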

The Elliott-Halberstam conjecture asserts that we actually get an error bound of $\frac{x}{(\log x)^A}$ for any fixed $A$ if we sum over moduli all the way up to $x^{1-\epsilon}$. My question is whether there is a convincing heuristic model suggesting that this should be the case, since the standard model behind GRH doesn't seem to accommodate such large moduli.
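In symbols, one common formulation of the conjecture is that for every fixed $A>0$ and $\epsilon>0$,

$\displaystyle\sum_{q\leq x^{1-\epsilon}}\max_{\gcd(a,q)=1}\Big|\psi(x;q,a)-\frac{x}{\varphi(q)}\Big|\ll_{A,\epsilon}\frac{x}{(\log x)^A},$

whereas summing the pointwise GRH bound $x^{1/2}(\log x)^2$ over $q\leq x^{1-\epsilon}$ only gives $x^{3/2-\epsilon}(\log x)^2$, far larger than $x$ itself.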

Best Answer

The Elliott-Halberstam conjecture is an immediate consequence of Montgomery's (modified) conjecture, which proposes that if $\epsilon>0$, $\gcd(a,q)=1$ and $x>q^{1+\epsilon}$, then

$\Big|\displaystyle\sum_{\substack{n\leq x \\ n\equiv a\pmod{q}}}\Lambda(n) -\frac{x}{\varphi(q)} \Big|\ll_{\epsilon} x^{1/2+\epsilon}q^{-1/2+\epsilon},$

where $\Lambda$ is the von Mangoldt function.
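To make the implication explicit (a back-of-the-envelope check, writing the level of distribution as $x^{1-\delta}$; note that $q\leq x^{1-\delta}$ ensures $x>q^{1+\epsilon}$ whenever $\epsilon\leq\delta$): summing this bound over the moduli gives

$\displaystyle\sum_{q\leq x^{1-\delta}}x^{1/2+\epsilon}q^{-1/2+\epsilon}\ll x^{1/2+\epsilon}\cdot x^{(1-\delta)(1/2+\epsilon)}=x^{1+2\epsilon-\delta/2-\delta\epsilon},$

which is $\ll x^{1-\delta/4}$ once $\epsilon<\delta/8$, and in particular is $O_A\big(x/(\log x)^A\big)$ for every fixed $A$. Since the bound is uniform in $a$ with $\gcd(a,q)=1$, the maximum over residue classes in Elliott-Halberstam is harmless.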

For a very nice heuristic for why one might believe this, see the question "Heuristic for Montgomery's conjecture".
