[Math] How strong is this conjecture? $(Z/nZ)^*$ is generated by “small” elements

Tags: computational-complexity, gr.group-theory, nt.number-theory, reference-request, riemann-hypothesis

Conjecture: There are constants $c,k$ such that every $(Z/nZ)^*$ is generated by its elements smaller than $k (\log n)^c$.

Where $(Z/nZ)^*$ is the multiplicative group of integers mod $n$. My main question is: How "strong" is this conjecture relative to other unsolved conjectures? How "hard" do experts think this is to prove (or disprove)? Any references, summary of what is currently known, and further reading would definitely be appreciated too.
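For small $n$, the conjecture is easy to experiment with by brute force: find the smallest $y$ such that the units below $y$ generate all of $(Z/nZ)^*$. A rough sketch (the helper names here are my own, not from any library):

```python
from math import gcd

def generated_subgroup(n, gens):
    """Subgroup of (Z/nZ)* generated by gens, via closure under multiplication.

    Starting from 1 and repeatedly multiplying by the generators suffices,
    since in a finite group inverses come for free.
    """
    group = {1}
    frontier = {1}
    while frontier:
        new = set()
        for x in frontier:
            for g in gens:
                y = (x * g) % n
                if y not in group:
                    group.add(y)
                    new.add(y)
        frontier = new
    return group

def smallest_generating_bound(n):
    """Smallest y such that {a <= y : gcd(a, n) = 1} generates (Z/nZ)*."""
    full = {a for a in range(1, n) if gcd(a, n) == 1}
    gens = []
    for y in range(2, n):
        if gcd(y, n) == 1:
            gens.append(y)
            if generated_subgroup(n, gens) == full:
                return y
    return 1  # n <= 2: the group is trivial
```

For example, `smallest_generating_bound(7)` is 3, since the elements $\{1,2\}$ only generate the index-2 subgroup $\{1,2,4\}$ of $(Z/7Z)^*$. The conjecture asserts this quantity is $O((\log n)^c)$.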

I think this is interesting on its own, but my particular motivation comes from the Miller–Rabin probabilistic primality testing algorithm. If the above conjecture is true for any constant $c$, then we can "derandomize" Miller–Rabin into a deterministic, polynomial-time algorithm for primality testing (because if $n$ is composite, a set of generators for $(Z/nZ)^*$ must contain a "witness", so we need only check a polynomial-sized set to ensure a correct answer). This would not be a big breakthrough in that sense because we already have the AKS test, but would be interesting.

The Wikipedia page for Miller–Rabin notes that the Generalized Riemann Hypothesis implies the conjecture is true for $c=2,k=2$. But I'm curious if this conjecture seems e.g. as hard as the Riemann Hypothesis, or much weaker.
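To make the derandomization concrete: under GRH every composite $n$ has a Miller–Rabin witness below $2(\ln n)^2$, so checking all bases up to that bound gives a deterministic polynomial-time test. A minimal sketch (assuming the natural-log form of the bound):

```python
from math import log

def is_strong_probable_prime(n, a):
    """Miller-Rabin test of odd n > 2 to base a; True means a is not a witness."""
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(a, d, n)          # a^d mod n
    if x == 1 or x == n - 1:
        return True
    for _ in range(s - 1):    # square up to s-1 more times, looking for -1
        x = x * x % n
        if x == n - 1:
            return True
    return False

def is_prime_grh(n):
    """Deterministic Miller-Rabin, correct assuming GRH: check all bases
    up to 2*(ln n)^2."""
    if n < 2:
        return False
    for p in (2, 3):
        if n % p == 0:
            return n == p
    bound = min(n - 1, int(2 * log(n) ** 2))
    return all(is_strong_probable_prime(n, a) for a in range(2, bound + 1))
```

The point of the conjecture is that some polynomial bound on the witness set should hold unconditionally, not just under GRH.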

For those familiar with computational complexity, an interesting related conjecture would be that $(Z/nZ)^*$ has a set of generators computable in time $poly(\log n)$ (which in particular implies the set has size $poly(\log n)$). Any generic pseudorandom generator used to prove BPP=P would seem to "almost" prove this: it would say that there exists a small, deterministically produced set of inputs that suffice to run Miller–Rabin without false positives. So if this conjecture is, say, equivalent to RH, then it makes the search for such PRGs seem somewhat harder. Any thoughts on this would also be great.

Best Answer

This problem seems to lie quite deep. To illustrate this, consider the problem of estimating the least quadratic non-residue $\pmod p$ for a prime $p$ -- obviously if the numbers up to some point generate $({\Bbb Z}/p{\Bbb Z})^*$ then they must contain a quadratic non-residue, so this problem should be "easier." Vinogradov conjectured that the least quadratic non-residue is $\ll p^{\epsilon}$ for any $\epsilon > 0$, but this remains unknown. The best known result is that the least quadratic non-residue is $\ll p^{1/(4\sqrt{e})}$, which follows from Burgess's bound on character sums plus a multiplicative trick of Vinogradov. So unconditionally, one should not expect a result better than $n^{1/(4\sqrt{e})+\epsilon}$ for generating all of $({\Bbb Z}/n{\Bbb Z})^*$. And a result of Harman shows that the numbers up to $n^{1/(4\sqrt{e})+\epsilon}$ do in fact generate all of $({\Bbb Z}/n{\Bbb Z})^*$; see Theorem 3 of his paper.
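For concreteness, the least quadratic non-residue mod an odd prime is easy to compute with Euler's criterion ($a$ is a non-residue iff $a^{(p-1)/2} \equiv -1 \pmod p$), and empirically it grows far more slowly than the Burgess exponent $1/(4\sqrt{e}) \approx 0.152$ would allow. A quick sketch (the function name is mine):

```python
def least_qnr(p):
    """Least quadratic non-residue mod an odd prime p, by Euler's criterion:
    a is a non-residue iff a^((p-1)/2) == -1 (mod p)."""
    for a in range(2, p):
        if pow(a, (p - 1) // 2, p) == p - 1:
            return a
```

For instance `least_qnr(7)` is 3 (the squares mod 7 are 1, 2, 4). The least non-residue is always prime, since the residues form a subgroup and a composite non-residue would have a prime factor that is itself a non-residue.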

More remarks: To prove a bound of $(\log p)^{A}$ for the least quadratic non-residue seems to require some version of a quasi Riemann hypothesis: no zeros of Dirichlet $L$-functions to the right of $1-1/A$. A precise relation between zero-free regions and the least quadratic non-residue was first established by Rodosskii. One way of thinking about the group generated by the numbers below $y$ in $({\Bbb Z}/n{\Bbb Z})^*$ is to consider the distribution of $y$-smooth numbers below $x$ in arithmetic progressions $\pmod n$ -- this is the focus of Harman's paper referenced above, and for more work in this direction see Soundararajan and Harper. The ideas there would give further relations between zero-free regions and how large $y$ must be to generate $({\Bbb Z}/n{\Bbb Z})^*$ (along the lines of Rodosskii's work, which you can find referenced there).