Check if a large number is prime

prime numbers

By large numbers I mean numbers with roughly 512 (or potentially 1024 or 2048) significant bits. Numbers this large bring a multitude of complications. For example, in Python, which supports arbitrarily large integers, checking remainders up to the square root of such a number would take, by my calculation, a 60+ digit number of years. Even the superior performance of a language like Go only shaves a few digits off this immense runtime, and Go's built-in data types don't even include integers that large. Even if all the (2-3 billion) computers in the world worked together, it would still take a 50-digit (or so) number of years to prove a single 512-bit number prime.
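To make my estimate concrete, here is the back-of-envelope calculation (the throughput figure of $10^9$ trial divisions per second per machine is an assumption):

```python
import math

BITS = 512
OPS_PER_SECOND = 10**9          # assumed throughput of one machine
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Trial division must check divisors up to sqrt(N), i.e. about 2**(BITS/2).
divisions = 2 ** (BITS // 2)
years = divisions // (OPS_PER_SECOND * SECONDS_PER_YEAR)

print(len(str(years)))  # number of decimal digits in the year count -> 61
```

That is a 61-digit number of years for a single 512-bit candidate on one machine.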

Nonetheless, the security industry appears to have a copious supply of large prime numbers at its disposal for cryptographic applications, and the largest known prime number is 2^77,232,917-1, which just seems obscene.

How could the primality of such large numbers ever practically be proven?

Best Answer

This question is a bit too open-ended, but what you are overlooking is that primality can be proven by methods other than the naive check for factors.

For example, the method used for the prime $2^{77,232,917}-1$ is called the Lucas–Lehmer test. In fact, an even larger such prime is known today via the same test.

Proving this requires modular arithmetic, some group theory, and clever tricks. Roughly, a prime $p$ yields a larger multiplicative group $(\mathbb{Z}/p\mathbb{Z})^\times$ (of size $p-1$) than a composite number $c$ does with $(\mathbb{Z}/c\mathbb{Z})^\times$; you can find out more about this via Euler's totient function. By testing multiplicative orders of elements of the group $(\mathbb{Z}/N\mathbb{Z})^\times$, one can bound the size of the group, and thus test the primality of $N$. These types of tests, called Lucasian tests, usually take $O(\log^2 N)$ time, whereas the naive method you mention takes $O(\sqrt{N})$ time: a massive difference.
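A probabilistic cousin of this idea is the Miller–Rabin test. It is not one of the deterministic Lucasian tests described above, but it likewise probes orders of elements of $(\mathbb{Z}/N\mathbb{Z})^\times$ via fast modular exponentiation, so each round costs on the order of $\log^2 N$ rather than $\sqrt{N}$ work:

```python
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin: probe orders of random elements of (Z/nZ)^x."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13):
        if n % small == 0:
            return n == small
    # Write n - 1 = 2^r * d with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)              # fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a witnesses that n is composite
    return True

print(is_probable_prime(2**127 - 1))  # a known Mersenne prime -> True
```

A composite $n$ survives a round only with probability at most $1/4$, so a handful of rounds gives overwhelming confidence even for 512-bit inputs.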

For the Lucas–Lehmer test, one uses a slightly different ring: $(\mathbb{Z}/N\mathbb{Z})[X]/(f)$ for some irreducible quadratic polynomial $f$. The method is still $O(\log^2 N)$, and the specific ring chosen ensures that the multiplicative group associated to the ring has size divisible by (in the example of $2^{77,232,917}-1$) $2^{77,232,917}$. Because the group's size is made up of known small factors, the algorithm can be implemented well, both deterministically and heuristically.
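Despite the machinery behind it, the Lucas–Lehmer test itself is remarkably short. A minimal sketch for Mersenne numbers $M_p = 2^p - 1$ with $p$ an odd prime: start from $s = 4$, apply $s \mapsto s^2 - 2 \bmod M_p$ a total of $p - 2$ times, and $M_p$ is prime exactly when the result is $0$:

```python
def lucas_lehmer(p: int) -> bool:
    """Return True iff 2**p - 1 is prime, for an odd prime exponent p."""
    m = 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Exponents of the first few Mersenne primes beyond p = 2:
print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19]   (11 fails: 2^11 - 1 = 2047 = 23 * 89)
```

Each of the $p - 2$ iterations is one squaring modulo $M_p$, which is why record-sized Mersenne candidates can be tested in weeks rather than astronomical numbers of years.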
