I'm trying to prove the infinitude of prime numbers in a novel way.
Define a finite set of prime numbers
$$ P = \{p_1, p_2, …, p_n\} $$
Define a set $S$ consisting of all products of powers of the primes in $P$ (with exponents $a_i$ nonnegative, not all zero):
$$ S = \left\{ \prod_{i=1}^{n} p_i^{a_i} : a_i \in \mathbb{N} \right\} $$
Call an integer $G > 1$ a gap number if $G \notin S$.
To summarize
For any finite set of primes $P$ and its associated power-product set
$S$, the existence of a gap number $G \notin S$ recursively yields a
prime not in $P$, implying infinite primes.
Example
Let $P = \{2, 3\}$.
Then $S$ is the set of all products of powers of 2 and 3:
\begin{align*}
S &= \{2^1, 3^1, 2^2, 2^1 \times 3^1, 2^3, 3^2, 2^2 \times 3^1, 2^4, 2^1 \times 3^2, 2^3 \times 3^1, 3^3, \ldots\} \\
&= \{2, 3, 4, 6, 8, 9, 12, 16, 18, 24, 27, \ldots\}
\end{align*}
Pick any gap number, say $G = 14$.
The prime factorization of $G$ is $2 \times 7$, so $7$ is a new prime not in $P$.
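For concreteness, the example can be checked with a short script that enumerates $S$ for $P = \{2, 3\}$ up to a bound and lists the gap numbers (a sketch; `power_products` is an illustrative helper name, not part of the argument):

```python
# Enumerate S = {2^a * 3^b : a, b >= 0, not both zero} up to a limit,
# then list the gap numbers (integers > 1 not in S).

def power_products(primes, limit):
    """All products p1^a1 * ... * pn^an <= limit with nonnegative exponents."""
    values = {1}
    for p in primes:
        grown = set()
        for v in values:
            x = v
            while x <= limit:
                grown.add(x)
                x *= p
        values = grown
    return values

S = power_products([2, 3], 30)
gaps = [n for n in range(2, 31) if n not in S]

print(sorted(v for v in S if v > 1))  # [2, 3, 4, 6, 8, 9, 12, 16, 18, 24, 27]
print(gaps[:6])                       # [5, 7, 10, 11, 13, 14] -- 14 is a gap number
```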
Question
Is there a way to show that gap numbers always exist?
Idea
Assuming a gap number $G$ exists, either
- $G$ is prime, in which case $G \notin P$, or
- $G$ is composite, in which case its prime factorization must contain at least one prime not in $P$.
Recursively adding a new prime factor of $G$ to $P$ shows that there are infinitely many primes.
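The recursive step can be sketched in code: repeatedly find the smallest gap number for the current prime set, extract from it a prime factor not in $P$, and enlarge $P$. The function names below are illustrative, not from the argument itself.

```python
def smallest_prime_factor(n):
    """Smallest prime dividing n (n > 1), by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

def in_S(n, primes):
    """True if n is a product of powers of the given primes."""
    for p in primes:
        while n % p == 0:
            n //= p
    return n == 1

def grow(primes, steps):
    primes = list(primes)
    for _ in range(steps):
        # smallest gap number for the current prime set
        g = next(n for n in range(2, 10**6) if not in_S(n, primes))
        # strip off known primes, then take a prime factor of what remains
        for p in primes:
            while g % p == 0:
                g //= p
        primes.append(smallest_prime_factor(g))
    return primes

print(grow([2, 3], 4))  # [2, 3, 5, 7, 11, 13]
```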
UPDATE
I'd like to combine all these ideas into a single succinct proof. So, as a variation of Euclid's proof, is the following an accurate summary of the proof of Infinite Primes using Gap Numbers?
Proof
For a finite set of primes
$P = \{p_1, p_2, \dots, p_n\}$,
define the set $S = \left\{ \prod_{i=1}^{n} p_i^{a_i} : a_i \in \mathbb{N} \right\}$. The number of elements of $S$ up to $N$ grows like $O(\log^n N)^\dagger$, while $N$ itself grows linearly, so for large $N$ there exist gap numbers $G \notin S$. Each $G$ must have a prime divisor not in $P$. Thus, there are infinitely many primes. $\blacksquare$
$^\dagger$ Chaitin, G.
If there are more rigorous ways to express this proof or a reference to Chaitin, please feel free to leave an answer. Thanks!
Best Answer
Yes, there are a couple of ways to do this. One is to count the number of elements of $S$ less than or equal to some positive integer $N$ and show that for sufficiently large $N$ there are fewer than $N$ of them. In fact, if $\prod_{i=1}^n p_i^{a_i} \le N$ then $a_i \le \log_{p_i} N$, which means that there are at most
$$\prod_{i=1}^n \left( \lfloor \log_{p_i} N \rfloor + 1 \right)$$
elements of $S$ less than or equal to $N$, and this grows like $O(\log^n N)$, which is eventually much slower than $N$. I have heard this argument attributed to Chaitin. He phrases it "information-theoretically" as follows: if there were finitely many primes, then the prime factorization would be an extremely efficient way to compress a positive integer, in fact so efficient it's impossible. This argument can be used to prove a very bad version of the prime number theorem, along the lines that there must be at least $\frac{\log N}{\log \log N}$ primes $\le N$.
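The counting bound is easy to check numerically; the sketch below compares the number of elements of $S$ up to $N$ against $\prod_{i} \left( \lfloor \log_{p_i} N \rfloor + 1 \right)$ for $P = \{2, 3, 5\}$ and $N = 10^6$ (helper names are illustrative):

```python
# Count elements of S up to N and compare with the product bound and with N.

def count_S(primes, N):
    """Number of products of powers of `primes` that are <= N (including 1)."""
    values = {1}
    for p in primes:
        grown = set()
        for v in values:
            x = v
            while x <= N:
                grown.add(x)
                x *= p
        values = grown
    return len(values)

def ilog(N, p):
    """Largest k with p**k <= N (integer log, avoids float rounding)."""
    k, x = 0, p
    while x <= N:
        k += 1
        x *= p
    return k

primes, N = [2, 3, 5], 10**6
count = count_S(primes, N)
bound = 1
for p in primes:
    bound *= ilog(N, p) + 1

print(count, bound, N)  # count <= bound, and both are far below N
```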
Another way to do this is to show that the sum
$$\sum_{s \in S} \frac{1}{s}$$
converges, while the harmonic series $\sum_{n=1}^{\infty} \frac{1}{n}$ diverges, which also implies that $S$ can't be all of $\mathbb{N}$. I have heard this argument attributed to Euler (in a slightly different form). You can show more precisely that the sum above is the finite Euler product
$$\prod_{i=1}^n \frac{1}{1 - \frac{1}{p_i}}$$
whereas the harmonic series is, informally speaking, the infinite Euler product. This idea can be pushed a bit to show that the sum $\sum_i \frac{1}{p_i}$ of the reciprocals of the primes also diverges, which also shows that there are infinitely many primes (and this version of the argument is, I think, the one really due to Euler but I'm not sure).
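The convergence claim can also be checked numerically; the sketch below compares a partial sum of $\sum_{s \in S} \frac{1}{s}$ (for $P = \{2, 3, 5\}$, including the empty product $s = 1$) with the finite Euler product $\prod_{i} \left(1 - \frac{1}{p_i}\right)^{-1} = 2 \cdot \frac{3}{2} \cdot \frac{5}{4} = 3.75$:

```python
# Compare the partial sum of 1/s over S (P = {2, 3, 5}) with the finite
# Euler product prod 1/(1 - 1/p), which equals 3.75 for these primes.

primes, N = [2, 3, 5], 10**7

euler_product = 1.0
for p in primes:
    euler_product *= 1 / (1 - 1 / p)

# enumerate S up to N (including the empty product 1) and sum reciprocals
values = {1}
for p in primes:
    grown = set()
    for v in values:
        x = v
        while x <= N:
            grown.add(x)
            x *= p
    values = grown
partial = sum(1 / v for v in values)

print(euler_product)  # ~3.75 (up to float rounding)
print(partial)        # just below 3.75; the tail beyond N is tiny
```

The partial sums stay bounded by the product no matter how large $N$ gets, whereas harmonic partial sums grow without bound.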
It's not necessary to break into cases like this. Whether $G$ is prime or composite, it's divisible by at least one prime not in $P$ either way, by definition. Alternatively you can argue that the smallest gap number is prime.