$O(1)$ algorithm for Goldbach partitions, assuming $\pi(k)$ is known for all $0<k<2n$

computational-complexity, goldbachs-conjecture, number-theory

Instead of a list of prime numbers, we will be using $B_{p}(2n)$, a binary representation of the distribution of prime numbers from $3$ to $2n-3$ (this range makes the first and last positions pair up to $2n$). Writing $0$ for non-primes and $1$ for primes, we get something like this:
$$B_{p}(2n)=101010001010001010001$$
Now the algorithm:
$$ B_{g}(2n)=B_{p}(2n)+\operatorname{reverse}(B_{p}(2n)) $$
And that's it.
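For concreteness, here is a minimal PHP sketch of this one step, assuming the BCMath extension is available and that $B_{p}(2n)$ has already been built as a digit string; the helper name `goldbach_string` is mine, not from the original post:

```php
<?php
// Minimal sketch of the core step (assumes the BCMath extension).
// Every digit of B_p is 0 or 1, so digit sums never exceed 2 and
// bcadd() never carries: the "addition" is an exact digit-wise sum.
function goldbach_string(string $bp): string
{
    // B_p always starts with 1 (3 is prime), so the result keeps its length.
    return bcadd($bp, strrev($bp));
}

// The example below, with 2n = 26:
echo goldbach_string('101010001010001010001'); // 201020101020101020102
```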

Example (treating $B_{p}(2n)$ as if it were a base-$10$ number; here $2n=26$):

$$B_{g}(2n)=101010001010001010001+100010100010100010101=201020101020101020102$$

We now have $B_{g}(2n)$, where the $2$'s mark the Goldbach partitions: a $2$ at the position standing for $k$ means both $k$ and $2n-k$ are prime, and $k+(2n-k)=2n$.

Question: What would be the time complexity of the algorithm for finding:

A: The number of Goldbach partitions (count the occurrences of $2$ in $B_{g}(2n)$)?
B: The Goldbach partitions themselves (get the indexes of the $2$'s in $B_{g}(2n)$)? See the sketch after this list.
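To make A and B concrete, here is a hedged PHP sketch using the string functions the post mentions; the helper names `count_twos` and `partitions` are hypothetical, and the indexing follows the example above (0-based index $i$ stands for the number $i+3$):

```php
<?php
// A: substr_count() scans the whole string once. Note that an unordered
// pair {p, q} with p != q produces two 2's (one at each end of the
// string); p = q produces a single 2 in the middle.
function count_twos(string $bg): int
{
    return substr_count($bg, '2');
}

// B: walk the string with strpos(); a '2' at index $i gives the
// partition ($i + 3) + (2n - ($i + 3)) = 2n.
function partitions(string $bg, int $two_n): array
{
    $pairs = [];
    $pos = strpos($bg, '2');
    while ($pos !== false) {
        $p = $pos + 3;
        if ($p <= $two_n - $p) {        // keep each unordered pair once
            $pairs[] = [$p, $two_n - $p];
        }
        $pos = strpos($bg, '2', $pos + 1);
    }
    return $pairs;
}

// partitions('201020101020101020102', 26) => [[3, 23], [7, 19], [13, 13]]
```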

Tested results
I've been using this algorithm a lot lately while working on Goldbach's conjecture, and it is blazing fast: getting $g(500000000)$ takes about 1.5 seconds. In my case I was coding in PHP, which has the bcadd() function that handles addition for numbers of up to 2 billion digits, as well as strrev() to reverse the string and substr_count() to count the occurrences of $2$ in it, so there is no explicit loop in my code.

Moreover, $B_{p}(2n)$ is easier to compute than a list of primes, because it contains much less information. One could sieve over a big string "1111111111…", switching every non-prime position to $0$, as sketched below.
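A sketch of that sieving idea, under the same indexing as the example above (index $i$ stands for the number $i+3$); the helper name `build_bp` is mine:

```php
<?php
// Build B_p(2n) by sieving a string of '1's: every composite
// position is switched to '0'.
function build_bp(int $two_n): string
{
    $limit = $two_n - 3;                 // string covers 3 .. 2n - 3
    $bp = str_repeat('1', $limit - 2);
    for ($p = 2; $p * $p <= $limit; $p++) {
        for ($m = $p * $p; $m <= $limit; $m += $p) {
            $bp[$m - 3] = '0';           // cross out the composite $m
        }
    }
    return $bp;
}

// Example: build_bp(26) === '101010001010001010001' (the string above)
```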

Best Answer

Unless you have a way to compress your $B_p(2n)$, any algorithm that handles those long integers (from reversing, to addition, to scanning $B_g$ for a $2$) has time complexity at least $O(n)$, because it needs to actually access each of those $2n$ digits.

And that doesn't take into account how to actually get the $B_p$.
