You can obtain a complete heuristic picture of what is going on by supposing that quadratic residues (as suggested by Laurent) are the only likely obstruction to getting the next prime. More precisely, let $Q_n = \prod_{k \le n} q_k$ be the product of all of the primes discovered so far. Assuming Denis' first proposal, let $p_{n+1} = p$ be the first remaining prime such that $Q_n$ is a quadratic residue mod $p$. Or, assuming Denis' second proposal, let $p$ be the first remaining prime such that either $Q_n$ or $-Q_n$ is a quadratic residue mod $p$. (Thus it will simply be the next prime if that prime happens to be 3 mod 4.) Or, assuming that you only use the sum formula, you would only check $-Q_n$. Then experimentally, $q_{n+1} = p_{n+1}$ in all three cases.
This describes the sequence that Charles obtained, which I also obtained (at least the first half of it) with a Sage program:
```python
# Sage code: at each step, compare the quadratic-residue prediction
# with the smallest prime actually attained as A + Q/A or |A - Q/A|.
qseq = [2]
for z in range(16):
    attain = set()
    Q = prod(qseq)
    # Predicted next prime: the smallest unused p that is 3 mod 4,
    # or modulo which Q is a quadratic residue.
    prediction = [p for p in primes(500)
                  if p not in qseq and
                  (legendre_symbol(-1, p) == -1 or legendre_symbol(Q, p) == 1)][0]
    # Every value attainable as a sum or difference of complementary
    # divisors A and Q/A of Q.
    for S in Combinations(qseq):
        A = prod(S)
        attain.update([abs(A - Q/A), A + Q/A])
    attain.discard(1)
    # The next term is the smallest prime dividing any attainable value.
    best = min(min(factor(m))[0] for m in attain)
    print(prediction, ':', qseq, '->', best)
    qseq.append(best)
```
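For readers without Sage, the prediction rule alone can be checked in plain Python. This is a sketch under my reading of the second proposal; the helper names are mine, and `legendre` is implemented via Euler's criterion rather than Sage's `legendre_symbol`:

```python
def is_prime(n):
    """Naive trial-division primality test; fine for the small primes here."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def legendre(a, p):
    """Legendre symbol (a|p) for an odd prime p, by Euler's criterion."""
    a %= p
    if a == 0:
        return 0
    return 1 if pow(a, (p - 1) // 2, p) == 1 else -1

def predicted_next(qseq):
    """Smallest unused prime p with p = 3 (mod 4), or with
    Q = prod(qseq) a quadratic residue mod p (second proposal)."""
    Q = 1
    for q in qseq:
        Q *= q
    p = 2
    while True:
        p += 1
        if is_prime(p) and p not in qseq:
            if p % 4 == 3 or legendre(Q, p) == 1:
                return p

print(predicted_next([2, 3, 5]))  # 7, which is 3 mod 4, so it is not skipped
```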
Assuming both the sum and the difference are used, the validity of this prediction reduces to the following: if $p$ is this next prime, the question is whether $1$ appears in the set product of the subsets $\{q_k,1/q_k\}$ and $\{\pm 1\}$ in the multiplicative group $(\mathbb{Z}/p)^*$. This becomes a question in additive combinatorics if you take a formal logarithm and pass to the additive group $\mathbb{Z}/(p-1)$. A reasonable conjecture is that you are convolving many subsets that don't come close to lying in a subgroup of $\mathbb{Z}/(p-1)$, so you can expect $0$ to appear in the sumset. The heuristic has such a large margin that it may be possible, with existing methods, to prove that this always happens.
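Concretely, the question is whether $1$ lies in this product set, which is easy to check directly for small cases. A small Python illustration of my own (using the modular inverse `pow(q, -1, p)`, available since Python 3.8):

```python
def attainable_units(qseq, p):
    """Product set of {+1, -1} with the subsets {q_k, q_k^(-1)} in (Z/p)^*."""
    vals = {1, p - 1}                      # the subset {±1}
    for q in qseq:
        q_inv = pow(q, -1, p)              # modular inverse, Python 3.8+
        vals = {(v * x) % p for v in vals for x in (q % p, q_inv)}
    return vals

# For qseq = [2, 3, 5, 7] and p = 11: 2*3*5*7 = 210 = 1 mod 11,
# so 1 is attainable and 11 would not be skipped.
print(1 in attainable_units([2, 3, 5, 7], 11))  # True
```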
The other part of the question is how much you deviate from seeing the primes in order. Again, using a randomness heuristic in number theory, the "probability" that a prime $p$ will be skipped at a given stage is 1/2 (assuming that it is 1 mod 4 in the second case). So you expect the first skipped prime to eventually be included with a "half life" of one round. Thus you expect the sequence to only ever be barely ahead of the list of primes, and you expect $q_n$ to be the $n$th prime infinitely often.
An update to my earlier answer: I've written a proof of this "AC0 prime number conjecture" as a short paper, which can be found here: https://arxiv.org/abs/1103.4991
I thought a bit about establishing a nontrivial bound on the Fourier-Walsh coefficients $\hat{\mu}(S)$ for all sets $S$. My paper does this when $|S| < cn^{1/2}/\log n$ (here $S \subseteq \{1,\dots,n\}$). On the GRH it works for $|S| = O(n/\log n)$. I remarked before that the extreme case $S = \{1,\dots,n\}$ follows from work of Mauduit and Rivat.
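For readers unfamiliar with the notation (the normalization below is the standard one, stated here as my gloss rather than quoted from the paper): identify an integer $0 \le x < 2^n$ with its binary digit string $(x_1,\dots,x_n) \in \{0,1\}^n$; then for a function $f$ on $\{0,1\}^n$, such as the Möbius function $\mu$ read off from the digits,
$$\hat{f}(S) = \frac{1}{2^n}\sum_{x\in\{0,1\}^n} f(x)\prod_{i\in S}(-1)^{x_i}, \qquad S\subseteq\{1,\dots,n\}.$$
In particular the extreme case $S=\{1,\dots,n\}$ measures correlation with the parity of the digit sum, which is why the work of Mauduit and Rivat on the sum of digits applies there.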
I still believe that there is hope of proving such a bound in general, but this does seem to be pretty tough. At the very least one has to combine the work of Mauduit and Rivat with the material in my note above, and neither of these (especially the former) is that easy.
Best Answer
Let $q<Q$ be such that all its prime divisors are in $A$. We can write $q=q_0\times r_1^{n_1}\cdots r_s^{n_s}$, where all prime factors of $q_0$ are $<\log Q$ and $\log Q\leq r_1<\cdots<r_s\in A$. Clearly, $q$ is uniquely determined once we have fixed a) $q_0$, b) the set $\{r_1,\ldots,r_s\}$, and c) the integer vector $(n_1,\ldots,n_s)$.
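This decomposition is easy to carry out explicitly. Here is a plain-Python sketch (an illustration with naive trial division; the function name is mine):

```python
import math

def split_factorization(q, Q):
    """Write q = q0 * r1^n1 * ... * rs^ns, where q0 collects the prime
    powers with prime < log Q and log Q <= r1 < ... < rs."""
    threshold = math.log(Q)
    q0, large = 1, []
    n, d = q, 2
    while d * d <= n:
        if n % d == 0:
            e = 0
            while n % d == 0:
                n //= d
                e += 1
            if d < threshold:
                q0 *= d ** e
            else:
                large.append((d, e))
        d += 1
    if n > 1:                      # leftover prime factor
        if n < threshold:
            q0 *= n
        else:
            large.append((n, 1))
    return q0, large

# e.g. Q = 10^6 (log Q ~ 13.8) and q = 2^3 * 3 * 17^2 = 6936:
print(split_factorization(6936, 10**6))  # (24, [(17, 2)])
```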
The number of possible $q_0$ is at most the number of $q<Q$ which have all prime factors $<\log Q$. This is a well-studied quantity in analytic number theory (counting 'smooth' or 'friable' numbers), and is $\leq \exp(O(\log Q/\log\log Q))$; this can be shown by elementary methods (see any textbook on analytic number theory).
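For small $Q$ this quantity can simply be counted by brute force; a quick sanity check in plain Python (illustrative only, since the $\exp(O(\cdot))$ bound hides an unspecified constant):

```python
import math

def is_smooth(n, B):
    """True if every prime factor of n is < B."""
    for d in range(2, math.ceil(B)):
        while n % d == 0:
            n //= d
    return n == 1

Q = 1000                     # log Q ~ 6.9, so the 'small' primes are 2, 3, 5
count = sum(1 for q in range(1, Q) if is_smooth(q, math.log(Q)))
print(count)                 # 85 values of q < 1000 with all prime factors in {2, 3, 5}
```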
There are clearly at most $d(n,Q)$ choices for the set $\{r_1,\ldots,r_s\}$.
Finally, to estimate the number of choices for (c), note that $$\sum n_i\log r_i < \log Q,$$ and $\log r_i\gg \log \log Q$ by construction, hence it suffices to bound the number of sequences of positive integers $n_1,\ldots,n_s\geq 1$ (where $s$ may vary) such that $\sum n_i \ll \log Q/\log\log Q$. This is at most $\exp(O(\log Q/\log\log Q))$, using the elementary fact (simple combinatorics, 'stars and bars') that the number of sequences of positive integers $n_i$ with $\sum n_i \leq M$ is exactly $2^M$ (counting the empty sequence).
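The 'stars and bars' count can be verified directly for small $M$: the bijection sends $(n_1,\dots,n_s)$ to its set of partial sums, a subset of $\{1,\dots,M\}$. A brute-force check of my own:

```python
def compositions_up_to(M):
    """All sequences (n1, ..., ns) of positive integers, s >= 0,
    with n1 + ... + ns <= M (the empty sequence included)."""
    out = [()]
    def extend(prefix, remaining):
        for n in range(1, remaining + 1):
            seq = prefix + (n,)
            out.append(seq)
            extend(seq, remaining - n)
    extend((), M)
    return out

for M in range(1, 9):
    assert len(compositions_up_to(M)) == 2 ** M   # one sequence per subset of {1..M}
print(len(compositions_up_to(4)))  # 16
```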
Thus combining these three estimates, the number of possible $q$ is at most
$$ (\exp(O(\log Q/\log\log Q)))^2\times d(n,Q) \ll \exp(O(\log Q/\log\log Q))d(n,Q).$$