[Math] Better error bounds for partial sums of reciprocals of primes

analytic-number-theory, prime-numbers

One of Mertens' theorems gives that

$$\sum_{\substack{p \text{ prime} \\ p \leq k}} \frac{1}{p} - \log\log k = B + E(k)$$

where $B$ is a constant near $0.2615$ in value (the Mertens constant) and $E(k)$ is an error
term whose size is dominated by something close to $4/\log k$, when $k$
is large enough to make the sum meaningful.
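
For concreteness, one can tabulate $E(k)$ directly for small $k$. Here is a minimal Python sketch (using sympy's `primerange` for prime enumeration, and the standard decimal value of the Mertens constant for $B$):

```
from math import log
from sympy import primerange

B = 0.2614972128476428  # Mertens constant, rounded

def mertens_error(k):
    """E(k) = sum of 1/p over primes p <= k, minus log(log(k)) and B."""
    return sum(1.0 / p for p in primerange(2, k + 1)) - log(log(k)) - B

for k in (10**3, 10**4, 10**5):
    print(k, mertens_error(k), 4 / log(k))  # compare E(k) against 4/log(k)
```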

I want to work with partial sums of the above with $j \lt p \leq k$,
so that I can say the partial sum of the reciprocals of the primes
greater than $j$ and at most $k$ is $\log(\log k/\log j) + E(j,k)$,
where $j$ is not too small (perhaps $j \gt 3$ or $5$), $k$ is not too
large, say $j^\alpha \lt k \lt j^\beta$ where often
$1 \lt \alpha \lt \beta \lt e$, and $E(j,k)$ is
comfortably small. Unfortunately $4[1/\log k + 1/\log j]$ looks too big
for me; I am hoping to have (for large enough $j$) $E(j,k)$ bounded by
something that is $O(1/j)$ or better.
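
For what it's worth, subtracting two instances of Mertens' formula shows where such a bound comes from:

$$\sum_{j \lt p \leq k} \frac{1}{p} = \bigl(\log\log k + B + E(k)\bigr) - \bigl(\log\log j + B + E(j)\bigr) = \log\frac{\log k}{\log j} + E(k) - E(j),$$

so $E(j,k) = E(k) - E(j)$, and combining the triangle inequality with a bound of the shape $|E(x)| \leq 4/\log x$ yields the $4[1/\log k + 1/\log j]$ estimate above.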

I have access to Mark Villarino's treatment of Mertens' theorem. As
of 2005, it seems $E(k)$ is known to be no better than $O(1/\log^2 k)$.
I also hope to obtain recent work of Pintz and Diamond on oscillations
in a related formula, namely Mertens' product formula, but I do not
yet see how it will help me with this formula.

As I am still a tyro at number theory, I don't even know how realistic
my hopes are for $E(j,k)$ to be $O(1/j)$. Can someone who is familiar with
the recent literature give me a guidepost? Either references or
heuristics showing what sort of bounds to expect for $E(j,k)$, or even
for $E(k)$, would be welcome.

UPDATE 2012.07.24
I want to acknowledge the contributions of joro,
Christian Elsholtz, and Eric Naslund. joro and
Eric helped me realize that expecting $O(x^{-1/2})$
error even conditionally is expecting a bit much, and Christian
helped me realize that Dusart still has some nice
unconditional refinements. I will likely accept
Christian's answer, but not before I do some computations
of my own. In particular, Theorem 20 of Rosser and Schoenfeld's 1962
paper on functions relating to primes gives a nice
difference of $2/(x^{1/2}\log x)$, valid for
$1 \lt x \lt 10^8$, between the lower and upper estimates for
the sum of interest, and I may end up using or refining that
estimate, in combination with Dusart's results for larger $x$, for my
own nefarious purposes. I am hoping in particular that the
oscillations will be slow enough that my desired partial sums
from $j$ to $j^\alpha$ will have very small error.
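
As a first step toward those computations, here is a minimal Python sketch (again using sympy) of the empirical error in those partial sums; note that the main term $\log(\log j^\alpha / \log j)$ collapses to $\log\alpha$:

```
from math import log
from sympy import primerange

def partial_sum_error(j, alpha):
    """Sum of 1/p over primes j < p <= j**alpha, minus the main term log(alpha)."""
    k = int(j ** alpha)
    return sum(1.0 / p for p in primerange(j + 1, k + 1)) - log(alpha)

# How fast this shrinks as j grows is exactly the question at hand.
for j in (10**2, 10**3, 10**4):
    print(j, partial_sum_error(j, 1.5))
```
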
END UPDATE 2012.07.24

Gerhard "Ask Me About System Design" Paseman, 2012.07.19

Best Answer

In case you also want computationally explicit bounds, look at Theorem 6.10 of:

Pierre Dusart: Estimates of Some Functions Over Primes without R.H. http://arxiv.org/abs/1002.0442

The proof is probably more interesting for you than the theorem itself. On the one hand, it shows (as Eric did) the connection with the error term of the prime number theorem. On the other hand, with the constants detailed in Theorem 5.2 you can construct explicit bounds of the following type: if $x > x_k$, then $\left\vert \sum_{p < x} \frac{1}{p} - B - \log\log x \right\vert < \frac{\eta_k /k}{(\log x)^k} + \frac{\eta_k \left(1 + \frac{1}{k+1}\right)}{(\log x)^{k+1}}$.
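
For anyone who wants to evaluate this numerically, a small Python sketch of the right-hand side as a function; the constants $\eta_k$ and the validity thresholds $x_k$ are not reproduced here and must be taken from Theorem 5.2 of the paper:

```
from math import log

def dusart_rhs(x, k, eta_k):
    """Right-hand side of the displayed bound; valid only for x > x_k,
    with eta_k (and x_k) taken from Theorem 5.2 of Dusart's paper."""
    L = log(x)
    return (eta_k / k) / L**k + eta_k * (1 + 1.0 / (k + 1)) / L**(k + 1)
```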

The asymptotic bounds given by the error term of the prime number theorem are of course stronger. (Explicit bounds of the type that Eric mentioned exist, but are probably not useful for computations.)