I see a proof in https://arxiv.org/abs/1805.11965 (equation 3.36) that uses the following.
$\log x = \int_0^{\infty} ds \left(\frac{1}{1+s} - \frac{1}{s+x}\right)$.
This seems to hinge on $\int \frac{dx}{x} = \log_2 x$ (the context is information theory), as opposed to $\log_e x$. Why would this be true?
Best Answer
The notation in the paper is a little confusing: in the classical part, entropy is measured in bits and $\log$ denotes the base-2 logarithm. Part 3, however, deals with quantum entropy, and von Neumann's definition uses the natural logarithm. In fact, the unit of entropy based on the natural logarithm has a name: the nat (also nit or nepit); see e.g. en.wikipedia.org/wiki/Nat_(unit).
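Indeed, the integral identity only holds with the natural logarithm; a sketch of the verification from the antiderivative:

```latex
\int_0^{\infty}\left(\frac{1}{1+s}-\frac{1}{s+x}\right)ds
  = \lim_{R\to\infty}\left[\ln\frac{1+s}{s+x}\right]_0^{R}
  = \lim_{R\to\infty}\ln\frac{1+R}{R+x} \;-\; \ln\frac{1}{x}
  = 0 + \ln x .
```

With $\log_2$ in place of $\ln$ the antiderivative would pick up a spurious factor of $1/\ln 2$, so the identity would fail.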
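A quick numerical check also confirms the integral equals the natural log, not the base-2 log. This is only a sketch: the helper name `log_via_integral` is mine, and the substitution $s = t/(1-t)$ is used to map $[0,\infty)$ onto $[0,1)$ so the midpoint rule applies.

```python
import math

def log_via_integral(x, n=200_000):
    """Numerically evaluate ∫_0^∞ (1/(1+s) - 1/(s+x)) ds.

    Substitutes s = t/(1-t), ds = dt/(1-t)^2, then applies
    the midpoint rule on [0, 1). (Illustrative helper, not
    from the paper.)
    """
    total = 0.0
    h = 1.0 / n
    for i in range(n):
        t = (i + 0.5) * h           # midpoint of the i-th subinterval
        s = t / (1.0 - t)           # change of variables
        jac = 1.0 / (1.0 - t) ** 2  # Jacobian ds/dt
        total += (1.0 / (1.0 + s) - 1.0 / (s + x)) * jac
    return total * h

for x in (0.5, 2.0, 10.0):
    print(x, log_via_integral(x), math.log(x), math.log2(x))
```

The printed values match $\ln x$ and clearly disagree with $\log_2 x$, consistent with the answer above.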