Update: the following probabilistic argument I had posted earlier shows only that $p(1) + p(2) + \dots + p(n) = (c + o(1)) \log(n)$ and not, as originally claimed, the convergence $np(n) \to c$. Until a complete proof is available [edit: George has provided one in another answer], it is not clear whether $np(n)$ converges or has some oscillation, and at the moment there is evidence in both directions. Log-periodic or other slow oscillation is known to occur in some problems where the recursion accesses many previous terms. In fact, everything I can calculate about $np(n)$ is consistent with, and in some ways suggestive of, log-periodic fluctuations, with convergence being the special case where the bounds could somehow be strengthened and the fluctuation scale thus squeezed down to zero.
$p(n) \sim c/n$ is [edit: only in an averaged sense] equivalent to $p(1) + p(2) + \dots + p(n)$ being asymptotic to $c \log(n)$. The sum up to $p(n)$ is the expected time the walk spends in the interval $[1,n]$. For this quantity there is a simple probabilistic argument that explains (and can rigorously demonstrate) the asymptotics.
This Markov chain is a discrete approximation to a log-normal random walk. If $X$ is the position of the particle, $\log X$ behaves like a simple random walk with steps $\mu \pm \sigma$ where $\mu = 2 \log 2 - 1 = 1/c$ and $\sigma^2 = (1- \mu)^2/2$. This is true because the Markov chain is bounded between two easily analyzed random walks with continuous steps.
(Let $X$ be the position of the particle and $n$ the number of steps; the walk starts at $X=1$, $n=1$.)
Lower bound walk $L$: at each step, multiply $X$ by a uniform random number in $[1,2]$ and replace $n$ by $n+1$. $\log L$ increases, on average, by $\int_1^2 \log(t)\,dt = 2 \log(2) - 1$ at each step.
Upper bound walk $U$: at each step, jump from $X$ to a uniform random number in $[X+1,2X+1]$ and replace $n$ by $n+1$.
$L$ and $U$ have means and variances that are the same within $O(1/n)$, where the $O()$ constants can be made explicit. Steps of $L$ are i.i.d., and steps of $U$ are independent and asymptotically identically distributed. Thus, the central limit theorem shows that $\log X$ after $n$ steps is approximately a Gaussian with mean $n\mu + O(\log n)$ and variance $n\sigma^2 + O(\log n)$.
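The claimed drift of $\log X$ is easy to check numerically. Below is a minimal Monte Carlo sketch (not a proof): the step rules follow the definitions of $L$ and $U$ above, and all function names are illustrative.

```python
import math
import random

def drift(step, trials=2000, steps=400):
    """Average of log(X)/steps over many independent runs of a walk."""
    total = 0.0
    for _ in range(trials):
        x = 1.0
        for _ in range(steps):
            x = step(x)
        total += math.log(x) / steps
    return total / trials

random.seed(0)
mu = 2 * math.log(2) - 1  # predicted drift of log X per step, about 0.3863

# lower-bound walk L: multiply X by a uniform random number in [1, 2]
lower = drift(lambda x: x * random.uniform(1, 2))
# upper-bound walk U: jump from X to a uniform random number in [X+1, 2X+1]
upper = drift(lambda x: random.uniform(x + 1, 2 * x + 1))
```

Both empirical drifts land close to $\mu = 2\log 2 - 1$, consistent with the squeeze between the two bounding walks.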
The number of steps for the particle to escape the interval $[1,t]$ is therefore $(\log t)/\mu$, with fluctuations of size $A \sqrt{\log t}$ occurring with probability that decays rapidly in $A$ (bounded by $|A|^p \exp(-qA^2)$ for suitable constants). Thus, the sum $p(1) + p(2) + \dots + p(n)$ is asymptotically equivalent to $(\log n)/(2\log(2)-1)$.
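The escape-time asymptotics can also be sanity-checked numerically, here using the lower-bound walk $L$ as a stand-in for the chain (a hedged sketch under that substitution, not a proof; the $O(1)$ discrepancy is the expected overshoot at escape):

```python
import math
import random

def escape_steps(t, trials=2000):
    """Mean number of steps for the lower-bound walk L to leave [1, t]."""
    total = 0
    for _ in range(trials):
        x, n = 1.0, 0
        while x <= t:
            x *= random.uniform(1, 2)  # one step of L
            n += 1
        total += n
    return total / trials

random.seed(1)
mu = 2 * math.log(2) - 1
t = 1.0e6
predicted = math.log(t) / mu   # (log t)/mu, about 35.8 steps
observed = escape_steps(t)     # should agree with predicted up to O(1)
```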
Maybe this is equivalent to the 2003 argument from the conference. If the goal is to get a proof from the generating function, it suggests that dividing by $(1-x)$ may be useful for smoothing the $p(n)$'s.
The Euler sum $\sum_{n=1}^{\infty} \frac{H_{n}}{n^{q}}$, where $q$ is an odd positive integer greater than $1$, can also be evaluated using this approach. See here.
Using the integration representation $$H_{n} = \int_{0}^{1} \frac{1-t^{n}}{1-t} \, dt \ ,$$
we have
$$ \begin{align} \sum_{n=1}^{\infty} \frac{H_{n}}{n^{3}} &= \sum_{n=1}^{\infty} \frac{1}{n^{3}} \int_{0}^{1} \frac{1-t^{n}}{1-t} \, dt \\ &= \int_{0}^{1} \frac{1}{1-t} \sum_{n=1}^{\infty} \frac{1-t^{n}}{n^{3}} \, dt \\ &=\int_{0}^{1} \frac{\zeta(3) - \text{Li}_{3}(t)}{1-t} \, dt \tag{1} \\ &=- \Big(\zeta(3)-\text{Li}_{3}(t)\Big) \log(1-t) \Bigg|_{0}^{1} - \int^{1}_{0} \frac{ \text{Li}_{2}(t) \log(1-t)}{t} \, dt \\ &= -\int_{0}^{1} \frac{ \text{Li}_{2}(t) \log(1-t)}{t} \, dt \\ &= \int_{0}^{1} \text{Li}_{2}(t) \, d \big(\text{Li}_{2}(t)\big) \\ &= \frac{\big(\text{Li}_{2}(t)\big)^{2}}{2} \Bigg|^{1}_{0} \\ &= \frac{\zeta^{2}(2)}{2} \\ &= \frac{\pi^{4}}{72}. \end{align}$$
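The closed form $\pi^4/72$ can be checked against a partial sum (a quick numeric sketch; `euler_sum` is an illustrative helper, and the tail beyond $N$ terms is of order $(\log N)/N^2$):

```python
import math

def euler_sum(N):
    """Partial sum of sum_{n>=1} H_n / n^3."""
    h, s = 0.0, 0.0
    for n in range(1, N + 1):
        h += 1.0 / n      # running harmonic number H_n
        s += h / n ** 3
    return s

approx = euler_sum(200000)
target = math.pi ** 4 / 72   # about 1.082323... squared over 2, i.e. 1.352948...
```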
$(1)$ https://en.wikipedia.org/wiki/Polylogarithm
Best Answer
Let's rewrite the formula, using $C_1=1/2$, as $$S_n=\gamma+\log n-H_{n-1}$$ where $$S_n=\sum_{k=1}^\infty\frac{(k-1)!C_k}{n(n+1)\cdots(n+k-1)}.$$ Use the formula $$\frac{(k-1)!}{n(n+1)\cdots(n+k-1)}=\int_0^1 x^{n-1}(1-x)^{k-1}dx,$$ which is a special case of the beta integral. Therefore $$S_n=\int_0^1x^{n-1}\sum_{k=1}^\infty C_k(1-x)^{k-1}dx=\int_0^1\left(\frac{1}{1-x}+\frac{1}{\log x}\right)x^{n-1}dx.$$ When $n=1$ we get $$S_1=\int_0^1\left(\frac{1}{1-x}+\frac{1}{\log x}\right)dx=\gamma$$ by a well-known formula, which can be derived from the asymptotic expansion $$\int_t^\infty\frac{e^{-x}}{x}dx=-\log t-\gamma+O(t)$$ as $t\to0$ of the exponential integral.
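The evaluation $S_1=\gamma$ can be confirmed numerically. A small sketch: the integrand has only removable singularities (its limits are $1$ at $x\to0$ and $1/2$ at $x\to1$), so a midpoint rule, which never evaluates the endpoints, suffices.

```python
import math

def integrand(x):
    # 1/(1-x) + 1/log(x): the two singularities cancel; the limits are
    # 1 as x -> 0 and 1/2 as x -> 1
    return 1.0 / (1.0 - x) + 1.0 / math.log(x)

# midpoint rule on [0, 1] with N cells; endpoints are never evaluated
N = 200000
approx_gamma = sum(integrand((i + 0.5) / N) for i in range(N)) / N
```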
By induction it suffices to consider the difference $$S_n-S_{n+1}=\int_0^1\left(x^{n-1}+\frac{x^{n-1}-x^n}{\log x}\right)dx =\frac1n-\int_0^\infty\frac{e^{-ny}-e^{-(n+1)y}}{y}dy,$$ where the second term comes from substituting $x=e^{-y}$. Using the identity $$\int_0^\infty\frac{e^{-ay}-e^{-by}}{y}dy=\log\frac ba$$ for $0 < a < b$, which follows by integrating $e^{-xy}$ over the region $[a,b]\times[0,\infty)$, we get $$S_n-S_{n+1}=\frac1n-\log\frac{n+1}{n}.$$ The desired formula for $S_n$ now follows by induction.
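The Frullani-type identity used in the last step can likewise be checked numerically (a sketch: the integral is truncated at a finite upper limit, which is harmless because of the exponential decay of the integrand):

```python
import math

def frullani(a, b, N=400000, ymax=40.0):
    """Midpoint-rule value of the integral of (e^{-ay} - e^{-by})/y
    over [0, ymax]; the tail beyond ymax is exponentially small."""
    h = ymax / N
    total = 0.0
    for i in range(N):
        y = (i + 0.5) * h
        total += (math.exp(-a * y) - math.exp(-b * y)) / y * h
    return total

# the identity predicts log(b/a)
approx = frullani(1.0, 2.0)
target = math.log(2.0)
```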