The Ramanujan summation of the series $\sum_{n=1}^\infty \sqrt[n]{2}$

Tags: divergent-series, limits, ramanujan-summation

Ramanujan summation is a technique invented by the mathematician Srinivasa Ramanujan for assigning a value to divergent infinite series.

In my case, I'm interested in assigning a value to the divergent series

$$\sum_{n=1}^\infty f(n) \ \ \ \ \ \ \ \text{where}\ \ \ \ f(n)=\sqrt[n]{2}$$

According to the Wikipedia page (and my understanding), the Ramanujan summation is

$$\sum_{n=1}^\mathfrak{R} f(n)=\lim_{N\to\infty}\Bigg[\sum_{n=1}^N f(n)-\int_{1}^N f(t)dt\Bigg]$$
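(A quick sanity check of this definition: applied to the harmonic series $f(n)=\frac1n$ it gives

$$\sum_{n=1}^\mathfrak{R} \frac1n=\lim_{N\to\infty}\Bigg[\sum_{n=1}^N \frac1n-\int_{1}^N \frac{dt}{t}\Bigg]=\lim_{N\to\infty}\big[H_N-\ln N\big]=\gamma,$$

the Euler-Mascheroni constant, which is the value usually quoted for the Ramanujan sum of the harmonic series.)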

Thus

$$\sum_{n=1}^\mathfrak{R} \sqrt[n]{2}=\lim_{N\to\infty}\Bigg[\sum_{n=1}^N \sqrt[n]{2}-\int_{1}^N \sqrt[t]{2}dt\Bigg]$$

Taking the antiderivative

$$\sum_{n=1}^\mathfrak{R} \sqrt[n]{2}=\lim_{N\to\infty}\Bigg[\sum_{n=1}^N \sqrt[n]{2}-\Bigg(\ln2\Big(\text{li}(2)-\text{Ei}\big(\tfrac{\ln2}{N}\big)\Big)+N\sqrt[N]{2}-2\Bigg)\Bigg]$$
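As a quick numerical sanity check of this antiderivative, here is a small Pari/GP sketch (my own, assuming a helper Ei summed from its power series for $x>0$ rather than any built-in):

    \\ Pari/GP: check the closed form for the integral against direct quadrature
    \\ (Ei is not assumed as a built-in; for x>0 it is summed from its power series)
    default(realprecision, 30);
    b = log(2);
    Ei(x) = Euler + log(x) + suminf(k=1, x^k/(k*k!));
    li2 = Ei(b);                             \\ li(2) = Ei(log 2)
    N = 50;
    closedform = b*(li2 - Ei(b/N)) + N*exp(b/N) - 2;
    direct     = intnum(t=1, N, exp(b/t));   \\ exp(b/t) = 2^(1/t)
    print(closedform - direct);              \\ ~ 0 up to working precision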

Moving some constants outside the limit

$$\sum_{n=1}^\mathfrak{R} \sqrt[n]{2}=2-\ln2\cdot\text{li}(2)+\lim_{N\to\infty}\Bigg[\sum_{n=1}^N \sqrt[n]{2}-\Bigg(N\sqrt[N]{2}-\ln2\cdot\text{Ei}\big(\tfrac{\ln2}{N}\big)\Bigg)\Bigg]$$

It's at this point that I'm unsure how to proceed: I'm not confident about what the limit converges to. From my computational estimates up to $N=10^8$, I find that

$$\sum_{n=1}^\mathfrak{R} \sqrt[n]{2}\approx1.6$$

But due to floating-point errors or slow convergence, the estimate deviates enough that I'm not confident about any further digits.

I'd like to know if this converges at all, and if it does, is there a (reasonably) closed form / relation to other constants?

Best Answer

I've not seen that definition of Ramanujan summation (your first formula) before.
But for testing I tried your third formula $$ \sum_{n=1}^\mathfrak{R} \sqrt[n]{2}=\lim_{N\to\infty}\Bigg[\sum_{n=1}^N \sqrt[n]{2}-\int_{1}^N \sqrt[t]{2}\,dt\Bigg] \tag 3 $$ implemented in Pari/GP at high precision.

First, using W|A I got the following expression for the integral; writing $ß=\log(2)$,
$$ \int \exp\Big( \frac{ß}{t}\Big)\, dt = t\exp\Big(\frac{ß}{t}\Big) - ß\,\text{Ei}\Big(\frac{ß}{t}\Big) + \text{const} \tag {4.1} $$
and hence
$$ \begin{align}I(N)&= \int_{t=1}^{N} \exp\Big( \frac{ß}{t}\Big)\, dt \\ &= \Big(N\exp\big(\tfrac{ß}{N}\big) - ß\,\text{Ei}\big(\tfrac{ß}{N}\big)\Big)-\Big(1\cdot\exp\big(\tfrac{ß}{1}\big) - ß\,\text{Ei}\big(\tfrac{ß}{1}\big)\Big) \\ &=N\exp\big(\tfrac{ß}{N}\big) -2 - ß\,\big(\text{Ei}\big(\tfrac{ß}{N}\big)- \text{Ei}(ß)\big) \end{align} \tag {4.2}$$
With this, already for moderately large $N$ I get close to the value you found. Writing
$$K(N) = \sum_{k=1}^N \exp\Big( \frac{ß}{k}\Big) \tag{4.3}$$
and
$$S(N) = K(N) - I(N) \tag {4.4}$$
I obtain the following values:

      N   S(N)
    100   1.60585777814
   1000   1.60273245539
  10000   1.60242047744
 100000   1.60238928520
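A minimal Pari/GP sketch of this direct computation, reconstructed from (4.2)-(4.4) (not the original script; the Ei helper is again summed from its power series), reproduces the table above:

    \\ Pari/GP: direct evaluation of S(N) = K(N) - I(N), per (4.2)-(4.4)
    default(realprecision, 30);
    b = log(2);
    Ei(x) = Euler + log(x) + suminf(k=1, x^k/(k*k!));   \\ power series, x > 0
    K(N) = sum(k=1, N, exp(b/k));                       \\ (4.3)
    I(N) = N*exp(b/N) - 2 - b*(Ei(b/N) - Ei(b));        \\ (4.2)
    S(N) = K(N) - I(N);                                 \\ (4.4)
    for(e=2, 5, print(10^e, "   ", S(10^e)));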

It seems to converge, and to the value you gave in your OP. But carrying the direct summation in $K(N)$ to even higher $N$ (to get more accuracy) is time-consuming, so I rewrite the series in a form that Pari/GP can evaluate easily even when $N$ is in the billions. The idea is a rewriting as a double series, where each term $\sqrt[k]{2}$ is expanded into its exponential series in $\frac{ß}{k}$:

$$ \begin{array} {lllll} \text{lhs} & =\text{rhs1} & +\text{rhs2} \\ \hline \exp(ß/1) & = 1+ß/1 & +ß^2/1^2/2! &+ß^3/1^3/3! & + \cdots \\ \exp(ß/2) & = 1+ß/2 & +ß^2/2^2/2! &+ß^3/2^3/3! & + \cdots \\ \exp(ß/3) & = 1+ß/3 & +ß^2/3^2/2! &+ß^3/3^3/3! & + \cdots \\ \vdots & \vdots \\ \exp(ß/N) & = 1+ß/N & +ß^2/N^2/2! &+ß^3/N^3/3! & + \cdots \\ \vdots & \vdots \\ \end{array} \tag 5$$ Looking at the column sums, we see that the first two columns ($\text{rhs1}$) give divergent series, while the remaining columns ($\text{rhs2}$) are convergent. So we work in two parts. The $\text{rhs2}$ part can be evaluated immediately in the limit $N \to \infty$ as a sum of zeta values: $$ \text{RHS2}(\infty) = ß^2/2!\; \zeta(2) + ß^3/3!\; \zeta(3) + ß^4/4!\; \zeta(4) + \dots \approx 0.473841903568 \tag {5.1} $$ Since the column sums in $\text{rhs1}$ diverge, we instead write their $N$'th partial sums explicitly in terms of $N$ and of the harmonic numbers $H(N)=\psi(1+N)+\gamma$ (or H(N)=psi(1+N)+Euler in Pari/GP): $\text{RHS1}(N)=N+ ß \cdot H(N)$. So, up to an $O(1/N)$ error coming from the $k>N$ tails of the $\text{rhs2}$ columns (which vanishes in the limit), $$ \begin{array} {lll} K(N) &= N + ß \cdot H(N) &+ 0.473841903568\\ I(N) &= N\exp\big(\tfrac{ß}{N}\big) -2 &- ß\, \big(\text{Ei}\big(\tfrac{ß}{N}\big)- \text{Ei}(ß)\big) \\ S(N) &= K(N)&- I(N) \end{array} \tag {6}$$

giving convergence to the value

    N      S(N)            difference S(N_{k+1}) - S(N_k)
 1000      1.60297258955
 10^5      1.60239168746   -0.000580902092269
 10^6      1.60238640626   -0.00000528119790414
 10^12     1.60238581946   -0.000000586799480429
 10^24     1.60238581946   -5.86800097238 E-13
 10^48     1.60238581946   -5.86800097239 E-25
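The accelerated formula of eq. (6) is also easy to script; here is a Pari/GP sketch (again not the original code, with the Ei helper summed from its series) that reproduces the limiting digits:

    \\ Pari/GP: accelerated evaluation via eq. (6)
    default(realprecision, 30);
    b = log(2);
    Ei(x) = Euler + log(x) + suminf(k=1, x^k/(k*k!));   \\ power series, x > 0
    C = suminf(m=2, b^m/m!*zeta(m));                    \\ RHS2(oo) = 0.473841903568...
    H(N) = psi(1+N) + Euler;                            \\ harmonic number
    K(N) = N + b*H(N) + C;
    I(N) = N*exp(b/N) - 2 - b*(Ei(b/N) - Ei(b));
    S(N) = K(N) - I(N);
    print(S(10^12));                                    \\ 1.60238581946...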

So I think your own approximation was heading in the right direction.


P.S.: In eq. (6) the divergence can be removed by cancellation. The exponential in $I(N)$ can be expanded as $N\exp\big(\tfrac{ß}{N}\big)=N\big(1+\tfrac{ß}{N}+O(1/N^2)\big)=N + ß + O(1/N)$, and this divergent $N$ cancels against the $N$ in $K(N)$. Next, the harmonic number $H(N)$ can be replaced by $\log(N)+\gamma$ for $N \to \infty$ and combined with the $\text{Ei}()$-expression in $I(N)$, using $$ \text{Ei}(ß/N) + \log(N) = \gamma + \log(\log(2)) + O(1/N), $$ which leads to the shortened version of eq. (6): $$ \begin{array} {lll} \lim_{N \to \infty} S(N) &= & ß \cdot (\log(N)+\gamma) + 0.473841903568\\ && - \big( ß + O(1/N) -2 - ß\,(\text{Ei}\big(\tfrac{ß}{N}\big)- \text{Ei}(ß))\big) \\ &=& ß\gamma -ß+2 -ß\,\text{Ei}(ß) + 0.473841903568\\ && + ß\, \big(\text{Ei}\big(\tfrac{ß}{N}\big)+\log(N)\big) + O(1/N) \\ &=& 2+ß\,\big(2\gamma + \log(ß) -1 -\text{Ei}(ß)\big) + 0.473841903568\\ &=& 1.60238581946 \end{array} \tag {6a}$$ where in the second-to-last line the divergent terms have cancelled and a constant expression remains.
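Since $\text{Ei}(ß)=\text{li}(2)$, the constant in (6a) can also be written as $2+\ln 2\,\big(2\gamma+\ln\ln 2-1-\text{li}(2)\big)+\sum_{m\ge 2}\frac{(\ln 2)^m}{m!}\zeta(m)$; a short Pari/GP check (a sketch, with Ei summed from its series as before):

    \\ Pari/GP: evaluate the closed form from (6a)
    default(realprecision, 30);
    b = log(2);
    Ei(x) = Euler + log(x) + suminf(k=1, x^k/(k*k!));   \\ Ei(b) = li(2)
    C = suminf(m=2, b^m/m!*zeta(m));
    print( 2 + b*(2*Euler + log(b) - 1 - Ei(b)) + C );  \\ 1.60238581946...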
