The series $\sum\limits_{n=1}^\infty\frac1{S_n}$ diverges almost surely, where $S_n=X_1+\cdots+X_n$ for an i.i.d. sequence $(X_n)$ distributed like $X$, with $P(X\geqslant k)=\frac1k$ for every positive integer $k$.
The same result holds for every i.i.d. sequence $(X_n)$ of positive random variables distributed like $X$ such that $x\cdot P(X\geqslant x)\leqslant c$ for some finite $c$ and every $x$ large enough.
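As a sanity check, this distribution is easy to simulate: if $U$ is uniform on $(0,1]$, then $X=\lfloor 1/U\rfloor$ satisfies $P(X\geqslant k)=P(U\leqslant\frac1k)=\frac1k$ for every positive integer $k$. Here is a minimal NumPy sketch of this construction (the sampler and the sample size are illustrative choices, not part of the statement):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(size):
    # Inverse transform: for U uniform on (0, 1] and integer k >= 1,
    # P(floor(1/U) >= k) = P(U <= 1/k) = 1/k.
    u = 1.0 - rng.random(size)  # uniform on (0, 1], avoids division by zero
    return np.floor(1.0 / u)

x = sample_X(10**6)
print((x >= 10).mean())  # empirical P(X >= 10), close to 0.1
```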
A proof of this uses the degenerate convergence criterion (part 1.), which provides the convergence in probability of a scaled version of the partial sums (part 2.), and a lemma which allows one to deduce the almost sure divergence of the series from this convergence in probability (part 3.). All this suggests a natural conjecture (part 4.). It turns out that parts 1. and 2. can be replaced by a self-contained argument, explained in addendum A.
Finally, in addendum B. we prove the much simpler fact that the expected value of the sum diverges.
1. A version of the degenerate convergence criterion (see Michel Loève, Probability Theory, section 23.5, page 329 (1963), or Theorem 1.2 in Allan Gut, An extension of Feller's weak law of large numbers (2003), where Loève's book is quoted) is as follows.
Degenerate convergence criterion: Let $S_n=X_1+\cdots+X_n$, where $(X_n)$ is an i.i.d. sequence distributed like $X$, and let $(b_n)$ be a positive sequence increasing to infinity. Introduce $$X^{(n)}=|X|\cdot\mathbf 1_{|X|\leqslant b_n},\qquad\mu_n=E(X^{(n)}),$$ and the conditions:
- (i) For every positive $\varepsilon$, $n\cdot P(|X|\geqslant\varepsilon b_n)\to0$
- (ii) $n\cdot\mathrm{var}(X^{(n)})\ll b_n^2$
- (iii) $n\cdot \mu_n\ll b_n$
Then, conditions (i) and (ii) imply that $b_n^{-1}(S_n-n\cdot\mu_n)\to0$ in probability,
hence conditions (i), (ii) and (iii) imply that $b_n^{-1}S_n\to0$ in probability.
2. In the present case, conditions (i) and (ii) both mean that $b_n\gg n$, and $\mu_n\sim\log b_n$, hence the choice $b_n=n\log n$ (for which $n\mu_n\sim b_n$) shows that $S_n/(n\log n)\to1$ in probability. Note that every choice $b_n\gg n\log n$ would show that $S_n/b_n\to0$ in probability, for example $S_n/(n\log n\log\log n)\to0$ in probability.
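For what it is worth, this concentration can be checked numerically; here is a small sketch using the sampler $X=\lfloor 1/U\rfloor$ mentioned above. Note that the ratio approaches $1$ only logarithmically (roughly $1+\frac{\log\log n}{\log n}$ at size $n$), so values around $1.2$ at $n=10^6$ are to be expected:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
for _ in range(5):
    u = 1.0 - rng.random(n)          # uniform on (0, 1]
    s_n = np.floor(1.0 / u).sum()    # S_n for the law P(X >= k) = 1/k
    print(s_n / (n * np.log(n)))     # hovers near 1 (drift toward 1 is slow)
```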
3. Our second tool is a general lemma which we state independently.
Lemma: If $(Y_n)$ is a sequence of nonnegative random variables and $(c_n)$ a positive sequence such that $P(Y_n\leqslant c_n)\to0$ and $\sum\limits_nc_n$ diverges, then $\sum\limits_nY_n$ diverges almost surely.
Proof: Fix an event $A$ such that $P(A)\ne0$. Since $P(Y_n\leqslant c_n)\to0$, there exists some finite $n_A$ such that, for every $n\geqslant n_A$, $P(Y_n\leqslant c_n)\leqslant\frac12P(A)$. In particular, for every $n\geqslant n_A$, $$E(Y_n\mathbf 1_A)\geqslant c_nP(Y_n\geqslant c_n,A)\geqslant c_n\left(P(A)-P(Y_n\leqslant c_n)\right)\geqslant\tfrac12c_nP(A),$$ hence $E(Y\mathbf 1_A)=+\infty$, where $Y=\sum\limits_nY_n$, since $\sum\limits_nc_n$ diverges. Now, if $P(Y<+\infty)\ne0$, then the event $A=\{Y\leqslant M\}$ has $P(A)\ne0$ for $M$ large enough, while $E(Y\mathbf 1_A)\leqslant M$ is finite, a contradiction. Hence $Y=+\infty$ almost surely, QED.
The lemma above with $Y_n=1/S_n$ and $c_n=1/(2n\log n)$ shows that $\sum\limits_{n=1}^\infty\frac1{S_n}$ diverges almost surely: indeed $P(Y_n\leqslant c_n)=P(S_n\geqslant 2n\log n)\to0$ by part 2., and $\sum\limits_nc_n$ diverges.
4. Roughly speaking, $S_n$ is of order $n\log n$. A natural conjecture is that $$\frac1{\log\log n}\sum_{k=1}^n\frac1{S_k}$$ converges in distribution.
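For what it is worth, a one-sample-path simulation of the normalized partial sums is consistent with the $\log\log n$ scale (purely illustrative, again assuming the sampler $X=\lfloor 1/U\rfloor$ from above):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
x = np.floor(1.0 / (1.0 - rng.random(n)))  # i.i.d. with P(X >= k) = 1/k
partial = np.cumsum(1.0 / np.cumsum(x))    # partial sums of 1/S_k
for m in (10**3, 10**4, 10**5, 10**6):
    print(m, partial[m - 1] / np.log(np.log(m)))  # stays of order 1
```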
A. To make this answer fully self-contained, here is a simple proof that $\frac{S_n}{n\log n}\to 1$ in probability.
First of all, note that $P(X_1>\alpha)<\frac{1}{\alpha}$ for any real $\alpha>0$. Now fix some positive integer $n$ and define the truncations
$X'_k:=X_k \mathbf{1}_{\{X_k\leqslant n\log n\}}$. Let also $S'_n:=X'_1+\dots+X'_n$ and $\mu_n:=E[X'_1]$.
Since $P(X=k)=\frac1k-\frac1{k+1}=\frac1{k(k+1)}$, we get $\mu_n=\sum\limits_{1\leqslant k\leqslant n\log n}\frac{k}{k(k+1)}\sim\log(n\log n)\sim\log n$, hence $n\mu_n\sim n\log n$ and it suffices to show that $\frac{S_n}{n\mu_n}\to 1$ in probability.
Notice that $$P\left(\left|\frac{S_n}{n\mu_n}-1\right|>\epsilon\right)\leqslant P(S_n\neq S'_n)+P\left(\left|\frac{S'_n}{n\mu_n}-1\right|>\epsilon\right)
\leqslant nP(X_1>n\log n)+P\left(\left|\frac{S'_n}{n\mu_n}-1\right|>\epsilon\right).$$
The first term is easy to bound: $nP(X_1>n\log n)<n\frac{1}{n\log n}\to 0$ as $n\to\infty$.
The second one can be bounded using Chebyshev's inequality:
$$P\left(\left|\frac{S'_n}{n\mu_n}-1\right|>\epsilon\right)\leqslant \frac{1}{\epsilon^2}\text{Var}\left(\frac{S'_n}{n\mu_n}\right)=\frac{\text{Var}(X'_1)}{\epsilon^2 n\mu_n^2}
\leqslant\frac{n\log n}{\epsilon^2 n\mu_n^2}\sim\frac{n\log n}{\epsilon^2 n\log^2 n}\to 0,$$
where we used the obvious inequality $\text{Var}(X'_1)\leqslant E[X_1'^2]=\sum\limits_{1\leqslant k\leqslant n\log n}\frac{k^2}{k(k+1)}\leqslant n\log n$.
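As a numerical cross-check of the two facts used here ($\mu_n\sim\log(n\log n)$ and $E[X_1'^2]\leqslant n\log n$), one can evaluate the truncated moments exactly from $P(X=k)=\frac1{k(k+1)}$; a short sketch, with illustrative values of $n$:

```python
import numpy as np

def truncated_moments(m):
    # Exact E[X'] and E[X'^2] for X' = X * 1{X <= m}, using P(X = k) = 1/(k(k+1)).
    k = np.arange(1, m + 1, dtype=float)
    p = 1.0 / (k * (k + 1))
    return np.sum(k * p), np.sum(k**2 * p)

for n in (10**3, 10**4, 10**5):
    m = int(n * np.log(n))
    mu, second = truncated_moments(m)
    print(n, mu / np.log(m), second / m)  # first ratio tends to 1, second stays below 1
```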
B. The divergence of the expectation is much easier to prove. For simplicity, assume that $X = 1/U$, where $U$ is uniformly distributed on $[0,1]$ (so that $P(X\geqslant x)=\frac1x$ for every $x\geqslant 1$). Then
$$
E\left[\frac{1}{X_1+\dots+X_n}\right] = E\left[\int_0^\infty e^{-t(X_1+\dots+X_n)}dt \right] = \int_0^\infty \left(E[e^{-tX}]\right)^n dt.
$$
Now
$$
1-E[e^{-tX}] = \int_0^1 (1-e^{-t/u}) du = t\int_t^\infty (1-e^{-z}) z^{-2} dz \sim -t\log t,\qquad t\to 0+.
$$
This already implies that the expectation is infinite: summing the geometric series,
$$
E\left[\sum_{n=1}^\infty\frac{1}{X_1+\dots+X_n}\right] = \int_0^\infty \frac{E[e^{-tX}]}{1-E[e^{-tX}]}\,dt,
$$
and the integrand behaves like $\frac{-1}{t\log t}$ as $t\to 0+$, which is not integrable near $0$.
This also allows one to write the asymptotics of a single summand: as $n\to\infty$,
$$
E\left[\frac{1}{X_1+\dots+X_n}\right] = \int_0^\infty \left(E[e^{-tX}]\right)^n dt \sim \int_0^\epsilon \left(E[e^{-tX}]\right)^n dt\\
\sim \int_0^\epsilon e^{n t\log t} dt \sim \int_0^{\delta} e^{-nu}\frac{du}{- \log u}\sim \frac{1}{n\log n},
$$
which agrees with the convergence in probability $S_n/(n\log n)\to 1$.
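A Monte Carlo check of this asymptotic is easy, since $0\leqslant 1/S_n\leqslant 1/n$ is bounded; here is a sketch with $X=1/U$ (the sample sizes are illustrative choices, and the drift of the ratio toward $1$ is again logarithmically slow):

```python
import numpy as np

rng = np.random.default_rng(3)
trials = 1000
for n in (10**2, 10**3, 10**4):
    x = 1.0 / (1.0 - rng.random((trials, n)))  # X = 1/U, so P(X >= t) = 1/t
    est = (1.0 / x.sum(axis=1)).mean()         # Monte Carlo estimate of E[1/S_n]
    print(n, est * n * np.log(n))              # stays of order 1
```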
The sequence $(R_n)$ is almost surely non-convergent.
Suppose the sequence converged almost surely. By the Hewitt–Savage zero-one law, the limiting random variable would have to be constant.
If the limit is not zero, we can apply the continuous mapping theorem to get that $\log(R_n)$ converges almost surely to a constant. This is a contradiction, since $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a nondegenerate Gaussian distribution by the central limit theorem.
If the limit is zero, we use again that $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a (centered) Gaussian random variable. This implies that $P(R_n > 1) = P(\log(R_n) > 0)$ converges to $\frac{1}{2}$, in contradiction with the almost sure convergence of $R_n$ to $0$.
Edit: This can actually be simplified significantly. Since $\frac{\log(R_n)}{\sqrt{n}}$ converges weakly to a centered nondegenerate Gaussian random variable, the probability $P(n^{-1/2} \log(R_n) > 1) = P(R_n > \exp(\sqrt{n}))$ converges to a strictly positive number. But this means that $(R_n)$ is not tight and therefore does not even have a weakly convergent subsequence.
The conclusion you are trying to prove is not true, for the following reason. Let $X_{nk}=\frac{\sqrt{k}X_k}{n}$ for $1\le k\le n$; then $\mathbb{E}X_{nk}=0$ and \begin{gather} \sum_{k=1}^n\mathbb{D}(X_{nk})=\sum_{k=1}^n\frac{k}{3n^2}\to \frac16\qquad \text{as } n\to\infty,\\ \max_{1\le k\le n}|X_{nk}|\le \frac1{\sqrt{n}}\to 0\qquad\text{as } n\to\infty. \end{gather} Hence, by the Lévy–Lindeberg theorem, $$ \text{d-}\lim_{n\to\infty}Y_n=\text{d-}\lim_{n\to\infty}\sum_{k=1}^nX_{nk} \overset{dist.}{=}N(0,1/6).$$ This also means that ''$Y_n\to 0$ a.s.'' is not true. But you could prove that $$ \frac{Y_n}{\log\log n}\to 0\qquad \text{a.s.}$$
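To see this central limit behaviour numerically, here is a sketch. It assumes $X_k$ uniform on $[-1,1]$, which is consistent with $\mathbb{D}(X_{nk})=\frac{k}{3n^2}$ but is not stated explicitly above:

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 1000, 10**4
# Assumption (not stated above): X_k uniform on [-1, 1], which is consistent
# with Var(X_{nk}) = Var(sqrt(k) * X_k / n) = k / (3 n^2).
x = rng.uniform(-1.0, 1.0, size=(trials, n))
w = np.sqrt(np.arange(1, n + 1)) / n   # weights sqrt(k)/n
y = x @ w                              # samples of Y_n = sum_k sqrt(k) X_k / n
print(y.mean(), y.var())               # ~ 0 and ~ 1/6 = 0.1667
```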