The absolute value function is continuous. Continuity implies that $\displaystyle\left|\lim_{n\to\infty} a_n\right|=\lim_{n\to\infty} |a_n|$, provided $\lim_{n\to\infty} a_n$ exists.
To prove that a function $f$ is continuous at a point $a$, one shows that for every positive number $\varepsilon$, however small, there is a positive number $\delta$ small enough that $|a-x|<\delta$ implies $|f(a)-f(x)|<\varepsilon$; i.e., the distance between $f(x)$ and $f(a)$ can be made as small as you want by making the distance from $x$ to $a$ small enough. If you don't see how to do that for the absolute value function, first consider how it might be done if $a>0$, then if $a<0$, and finally if $a=0$.
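(Once you have worked the three cases, you may notice there is also a single estimate, the reverse triangle inequality, that handles all of them at once; a sketch:)

```latex
% Continuity of f(x) = |x| at an arbitrary point a,
% via the reverse triangle inequality  | |x| - |a| | <= |x - a|.
% Given eps > 0, the choice delta = eps works at every a:
\[
  |x-a|<\delta=\varepsilon
  \quad\Longrightarrow\quad
  \bigl|\,|x|-|a|\,\bigr|\le|x-a|<\varepsilon .
\]
```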
Yesterday I said that this was Tauber's Theorem, the original tauberian theorem. It's not; Tauber's Theorem is the analogous result for Abel summability. This is Someone's Theorem. I'm going to give a proof of ST organized in what I feel is the "right" manner, deriving convergence from a sort of "maximal inequality"; then as a bonus we will see that one can prove Tauber's Theorem in the same manner, using estimates that are a little more intricate, although still straightforward.
First, your notation seems a little curious, since you start the summation at $k=1$ but include $s_0$. I'm going to start at $k=1$ as you do, but instead consider the averages $$\sigma_n=\frac{s_1+\dots+s_n}{n}.$$
Someone's Theorem is immediate from the following:
Theorem If $na_n\to0$ then $s_n-\sigma_n\to0$.
For any sequence $a=(a_1,\dots)$ define $$Ma=\sup_n|na_n|.$$
The theorem follows from the following "maximal inequality":
Lemma $|s_n-\sigma_n|\le Ma$ for every $n$.
Proof: First note that $$\sigma_n=\sum_{k=1}^n\frac{n-k+1}{n}a_k.$$
(Simply count the number of times each $a_k$ appears in $\sigma_n$.) Hence
$$s_n-\sigma_n=\sum_{k=1}^n\frac{k-1}{n}a_k.$$Since $|k-1|\le k$ this shows that $$|s_n-\sigma_n|\le Ma\sum_{k=1}^n\frac1n=Ma.$$
QED.
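The lemma is also easy to sanity-check numerically. The sketch below is not part of the proof, and the test sequence is an arbitrary choice; it just compares $|s_n-\sigma_n|$ with $Ma$ over a finite range:

```python
# Numerical sanity check of the lemma |s_n - sigma_n| <= Ma, where
# s_n = a_1 + ... + a_n, sigma_n = (s_1 + ... + s_n)/n, and
# Ma = sup_n |n a_n| (here: the max over the finite range used).
from itertools import accumulate

def check_lemma(a):
    s = list(accumulate(a))                                  # s_1, ..., s_N
    sigma = [sum(s[:n]) / n for n in range(1, len(s) + 1)]   # Cesaro averages
    Ma = max(abs(n * an) for n, an in enumerate(a, start=1))
    # small tolerance for floating-point rounding
    return all(abs(sn - sg) <= Ma + 1e-12 for sn, sg in zip(s, sigma))

# an arbitrary test sequence with alternating signs (here Ma = 1)
a = [(-1) ** n / n for n in range(1, 200)]
print(check_lemma(a))  # True
```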
Now to prove the theorem. Suppose $na_n\to0$. Let $\epsilon>0$. Choose $N$ so $|na_n|<\epsilon$ for all $n>N$, and define $$a_n'=\begin{cases}
a_n,&(1\le n\le N),
\\0,&(n>N)
\end{cases}$$and $$a_n''=a_n-a_n'.$$In what one hopes is transparent notation, it is clear that $$s_n'-\sigma_n'\to0$$(indeed $s_n'$ is eventually constant, and the averages $\sigma_n'$ of an eventually constant sequence converge to that constant) and that $$Ma''\le\epsilon.$$So
$$\limsup|s_n-\sigma_n|\le\limsup|s_n'-\sigma_n'|+\sup_n|s_n''-\sigma_n''|
\le 0+Ma''\le\epsilon,$$where the lemma, applied to $a''$, bounds the last supremum. Since $\epsilon>0$ was arbitrary, $\limsup|s_n-\sigma_n|=0$. QED.
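Purely as an illustration (not part of the proof), one can watch $s_n-\sigma_n$ shrink for a concrete sequence with $na_n\to0$; the test sequence below is an arbitrary choice:

```python
# Illustration of the theorem: if n*a_n -> 0 then s_n - sigma_n -> 0.
# Arbitrary test sequence: a_n = (-1)^n / (n log(n+1)),
# for which n*a_n = (-1)^n / log(n+1) -> 0.
from itertools import accumulate
from math import log

a = [(-1) ** n / (n * log(n + 1)) for n in range(1, 20001)]
s = list(accumulate(a))        # partial sums s_1, ..., s_N
cum = list(accumulate(s))      # cum[n-1] = s_1 + ... + s_n
gap = [abs(s[n - 1] - cum[n - 1] / n) for n in (10, 100, 1000, 20000)]
print(gap)  # |s_n - sigma_n| at the sampled n, shrinking toward 0
```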
Bonus: Tauber's Theorem
Say $$f(r)=\sum_{k=0}^\infty a_kr^k\quad(0<r<1)$$and $s_n=\sum_{k=0}^na_k$.
Tauber's Theorem If $na_n\to0$ and $\lim_{r\to1}f(r)=s$ then $\sum_{k=0}^\infty a_k=s$.
As with Someone's Theorem, this follows from the somewhat stronger result
Theorem If $na_n\to0$ then $s_n-f(1-1/n)\to0$.
And that follows by an argument as above if we can show that
$$\left|s_n-f\left(1-\frac1n\right)\right|\le cMa.$$To begin, since $|a_k|\le Ma/k$ for $k\ge1$, it's clear that $$\left|s_n-f\left(1-\frac1n\right)\right|\le Ma\sum_{k=1}^n\frac1k\left(1-\left(1-\frac1n\right)^k\right)
+Ma\sum_{k=n+1}^\infty\frac1k\left(1-\frac1n\right)^k,$$so we need only show that both sums on the right are bounded (independent of $n$).
There exist $\alpha$ and $\beta$ with $$0<\alpha<\left(1-\frac1n\right)^n<\beta<1\quad(n\ge2),$$since $\left(1-\frac1n\right)^n$ increases from $\frac14$ toward $\frac1e$ as $n$ grows. So
$$
\sum_{k=1}^n\frac1k\left(1-\left(1-\frac1n\right)^k\right)\le
\sum_{k=1}^n\frac1k\left(1-\alpha^{k/n}\right)\le
\int_0^n\left(1-\alpha^{t/n}\right)\frac{dt}{t}
=\int_0^1\left(1-\alpha^{t}\right)\frac{dt}{t}$$
(here one compares the sum of a decreasing function of $k$ with the corresponding integral, then rescales the variable of integration), and similarly
$$\sum_{k=n+1}^\infty\frac1k\left(1-\frac1n\right)^k
\le \sum_{k=n+1}^\infty\frac1k\beta^{k/n}
\le\int_n^\infty\beta^{t/n}\frac{dt}{t}
=\int_1^\infty\beta^t\frac{dt}{t}.$$
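As a numeric sanity check (illustration only, not part of the argument), the two sums can be evaluated directly for several values of $n$ and seen to stay bounded:

```python
# Evaluate the two sums bounding |s_n - f(1 - 1/n)| and check boundedness:
#   S1(n) = sum_{k=1}^{n} (1/k) (1 - (1 - 1/n)^k)
#   S2(n) = sum_{k>n}     (1/k) (1 - 1/n)^k        (tail truncated numerically)

def S1(n):
    r = 1 - 1 / n
    return sum((1 - r ** k) / k for k in range(1, n + 1))

def S2(n):
    r = 1 - 1 / n
    # r^k <= e^{-k/n}, so terms beyond k = 41n are negligible
    return sum(r ** k / k for k in range(n + 1, 41 * n))

vals = [(n, S1(n), S2(n)) for n in (2, 10, 100, 1000)]
print(vals)  # both sums remain bounded as n grows
```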
Best Answer
I don't think there's a very definite question up there, but I understand why it might have been hard for you to pinpoint exactly what was troubling you, or to even express it efficiently supposing you could pinpoint it.
So what I'll just do is answer the titular question (by explaining the link between sequences and series), and then I'll talk about summability.
First of all, a ramble about terminology. The word sequence in modern mathematics is often reserved exclusively for a succession of objects -- thus a sequence of steps, a sequence of notes, a sequence of operations, a sequence of numbers, a sequence of sequences, and so on. However, both in nonmathematical speech and in older mathematics, this word was used more or less synonymously with the word series, which in those contexts meant quite the same thing. So we have a series of tones, a series of blasts, etc.

In contemporary mathematics, by contrast, the word series is reserved for a particular type of sequence that has become ubiquitous -- a sequence of additions. For example, start with any real number you like, and transform it by adding to it any real number you like (positive or negative or zero), then add another number to the sum, then another to the new sum, and so on, and you have what is called a series. What I have just described in words is what's meant by the (quite informal) symbol $$a_1+a_2+a_3+a_4+a_5+a_6+\cdots,$$ which is more formally written as $$\sum_{i\ge 1}{a_i}.$$ That is, a series is just a way of representing a sequence of cumulative sums, the sum in the $i$th position consisting of $i$ summands. This sequence is what's called the sequence of partial sums of the series.
In this way you see the link between series and sequences: every series is actually a sequence of sums as described above; on the flip side, every sequence can be thought of as a series, namely the one obtained by adding to its first term the differences between consecutive terms.
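Both directions can be illustrated in a few lines of Python (the example terms are an arbitrary choice):

```python
from itertools import accumulate

# A series viewed as its sequence of partial sums:
a = [1, -2, 3, -4, 5]                 # terms a_1, a_2, ...
partial_sums = list(accumulate(a))    # s_i = a_1 + ... + a_i
print(partial_sums)                   # [1, -1, 2, -2, 3]

# Conversely, any sequence is the series of its consecutive differences
# added to its first term (a telescoping sum):
s = partial_sums
diffs = [s[0]] + [s[i] - s[i - 1] for i in range(1, len(s))]
print(diffs == a)                     # True
```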
Finally, we can talk about infinite series; that simply means that the addition does not end. The question of convergence arises because such series do not always yield a number as a result, even when the terms are all numbers. They exhibit different behaviors: some simply accumulate to become larger than any number; some don't even have a predictable behavior, changing course every now and then. The definition given is meant to pin down those series of numbers that -- though infinite -- add up to a real number. These are the ones called convergent, or summable.
The definition simply asks you to consider the sequence of partial sums of a series, and to see if this sequence converges (becomes stable and settles to a real number); if so, it says that such series will also be called convergent, or summable.
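For example, the geometric series $\sum_{i\ge 1} 1/2^i$ has partial sums $\frac12,\frac34,\frac78,\dots$, which settle toward $1$, so that series is convergent (summable) with sum $1$. A quick numeric illustration:

```python
# Partial sums of the series 1/2 + 1/4 + 1/8 + ..., which converge to 1.
s = 0.0
sums = []
for k in range(1, 31):
    s += 1 / 2 ** k
    sums.append(s)
print(sums[:4])                   # [0.5, 0.75, 0.875, 0.9375]
print(abs(sums[-1] - 1) < 1e-8)   # True: the partial sums settle at 1
```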
In sum, it's all about sequences. A series is just a sequence of additions, or equivalently a sequence of partial sums; conversely, every sequence of addable objects (in particular, numbers) can be decomposed into a series.