We can assume that $m=0$. Then take $\delta\gt 0$ and write, for a fixed $C$:
$$\frac 1{n^2} \mathbb E[S_n^2]\leqslant \frac 1{n^2}\sum_{i,j=1}^n\varepsilon_{|i-j|}\,[|i-j|\leqslant C]+\frac 1{n^2}\sum_{i,j=1}^n\varepsilon_{|i-j|}\,[|i-j|\gt C].$$
If we choose $C$ such that $\varepsilon_k \leqslant \delta$ whenever $k\geqslant C$, then, since at most $(2C+1)n$ pairs $(i,j)$ satisfy $|i-j|\leqslant C$, we have
$$\frac 1{n^2} \mathbb E[S_n^2]\leqslant \frac{(2C+1)n}{n^2}\sup_k \varepsilon_k+\delta;$$
as $\delta\gt 0$ was arbitrary, the sequence $\left(\mathbb E[S_n^2]/n^2\right)_{n\geqslant 1}$ converges to $0$.
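As a sanity check, here is a minimal Monte Carlo sketch. It assumes a stationary Gaussian AR(1) sequence $X_t=\phi X_{t-1}+Z_t$ with unit innovation variance, for which $\text{Cov}(X_i,X_j)=\phi^{|i-j|}/(1-\phi^2)$, so $\varepsilon_k=\phi^k/(1-\phi^2)\to 0$; the model and all parameter values are illustrative choices, not part of the original argument.

```python
import numpy as np

# Monte Carlo estimate of E[S_n^2]/n^2 for a stationary Gaussian AR(1) sequence
# X_t = phi * X_{t-1} + Z_t, where Cov(X_i, X_j) = phi^|i-j| / (1 - phi^2).
rng = np.random.default_rng(1)
phi, trials = 0.9, 2000

for n in (100, 1000, 10000):
    x = rng.normal(scale=1 / np.sqrt(1 - phi**2), size=trials)  # stationary X_0
    s = np.zeros(trials)
    for _ in range(n):
        x = phi * x + rng.normal(size=trials)  # advance the AR(1) recursion
        s += x                                 # accumulate S_n = X_1 + ... + X_n
    print(n, np.mean((s / n) ** 2))            # estimated E[S_n^2] / n^2
# The printed values shrink roughly like 1/n, consistent with the bound above.
```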
Once we have established this asymptotic behaviour of $\mathbb E[S_n^2]$, an application of the Borel–Cantelli lemma yields $2^{-n}S_{2^n}\to 0$ almost surely. Indeed, for a fixed $j\geqslant 1$ define the event $A_n:=\{|S_{2^n}|\gt 2^n/j\}$. Then the series $\sum_n\mathbb P(A_n)$ is convergent, and we get $\mathbb P\left(\limsup_{n\to\infty} A_n \right)=0$. As a consequence, on a set $\Omega_j$ of probability one, we have $|S_{2^n}(\omega)|/2^n\lt 1/j$ for $n\geqslant N(j,\omega)$.
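To spell out the convergence of the series under one hedged extra assumption: if, in addition, $\mathbb E[S_n^2]\leqslant Kn$ for some constant $K$ (this holds, e.g., when $\sum_k\varepsilon_k<\infty$, and in the geometric-covariance setting of the answer below, but is not implied by $\varepsilon_k\to 0$ alone), then Chebyshev's inequality gives
$$\sum_{n\geqslant 1}\mathbb P(A_n)\leqslant \sum_{n\geqslant 1}\frac{j^2\,\mathbb E[S_{2^n}^2]}{2^{2n}}\leqslant j^2K\sum_{n\geqslant 1}2^{-n}<\infty.$$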
In order to deduce the result for the whole sequence, note that
$$\begin{align} \mu\left\{\max_{2^n\leqslant k\lt 2^{n+1}}\frac{|S_k|}{2^n}>\delta\right\} &\leqslant 2^n\max_{2^n\leqslant k\lt 2^{n+1}}\mu\left\{\frac{|S_k|}{2^n}>\delta\right\}\\ &\leqslant 2^n\max_{2^n\leqslant k\lt 2^{n+1}}\frac 1{\delta^2}\mathbb E\left[\frac{S_k^2}{2^{2n}}\right].\end{align}$$
There is likely a proof somewhere on this site but I could not find it. Here I give a quick proof of my comment (since I originally mis-stated the result by forgetting the "lower bounded" restriction):
Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of random variables, not necessarily identically distributed and not necessarily independent, that satisfy:
i) $E[X_i]=m_i$, where $m_i \in \mathbb{R}$ for all $i\in\{1, 2, 3, ...\}$.
ii) There is a constant $\sigma^2_{bound}$ such that $Var(X_i) \leq \sigma^2_{bound}$ for all $i \in \{1, 2, 3, ...\}$.
iii) The variables are pairwise uncorrelated, so $E[(X_i-m_i)(X_j-m_j)]=0$ for all $i \neq j$.
iv) There is a value $b \in \mathbb{R}$ such that, with prob 1, $X_i-m_i\geq b$ for all $i \in \{1, 2, 3, ...\}$.
Define $L_n = \frac{1}{n}\sum_{i=1}^n (X_i-m_i)$. Then $L_n\rightarrow 0$ with prob 1.
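Before the proof, a quick numerical illustration. The construction below is my own illustrative choice, not part of the claim: with $U\sim\text{Uniform}(0,1)$, the variables $X_i=\cos(2\pi i U)$ have mean $0$, variance $1/2$, are pairwise uncorrelated, and are bounded below by $b=-1$, yet they are all functions of the single draw $U$, so they are far from independent.

```python
import numpy as np

# Illustrative sequence meeting i)-iv): X_i = cos(2*pi*i*U), U ~ Uniform(0,1).
# E[X_i] = 0, Var(X_i) = 1/2, E[X_i X_j] = 0 for i != j, and X_i >= -1,
# but every X_i depends on the same single draw U (so not independent).
rng = np.random.default_rng(0)

for trial in range(3):                    # a few independent sample paths
    u = rng.uniform()
    i = np.arange(1, 10**5 + 1)
    x = np.cos(2 * np.pi * i * u)         # one realization of the whole sequence
    l = np.cumsum(x) / i                  # L_n = (X_1 + ... + X_n) / n
    print([f"L_{n}={l[n - 1]:+.5f}" for n in (10, 1000, 10**5)])
# Along each path L_n tends to 0, as the theorem asserts.
```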
Proof: Since the variables are pairwise uncorrelated with bounded variance, writing $\sigma_i^2 = Var(X_i)$ we find for all $n$:
$$ E[L_n^2] = \frac{1}{n^2}\sum_{i=1}^n \sigma_i^2 \leq \frac{\sigma_{bound}^2}{n} $$
Fix $\epsilon>0$. It follows that:
$$ P[|L_n|>\epsilon] = P[L_n^2 > \epsilon^2] \leq \frac{E[L_n^2]}{\epsilon^2} \leq \frac{\sigma_{bound}^2}{n\epsilon^2} $$
Hence:
$$ \sum_{n=1}^{\infty} P[|L_{n^2}|>\epsilon] \leq \sum_{n=1}^{\infty}\frac{\sigma_{bound}^2}{n^2\epsilon^2} < \infty $$
and so $L_{n^2}\rightarrow 0$ with probability 1 by the Borel-Cantelli Lemma. That is, the $L_n$ values converge over the sparse subsequence $n\in\{1, 4, 9, 16, ...\}$.
Since $L_n \geq b$ for all $n$ and $L_{n^2}\rightarrow 0$ with probability 1, a sandwich argument (sketched below) shows that $L_n\rightarrow 0$ with probability 1. $\Box$
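For completeness, here is a sketch of that omitted sandwich step. For $n^2\leqslant k<(n+1)^2$, splitting the sum $kL_k$ at $n^2$ and at $(n+1)^2$, and using $X_i-m_i\geq b$ on the at most $2n$ leftover terms, gives
$$\frac{n^2}{k}L_{n^2}+\frac{k-n^2}{k}\,b\;\leqslant\;L_k\;\leqslant\;\frac{(n+1)^2}{k}L_{(n+1)^2}-\frac{(n+1)^2-k}{k}\,b.$$
As $k\to\infty$ the coefficients $\frac{n^2}{k},\frac{(n+1)^2}{k}\to 1$ while $\frac{k-n^2}{k},\frac{(n+1)^2-k}{k}\to 0$, so both bounds tend to $0$ on the probability-one event where $L_{n^2}\to 0$.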
The lower bounded condition is typically treated by writing $X_n = X_n^+ - X_n^-$, where $X_n^+=\max(X_n,0)$ and $X_n^-=-\min(X_n,0)$ are nonnegative. If $X_n$ and $X_i$ are independent, then $X_n^+$ and $X_i^+$ are also independent, so the lower bounded condition can be removed when the variables are independent. However, if $X_n$ and $X_i$ are merely uncorrelated, it does not follow that $X_n^+$ and $X_i^+$ are uncorrelated. So it is not clear to me whether the lower-bounded condition can be removed when "independence" is replaced by the weaker condition "pairwise uncorrelated."
Best Answer
Fix $\epsilon > 0$ and $n \in \mathbb{N}$. We can then use Chebyshev's inequality to see that
$$\mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2}$$
where
$$\text{Var}\Big(\frac{S_n}{n}\Big)= \frac{\text{Var}(S_n)}{n^2} = \frac{\sum_{i=1}^n\sum_{j=1}^n \text{Cov}{(X_i,X_j)}}{n^2} \leq \frac{\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}}{n^2} $$
We can then explicitly calculate the double sum $\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}$ as follows:
$$\begin{align} \sum_{i=1}^n\sum_{j=1}^n c^{|i-j|} &= \sum_{i=1}^n c^{0} + 2\sum_{i=1}^n\sum_{j=1}^{i-1} c^{i-j} \\ &= n + 2\sum_{i=1}^n\sum_{k=1}^{i-1} c^{k} \\ &= n + 2\sum_{i=1}^n \frac{c\,(1-c^{i-1})}{1-c} \\ &= n + \frac{2c}{1-c}\Big(n - \sum_{i=1}^n c^{i-1}\Big) \\ &= n + \frac{2c}{1-c}\Big(n - \frac{1-c^{n}}{1-c}\Big) \\ &= \frac{1+c}{1-c}\,n - \frac{2c\,(1-c^{n})}{(1-c)^2} \ \end{align}$$
Thus,
$$\lim_{n\rightarrow\infty} \mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \lim_{n\rightarrow\infty} \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2} \leq \lim_{n\rightarrow\infty} \frac{\frac{1+c}{1-c}\,n}{n^2 \epsilon^2} = 0, $$
where in the numerator we dropped the term $-\frac{2c\,(1-c^{n})}{(1-c)^2}$, which is negative for $0\lt c\lt 1$.
Since our choice of $\epsilon$ was arbitrary, the statement above holds for every $\epsilon > 0$ and shows that $\frac{S_n}{n} \rightarrow \mathbb{E}[X_1]$ in probability, as desired.
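As a quick hedged cross-check of the closed form derived above (the helper names below are mine), one can compare it numerically against the brute-force double sum:

```python
import numpy as np

# Compare the brute-force double sum with the closed form derived above.
def brute(n, c):
    i, j = np.indices((n, n))
    return np.sum(c ** np.abs(i - j))

def closed(n, c):
    return (1 + c) / (1 - c) * n - 2 * c * (1 - c**n) / (1 - c) ** 2

for c in (0.2, 0.5, 0.9):
    for n in (1, 2, 10, 200):
        assert np.isclose(brute(n, c), closed(n, c))
print("closed form agrees with the double sum")
```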
This shows the validity of the theorem for $c<1$, but not for $c=1$. We can easily extend the argument to all cases in which $|\text{Cov}(X_i,X_j)|\le f_{|i-j|}$ where $\lim_{i\to\infty}f_i=0$. Indeed, in this case
$$\lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^n \text{Cov}{(X_i,X_j)}\le \lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^n |\text{Cov}{(X_i,X_j)}|\le \lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^nf_{|i-j|}=0,$$
where the last limit holds because ${1\over n^2}\sum_{i,j=1}^n f_{|i-j|}\le {2\over n}\sum_{k=0}^{n-1}f_k$ and Cesàro averages of a null sequence tend to $0$.
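As a final hedged check, with the illustrative choice $f_k=1/(1+k)$ (not from the original), one can confirm numerically that this average tends to $0$, using the identity $\sum_{i,j=1}^n f_{|i-j|}=n f_0+2\sum_{k=1}^{n-1}(n-k)f_k$:

```python
import numpy as np

# Numerical check of the final limit for the illustrative choice f_k = 1/(1+k),
# using sum_{i,j=1}^n f_{|i-j|} = n*f_0 + 2*sum_{k=1}^{n-1} (n-k)*f_k.
for n in (10, 1000, 100000):
    k = np.arange(1, n)
    total = n * 1.0 + 2 * np.sum((n - k) / (1.0 + k))
    print(n, total / n**2)   # (1/n^2) * sum_{i,j} f_{|i-j|}
# The averages decay (like 2*log(n)/n here), confirming the Cesaro argument.
```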