Fix $\epsilon > 0$ and $n \in \mathbb{N}$. Then Chebyshev's inequality gives
$$\mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2}$$
where
$$\text{Var}\Big(\frac{S_n}{n}\Big)= \frac{\text{Var}(S_n)}{n^2} = \frac{\sum_{i=1}^n\sum_{j=1}^n \text{Cov}(X_i,X_j)}{n^2} \leq \frac{\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}}{n^2} $$
We can then bound the double sum $\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}$ as follows (using $0 < c < 1$):
$$\begin{align}
\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|} &= \sum_{i=1}^n c^{0} + 2\sum_{i=1}^n\sum_{j=1}^{i-1} c^{i-j} \\
&= n + 2\sum_{i=1}^n\sum_{k=1}^{i-1} c^{k} \\
&= n + 2\sum_{i=1}^n \frac{c\,(1-c^{i-1})}{1-c} \\
&\leq n + \frac{2c}{1-c}\,n \\
&= \frac{1+c}{1-c}\,n
\end{align}$$
Thus,
$$0 \leq \mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2} \leq \frac{\frac{1+c}{1-c}\,n}{n^2 \epsilon^2} = \frac{1+c}{(1-c)\,\epsilon^2}\cdot\frac{1}{n} \stackrel{n\to\infty}{\longrightarrow} 0 $$
Since our choice of $\epsilon$ was arbitrary, the statement above holds for every $\epsilon > 0$ and shows that $\frac{S_n}{n} \rightarrow \mathbb{E}[X_1]$ in probability, as desired.
This shows the validity of the theorem for $c<1$, but not for $c=1$.
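As a quick numerical sanity check (a throwaway script, not part of the proof): since each row of the array $\big(c^{|i-j|}\big)_{i,j\le n}$ contributes at most $1 + 2(c + c^2 + \cdots) = \frac{1+c}{1-c}$, the double sum grows only linearly in $n$:

```python
# Check that sum_{i,j<=n} c^{|i-j|} stays below (1+c)/(1-c) * n for 0 < c < 1:
# each row contributes at most 1 + 2*(c + c^2 + ...) = (1+c)/(1-c).

def double_sum(n, c):
    return sum(c ** abs(i - j) for i in range(1, n + 1) for j in range(1, n + 1))

for c in (0.1, 0.5, 0.9):
    for n in (1, 10, 100):
        assert double_sum(n, c) <= (1 + c) / (1 - c) * n
print("linear bound holds")
```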
We can easily extend the proof to all cases in which $|\text{Cov}(X_i,X_j)|\le f_{|i-j|}$ where $\lim_{k\to\infty}f_k=0$. Indeed, grouping the double sum by diagonals gives $\sum_{i=1}^n\sum_{j=1}^n f_{|i-j|}= n f_0 + 2\sum_{k=1}^{n-1}(n-k)f_k\le n f_0+2n\sum_{k=1}^{n-1}f_k$, and the Cesàro means $\frac1n\sum_{k=1}^{n-1}f_k$ tend to $0$ because $f_k\to 0$. Hence
$$\lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^n f_{|i-j|}=0,$$
and since $\Big|{1\over n^2}\sum_{i=1}^n\sum_{j=1}^n \text{Cov}(X_i,X_j)\Big|\le {1\over n^2}\sum_{i=1}^n\sum_{j=1}^n |\text{Cov}(X_i,X_j)|\le {1\over n^2}\sum_{i=1}^n\sum_{j=1}^n f_{|i-j|}$, the variance term again vanishes.
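To illustrate with a concrete (hypothetical) choice, take $f_k = 1/(k+1)$, which tends to $0$ but is not summable; the normalized double sum still vanishes:

```python
import math

# With f_k = 1/(k+1) (tends to 0, not summable), the normalized double sum
# (1/n^2) * sum_{i,j<=n} f_{|i-j|} still goes to 0 as n grows.

def normalized_double_sum(n):
    f = lambda k: 1.0 / (k + 1)
    # Group by diagonals: |i-j| = k occurs 2*(n-k) times for k >= 1, n times for k = 0.
    total = n * f(0) + 2 * sum((n - k) * f(k) for k in range(1, n))
    return total / n ** 2

vals = [normalized_double_sum(n) for n in (10, 100, 1000)]
assert vals[0] > vals[1] > vals[2]  # strictly decreasing toward 0
assert vals[2] < 0.05
```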
To show convergence in probability for the variables with $\mathbb P(X_n=n+1)=\mathbb P(X_n=-(n+1))=\frac1{2(n+1)\log(n+1)}$ and $X_n=0$ otherwise, first note that
$$ \mathbb E[X_n] = \frac{n+1}{2(n+1)\log(n+1)} - \frac{n+1}{2(n+1)\log(n+1)} = 0$$
and $$
\mathrm{Var}(X_n)=\mathbb E[X_n^2]
= \frac{2(n+1)^2}{2(n+1)\log(n+1)} = \frac{n+1}{\log(n+1)}.
$$
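These moments can be checked directly from the probability mass function (a quick verification script, assuming the distribution just described):

```python
import math

# Verify E[X_n] = 0 and Var(X_n) = (n+1)/log(n+1) directly from the pmf:
# X_n = +-(n+1) each with probability 1/(2(n+1)log(n+1)), else 0.
def moments(n):
    p = 1 / (2 * (n + 1) * math.log(n + 1))
    pmf = [(n + 1, p), (-(n + 1), p), (0, 1 - 2 * p)]
    mean = sum(x * q for x, q in pmf)
    var = sum(x * x * q for x, q in pmf) - mean ** 2
    return mean, var

for n in (1, 5, 50):
    mean, var = moments(n)
    assert abs(mean) < 1e-12
    assert abs(var - (n + 1) / math.log(n + 1)) < 1e-9
```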
Hence $\mathbb E[S_n]=0$, and
$$\begin{align*}
\frac1{\varepsilon^2}\mathbb E\left[\left(\frac{S_n}n\right)^2\right] &= \frac1{n^2\varepsilon^2} \mathbb E[S_n^2]\\
&= \frac1{n^2\varepsilon^2} \mathrm{Var}\left(\sum_{i=1}^n X_i\right)\\
&= \frac1{n^2\varepsilon^2} \sum_{i=1}^n \mathrm{Var}(X_i)\\
&= \frac1{n^2\varepsilon^2} \sum_{i=1}^n \frac{i+1}{\log(i+1)}\\
&\leqslant \frac1{n^2\varepsilon^2}\left(\frac{n(n+1)}{\log(n+1)}\right) \qquad \text{(for $n\geqslant 4$, since $x/\log x$ is increasing for $x\geqslant e$)}\\
&= \frac{n+1}{n\log(n+1)\varepsilon^2}\\
&= \frac1{\log(n+1)\varepsilon^2} + \frac1{n\log(n+1)\varepsilon^2}\stackrel{n\to\infty}{\longrightarrow}0.
\end{align*}$$
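The termwise comparison $\frac{i+1}{\log(i+1)}\leqslant\frac{n+1}{\log(n+1)}$ used above needs $n$ large enough, because $x/\log x$ dips below its value at $x=2$ near $x=e$; the summed bound actually fails at $n=2$ but holds from $n=4$ on (which is all the limit needs). A quick check:

```python
import math

# Check sum_{i=1}^n (i+1)/log(i+1) <= n*(n+1)/log(n+1).
# Fails at n = 2, holds for n >= 4 since x/log(x) is increasing for x >= e.
def lhs(n):
    return sum((i + 1) / math.log(i + 1) for i in range(1, n + 1))

assert lhs(2) > 2 * 3 / math.log(3)          # the bound fails at n = 2
for n in range(4, 200):
    assert lhs(n) <= n * (n + 1) / math.log(n + 1)
```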
By Markov's inequality applied to $(S_n/n)^2$ (i.e., Chebyshev's inequality),
$$ \mathbb P\left(\left|\frac{S_n}n\right| \geqslant\varepsilon \right)\leqslant \frac{\mathbb E\left[\left(\frac{S_n}n\right)^2 \right]}{\varepsilon^2}\stackrel{n\to\infty}{\longrightarrow}0.$$
To show that the convergence is not almost sure, note that for each $n$, as pointed out by @Frank, since $S_n - S_{n-1} = X_n$,
$$ \{X_n=n+1\} \subset \left\{|S_n|\geqslant \frac n2\right\} \cup \left\{|S_{n-1}|\geqslant \frac n2\right\}.$$
Since $$\sum_{n=1}^\infty \mathbb P(X_n=n+1) = \sum_{n=1}^\infty\frac1{2(n+1)\log(n+1)}=+\infty,$$
by the second Borel–Cantelli lemma (the $X_n$ are independent), almost surely $X_n=n+1$ for infinitely many $n$, so $$\mathbb P\left(\frac{|S_n|}n\geqslant \frac12 \text{ infinitely often}\right)=1,$$
and hence $$\mathbb P\left(\lim_{n\to\infty} \frac{S_n}n = 0\right)=0,$$ so the convergence is not almost sure.
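The divergence of $\sum_n \frac1{2(n+1)\log(n+1)}$ is very slow: by comparison with $\int \frac{dx}{2x\log x}$, the partial sums grow like $\frac12\log\log N$. A quick numerical look (illustration only):

```python
import math

# Partial sums of 1/(2*(n+1)*log(n+1)) grow roughly like (1/2)*log(log(N)):
# the series diverges, but extremely slowly.
def partial_sum(N):
    return sum(1 / (2 * (n + 1) * math.log(n + 1)) for n in range(1, N + 1))

for N in (10 ** 2, 10 ** 4, 10 ** 6):
    print(N, partial_sum(N), 0.5 * math.log(math.log(N)))
```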
We have
$$\begin{align}
\mathbb P\left(\left| \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right| \geq \varepsilon\right) &= \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon,\ |X_1|\leq n,\ldots, |X_n|\leq n\right) \\
&\quad + \mathbb P\left(\left| \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon,\ |X_i|> n \text{ for some } i=1,\ldots, n\right) \\
&\leq \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon\right)+n\,\mathbb P\left(|X_1|> n\right). \tag{1}
\end{align}$$
The second term tends to zero by hypothesis. For the first term, apply Chebyshev's inequality:
$$ \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon\right) \leq \frac{\sum_{i=1}^n \text{Var}(X_i^*)}{n^2\varepsilon^2} = \frac{\text{Var}(X_1^*)}{n\varepsilon^2} \to 0. $$
Note also that you have proved the WLLN under conditions where the expectation $\mathbb EX$ need not exist. The only assumption given is the condition $n\mathbb P(|X| >n)\to 0$, which is weaker than $\mathbb E|X|<\infty$.
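For instance (a standard example, not from the original problem), a tail of the form $\mathbb P(|X|>x)=\frac1{x\log x}$ for $x\geq e$ satisfies the given condition while the mean fails to exist:
$$n\,\mathbb P(|X|>n)=\frac{1}{\log n}\longrightarrow 0, \qquad \mathbb E|X| = \int_0^\infty \mathbb P(|X|>x)\,dx \geq \int_e^\infty \frac{dx}{x\log x} = \infty.$$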
The hint to use Chebyshev's inequality did not mean that the expectation exists. It is simply a tool that can be applied, in this problem, to those random variables that do have an expectation (the truncated variables $X_i^*$).
Addition: The inequality in (1) follows from the inequality $\mathbb P(A\cap B)\leq \mathbb P(A)$ (as well as $\leq\mathbb P(B)$). So $$ \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon,\ |X_1|\leq n,\ldots, |X_n|\leq n\right)\leq \mathbb P\left(\left| \frac{S_n^* - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon\right) $$ and $$ \mathbb P\left(\left| \frac{S_n - n\mathbb{E}X\mathbb{1}_{\{|X|\leq n \}}}{n}\right|\geq \varepsilon,\ |X_i|> n \text{ for some } i=1,\ldots, n\right) \leq \mathbb P\left(\bigcup_{i=1}^n\{|X_i|> n\}\right)\leq \sum_{i=1}^n \mathbb P\left(|X_i|> n\right)=n\mathbb P(|X_1|>n). $$