Fix $\epsilon > 0$ and $n \in \mathbb{N}$. Then we can use Chebyshev's inequality to see that
$$\mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2}$$
where
$$\text{Var}\Big(\frac{S_n}{n}\Big)= \frac{\text{Var}(S_n)}{n^2} = \frac{\sum_{i=1}^n\sum_{j=1}^n \text{Cov}(X_i,X_j)}{n^2} \leq \frac{\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}}{n^2}.$$
We can then explicitly calculate the double sum $\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}$ (for $c \neq 1$) as follows:
$$\begin{align}
\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|} &= \sum_{i=1}^n c^{0} + 2\sum_{i=1}^n\sum_{j=1}^{i-1} c^{i-j} \\
&= n + 2\sum_{i=1}^n\sum_{k=1}^{i-1} c^{k} \\
&= n + 2\sum_{i=1}^n \frac{c(1-c^{i-1})}{1-c} \\
&= n + \frac{2c}{1-c}\Big(n - \sum_{i=1}^n c^{i-1}\Big) \\
&= n + \frac{2c}{1-c}\,n - \frac{2c(1-c^{n})}{(1-c)^2},
\end{align}$$
where we substituted $k = i - j$ in the inner sum.
Thus, for $0 < c < 1$,
$$\lim_{n\rightarrow\infty} \mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \lim_{n\rightarrow\infty} \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2} \leq \lim_{n\rightarrow\infty} \frac{n + \frac{2c}{1-c}\,n - \frac{2c(1-c^{n})}{(1-c)^2}}{n^2 \epsilon^2} = 0, $$
since the numerator grows only linearly in $n$.
Since our choice of $\epsilon > 0$ was arbitrary, this shows that $\frac{S_n}{n} \rightarrow \mathbb{E}[X_1]$ in probability, as desired.
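As a quick numerical sanity check (not part of the proof), one can simulate a sequence whose covariances decay exactly like $c^{|i-j|}$. A minimal sketch in Python, assuming a stationary Gaussian AR(1) model $X_t = cX_{t-1} + \sqrt{1-c^2}\,\varepsilon_t$ with standard normal innovations, for which $\text{Cov}(X_i,X_j) = c^{|i-j|}$ and $\mathbb{E}[X_1] = 0$:

```python
import numpy as np

# Sketch: stationary Gaussian AR(1) with parameter c (0 < c < 1).
# For this model Cov(X_i, X_j) = c^{|i-j|}, so the hypothesis of the
# theorem holds, and S_n / n should approach E[X_1] = 0.
rng = np.random.default_rng(0)
c, n = 0.9, 100_000

x = np.empty(n)
x[0] = rng.standard_normal()  # X_0 ~ N(0, 1), the stationary law
for t in range(1, n):
    x[t] = c * x[t - 1] + np.sqrt(1 - c**2) * rng.standard_normal()

running_mean = np.cumsum(x) / np.arange(1, n + 1)
print(running_mean[[99, 999, 9_999, 99_999]])  # S_n/n for n = 100, 1000, ...
```

The printed running means shrink toward $0$ as $n$ grows, consistent with convergence in probability.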
This shows the validity of the theorem for $c<1$, but not for $c=1$, since then $\sum_{i,j} c^{|i-j|} = n^2$ and the bound no longer vanishes after dividing by $n^2$.
We can easily extend the proof to all cases in which $|\text{Cov}(X_i,X_j)|\le f_{|i-j|}$, where $\lim_{i\to\infty}f_i=0$. Indeed, in this case it is simple to show that
$$\lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^n \text{Cov}(X_i,X_j)\le \lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^n |\text{Cov}(X_i,X_j)|\le \lim_{n\to\infty}{1\over n^2}\sum_{i=1}^n\sum_{j=1}^nf_{|i-j|}=0.$$
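The last equality deserves a short justification: counting the pairs with $|i-j| = k$ and using that the $f_k$ are nonnegative (they dominate absolute covariances),
$$\frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n f_{|i-j|} = \frac{1}{n^2}\Big(n f_0 + 2\sum_{k=1}^{n-1}(n-k) f_k\Big) \leq \frac{f_0}{n} + \frac{2}{n}\sum_{k=1}^{n-1} f_k \xrightarrow{n\to\infty} 0,$$
since the Cesàro means of a null sequence tend to $0$.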
> $$\sup_{n} E[|X_n| 1_{|X_n|>M} ] \leq \sup_{n} E[|X_n|] \leq \sup_{n} E[|X_n|^p],$$
> using Jensen.
You didn't apply Jensen's inequality correctly; it should read
$$\sup_{n} E[|X_n| 1_{|X_n|>M} ] \leq \sup_{n} E[|X_n|] \leq \sup_{n} \left( E[|X_n|^p] \right)^{\color{red}{\frac{1}{p}}}.$$
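For $p \geq 1$, the second estimate is Jensen's inequality applied to the convex function $x \mapsto x^p$:
$$\mathbb{E}[|X_n|]^p \leq \mathbb{E}[|X_n|^p], \quad\text{i.e.}\quad \mathbb{E}[|X_n|] \leq \left(\mathbb{E}[|X_n|^p]\right)^{1/p}.$$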
> [...] and the claim follows by letting $M \rightarrow \infty$.
No, it's not that simple. Letting $M \to \infty$, you get
$$\lim_{M \to \infty} \sup_n \mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \sup_{n \in \mathbb{N}} \|X_n\|_p,$$
but that's not good enough; you have to show that the limit equals $0$. Hint for this problem: Use Markov's inequality, i.e.
$$\mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \frac{1}{M^{p-1}} \mathbb{E}(|X_n|^p 1_{|X_n|>M}) \leq \frac{1}{M^{p-1}} \mathbb{E}(|X_n|^p).$$
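For $p>1$, taking the supremum over $n$ and letting $M \to \infty$ then finishes the argument:
$$\sup_{n} \mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \frac{1}{M^{p-1}} \sup_{n} \mathbb{E}(|X_n|^p) \xrightarrow{M \to \infty} 0,$$
which is exactly the definition of uniform integrability.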
> Define $$M_0:=\max_{n \in \mathbb{N}} |X_n|.$$ Then we have $$E[|X_n| 1_{|X_n|>M_0}]= E[|X_n|\cdot 0 ] = 0,$$
No, this doesn't work, because $M_0$ depends on $\omega$ (and need not even be finite). Unfortunately, this means that your approach fails. Hint for this one: using e.g. the dominated convergence theorem, check first that $\{f\}$ is uniformly integrable for a single integrable random variable $f$; then extend the approach to finitely many integrable random variables.
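To spell out the hint: for a single integrable random variable $f$, dominated convergence (with dominating function $|f|$) gives
$$\lim_{M \to \infty} \mathbb{E}\big(|f| 1_{|f|>M}\big) = 0,$$
because $|f| 1_{|f|>M} \to 0$ almost surely as $M \to \infty$. For finitely many integrable $X_1, \dots, X_N$, the supremum is a maximum over $N$ terms, each of which tends to $0$.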
> If $E[\sup_n |X_n|] < \infty$, then the sequence is uniformly integrable.
Hint: By assumption, $Y := \sup_n |X_n|$ is integrable and $|X_n| \leq Y$ for all $n \in \mathbb{N}$. Consequently,
$$\mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \mathbb{E}(|Y| 1_{|Y|>M}) \qquad \text{for all $M>0$ and $n \in \mathbb{N}$.}$$
Now use the fact that $\{Y\}$ is uniformly integrable (see question nr. 2).
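Combining the display above with the uniform integrability of $\{Y\}$ gives, written out,
$$\sup_{n \in \mathbb{N}} \mathbb{E}(|X_n| 1_{|X_n|>M}) \leq \mathbb{E}(|Y| 1_{|Y|>M}) \xrightarrow{M \to \infty} 0.$$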
Best Answer
Your $H_n$ are bounded in $L^2$. Note that $$\mathbb{E}[H_n^2] = \mathbb{E}\Big[n^{-1} \max_{k \leq n} |X_k|^2\Big] \leq \mathbb{E}\Big[n^{-1} \sum_{k \leq n} |X_k|^2\Big] = n^{-1} \sum_{k \leq n} \mathbb{E}[|X_1|^2] = \mathbb{E}[|X_1|^2] < \infty,$$ using that the $X_k$ are identically distributed. Since this bound does not depend on $n$, we are done.
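The excerpt does not restate the definition of $H_n$; the displayed identity suggests $H_n = n^{-1/2}\max_{k\le n}|X_k|$ for identically distributed, square-integrable $X_k$. Under that assumption, a minimal Monte Carlo sketch (with standard normal $X_k$, so $\mathbb{E}[|X_1|^2]=1$) illustrating the bound:

```python
import numpy as np

# Sketch under the assumption H_n = n^{-1/2} * max_{k<=n} |X_k| with
# i.i.d. standard normal X_k, so that E[|X_1|^2] = 1 bounds E[H_n^2].
rng = np.random.default_rng(1)
n, reps = 5_000, 500

x = rng.standard_normal((reps, n))
h_sq = np.max(np.abs(x), axis=1) ** 2 / n  # one sample of H_n^2 per row
print(h_sq.mean(), "<= 1")                 # estimate of E[H_n^2] vs the bound
```

The estimate comes out far below $1$: the bound from the answer is uniform in $n$ but quite loose for the maximum.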