You have shown that
$$\mathbb{P} \left( \left| \frac{X_{n}}{n} -1 \right| \geq \epsilon \right) \leq \frac{C}{\epsilon^2 n^{(2-r)}} \tag{0}$$
for each $n \in \mathbb{N}$, and this yields $X_n/n \to 1$ in probability. Consequently, there exists a subsequence $(X_{n_k}/n_k)_{k \geq 1}$ which converges almost surely to $1$. One could hope to use a sandwich argument (based on the monotonicity of the $X_n$) to deduce that $X_n/n \to 1$ almost surely. The trouble is that such an argument requires information about the growth of $n_k$, and this we do not have. To circumvent the problem, we apply $(0)$ directly to a subsequence of our choice.
For fixed $k \in \mathbb{N}$, it follows from $(0)$ that $$\mathbb{P} \left( \left| \frac{X_{n^k}}{n^k} -1 \right| \geq \epsilon \right) \leq \frac{\text{var}(X_{n^k}/n^k)}{\epsilon^2} = \frac{\text{var}(X_{n^k})}{\epsilon^2 n^{2k}} \leq \frac{C}{\epsilon^2 n^{k(2-r)}}.$$
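As a sanity check of the Chebyshev-type bound, here is a minimal numerical sketch under the hypothetical concrete choice $X_m \sim \text{Poisson}(m)$, for which $\mathbb{E}[X_m] = m$ and $\text{var}(X_m) = m$, so the variance bound holds with $C = 1$ and $r = 1$ (these parameter values are an assumption for illustration, not part of the original problem):

```python
import numpy as np

# Hypothetical example: X_m ~ Poisson(m), so E[X_m] = m and var(X_m) = m,
# i.e. the bound var(X_m) <= C * m^r holds with C = 1, r = 1.
rng = np.random.default_rng(0)
eps = 0.1
for m in [10, 100, 1000, 10000]:
    samples = rng.poisson(m, size=100_000)
    empirical = np.mean(np.abs(samples / m - 1) >= eps)
    chebyshev = 1.0 / (eps**2 * m)  # C / (eps^2 * m^(2-r)) with C = 1, r = 1
    print(f"m={m:6d}  empirical={empirical:.5f}  Chebyshev bound={chebyshev:.5f}")
```

The empirical tail probability decays like $1/m$ here, matching the rate $m^{-(2-r)}$ with $r=1$, and always sits below the Chebyshev bound.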
For given $r \in (0,2)$, we can choose $k=k(r)$ sufficiently large that $k(2-r) > 1$; then
$$\sum_{n \geq 1} \mathbb{P} \left( \left| \frac{X_{n^k}}{n^k} -1 \right| \geq \epsilon \right) \leq \frac{C}{\epsilon^2 } \sum_{n \geq 1}\frac{1}{n^{k(2-r)}} < \infty.$$
By the Borel–Cantelli lemma, this implies $X_{n^k}(\omega)/n^k \to 1$ as $n \to \infty$ for almost all $\omega$. Fix any such $\omega$ and any $\epsilon>0$. Then there exists a number $N \in \mathbb{N}$ such that
$$1-\epsilon \leq \frac{X_{n^k}(\omega)}{n^k} \leq 1+\epsilon \quad \text{and} \quad 1-\epsilon \leq \frac{(n+1)^k}{n^k} \leq 1+\epsilon \quad \text{for all $n \geq N$.} \tag{1}$$
Now pick some $m \geq N^k$ and choose $n \geq N$ such that $m \in [n^k,(n+1)^k)$. Since the sequence $(X_i)_{i \geq 1}$ is monotone increasing, we have
$$\frac{X_{n^k}(\omega)}{n^k} \leq \frac{X_m(\omega)}{n^k} = \frac{X_m(\omega)}{m} \frac{m}{n^k} \leq \frac{X_m(\omega)}{m} \frac{(n+1)^k}{n^k},$$
i.e.
$$\frac{n^k}{(n+1)^k} \frac{X_{n^k}(\omega)}{n^k} \leq \frac{X_m(\omega)}{m}. \tag{2}$$
On the other hand, we also have
$$\frac{X_m(\omega)}{m} \leq \frac{X_{(n+1)^k}(\omega)}{m} = \frac{X_{(n+1)^k}(\omega)}{(n+1)^k} \frac{(n+1)^k}{m} \leq \frac{X_{(n+1)^k}(\omega)}{(n+1)^k} \frac{(n+1)^k}{n^k}.$$
Combining this with $(2)$, we get
$$\frac{n^k}{(n+1)^k} \frac{X_{n^k}(\omega)}{n^k} \leq \frac{X_m(\omega)}{m} \leq \frac{X_{(n+1)^k}(\omega)}{(n+1)^k} \frac{(n+1)^k}{n^k}.$$
Hence, by $(1)$,
$$\frac{1-\epsilon}{1+\epsilon} \leq \frac{X_m(\omega)}{m} \leq (1+\epsilon)^2.$$
As $\epsilon>0$ is arbitrary, this proves $X_m(\omega)/m \to 1$ as $m \to \infty$. Hence, $X_m/m\to 1$ almost surely.
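The sandwich argument above can be illustrated numerically. The following sketch assumes the hypothetical example $X_n = Y_1 + \cdots + Y_n$ with $Y_i$ i.i.d. $\text{Exp}(1)$ (so the sequence is monotone, $\mathbb{E}[X_n] = n$, and $\text{var}(X_n) = n$, i.e. $C = 1$, $r = 1$); this choice is mine, not from the problem statement:

```python
import numpy as np

# Hypothetical example: X_n = Y_1 + ... + Y_n with Y_i i.i.d. Exp(1).
# The sequence is monotone increasing with E[X_n] = n and var(X_n) = n.
rng = np.random.default_rng(1)
M = 100_000
X = np.cumsum(rng.exponential(1.0, size=M))  # one sample path X_1, ..., X_M
m = np.arange(1, M + 1)
ratios = X / m

# Tail suprema of |X_m/m - 1|: they shrink, reflecting convergence
# along the whole path, not merely along a subsequence.
for lo in [10, 100, 1000, 10000]:
    tail_dev = np.max(np.abs(ratios[lo - 1 :] - 1))
    print(f"sup_{{m >= {lo}}} |X_m/m - 1| = {tail_dev:.4f}")
```

On a typical run the tail supremum over $m \geq 10^4$ is already of order a few percent, consistent with almost sure convergence of $X_m/m$ to $1$.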
Best Answer
You understood perfectly the problem with your construction, but luckily there is a way to remove the dependence on $\varepsilon$: choose $a_n$ so that the sequence diverges. For example, choose $b_n$ such that $$ \mathbb{P}(|X_n| \ge b_n) \le \frac{1}{2^n} $$ (such a sequence can always be found). Then define iteratively $a_0 = 1$ and $$ a_n = b_n^2 \vee (a_{n-1}+1), $$ and notice that in this way $a_n \to {+}\infty$. Moreover $a_n \ge b_n^2$, so $\varepsilon a_n \ge \varepsilon \sqrt{a_n}\, b_n$, and going back to the original sum we can estimate $$ \sum_{n} \mathbb{P} \Big(\Big|\frac{X_n}{a_n}\Big|\geq \varepsilon \Big) \le \sum_{n} \mathbb{P} \Big(\Big|\frac{X_n}{b_n} \Big|\geq \varepsilon \sqrt{a_n} \Big)< {+} \infty, $$ because for any $\varepsilon > 0$ there exists $n_0 \in \mathbb{N}$ such that $\varepsilon \sqrt{a_n} \ge 1$ for all $n \ge n_0$; for those $n$ the summand is at most $\mathbb{P}(|X_n| \ge b_n) \le 2^{-n}$, and we can simply sum the geometric series after $n_0$.
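The iteration for $a_n$ can be sketched concretely. The following assumes the hypothetical example $X_n \sim N(0, n)$, where the Gaussian tail bound $\mathbb{P}(|X_n| \ge t) \le 2e^{-t^2/(2n)}$ gives an explicit admissible $b_n$ (the distributional choice and the resulting formula for $b_n$ are my assumptions, not part of the answer above):

```python
import math

# Hypothetical example: X_n ~ N(0, n). The Gaussian tail bound
# P(|X_n| >= t) <= 2*exp(-t^2 / (2n)) yields b_n with P(|X_n| >= b_n) <= 2^{-n}.
def b(n: int) -> float:
    # Solve 2*exp(-b^2/(2n)) = 2^{-n}  =>  b = sqrt(2*n*(n+1)*ln 2).
    return math.sqrt(2 * n * (n + 1) * math.log(2))

# Iterate a_0 = 1, a_n = max(b_n^2, a_{n-1} + 1): then a_n is strictly
# increasing, a_n -> +infinity, and a_n >= b_n^2 as the estimate requires.
a = [1.0]
for n in range(1, 21):
    a.append(max(b(n) ** 2, a[-1] + 1.0))

for n in (1, 5, 10, 20):
    print(f"n={n:2d}  b_n={b(n):8.3f}  a_n={a[n]:10.3f}")
```

By construction every term of the recursion satisfies $a_n \ge a_{n-1}+1$ and $a_n \ge b_n^2$ simultaneously, which is exactly what the chain of estimates uses.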