It is a standard result that if $Y$ is a non-negative random variable, then $\sum_n P\{Y>n\} <\infty$ iff $EY<\infty$. Taking $Y=\frac {|X_1|} {\epsilon}$ we get $\sum_n P\{|X_n| >n\epsilon \}=\sum_n P\{|X_1| >n\epsilon \}<\infty$. Apply the Borel–Cantelli lemma and let $\epsilon \to 0$ through the sequence $1,\frac 1 2,\frac 1 3,\cdots$ to see that $\frac {|X_n|} n \to 0$ almost surely. Next, if $a_n \geq 0$ and $\frac {a_n} n \to 0$, then $\frac {\max \{a_1,a_2,\cdots,a_n\}} n \to 0$ (split the maximum over $k \leq m$ and $k > m$ for a fixed $m$, then let $n \to \infty$). Now write $\frac {\max \{|X_1|,|X_2|,\cdots, |X_n|\}} {S_n}$ as $\frac {\max \{|X_1|,|X_2|,\cdots, |X_n|\}} n \times \frac n {S_n}$ and apply the SLLN to the second factor.
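As a quick numerical sanity check (not part of the proof), here is a sketch simulating one sample path of i.i.d. integrable variables and watching $\frac{\max\{|X_1|,\dots,|X_n|\}}{n}$ shrink; the choice of standard exponentials is my own, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# One sample path of i.i.d. integrable variables (standard exponentials,
# chosen only as an example); the claim is max(|X_1|,...,|X_n|)/n -> 0 a.s.
X = rng.exponential(size=10**6)
running_max = np.maximum.accumulate(np.abs(X))
n = np.arange(1, len(X) + 1)
ratio = running_max / n  # max(|X_1|,...,|X_n|)/n along this path

for k in (10**2, 10**4, 10**6):
    print(k, ratio[k - 1])
```

On a typical path the ratio drops by orders of magnitude as $n$ grows, consistent with the almost sure convergence above.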
So to write your definition more precisely, I think that the book is saying the following:
Let $X_n$ be a sequence of random variables, and $X$ be another random variable on the same probability space $(\Omega, \mathcal F, \mathbb P)$. Let
$A = \{\omega \in \Omega : \text{ for all }\epsilon > 0, \text{ there exists } N(\omega, \epsilon) \in \mathbb N \text{ such that } n \geq N \Rightarrow \left|X(\omega) - X_n(\omega)\right| < \epsilon\}$.
Then we say that $X_n$ converges almost surely to $X$ if $\mathbb P[A] = 1$.
This is equivalent to the following statement: $X_n$ is said to converge to $X$ almost surely if there exists a set $E \in \mathcal F$ with $\mathbb P[E] = 0$ such that for all $\omega \in \Omega \backslash E$ and for all $\epsilon > 0$, there exists some $N(\omega, \epsilon)\in \mathbb N$ such that $n \geq N \Rightarrow \left|X(\omega) - X_n(\omega)\right| < \epsilon$. (The null set is called $E$ here to avoid a clash with the index $N$.)
The problem with the statement in your example is that you've gotten the order of quantifiers mixed up. In the definition, $N$ depends on both $\omega$ and $\epsilon$. It may well be true that $\mathbb P(\{\omega \in \Omega : |X_N(\omega) - 0| < \epsilon\}) < 1$ for a fixed $N$, but that $N$ does not depend on $\omega$, which it needs to.
A quick bit about intuition: $X_n$ converges almost surely to $X$ if it converges pointwise everywhere except on a set of measure 0. This convergence need not be uniform. Just as a sequence of real-valued functions $f_n:\mathbb R \rightarrow \mathbb R$ might converge for every $x \in \mathbb R$ while requiring a different $N$ for each $x$, a sequence $X_n$ converging almost surely to $X$ can still fail to converge uniformly -- i.e. for different $\omega$'s you need a different $N$.
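To make the non-uniformity concrete, consider the classical example $f_n(x) = x^n$ on $[0,1)$: it converges pointwise to $0$, but the index $N(x,\epsilon)$ needed blows up as $x \to 1$. A short sketch (the helper `N` is my own, not from the text):

```python
import math

def N(x, eps):
    """Smallest n with x**n < eps, for 0 < x < 1 and 0 < eps < 1.

    From x**n < eps: n > log(eps)/log(x) (both logs negative, so the
    inequality flips), hence take the next integer above that ratio.
    """
    return math.floor(math.log(eps) / math.log(x)) + 1

# The required N grows without bound as x approaches 1:
for x in (0.5, 0.9, 0.99, 0.999):
    print(x, N(x, 1e-3))
```

The same phenomenon happens with $\omega$ in place of $x$: almost sure convergence guarantees an $N(\omega,\epsilon)$ for almost every $\omega$, but no single $N$ need work for all of them.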
Hope this helps.
Best Answer
Fix $\varepsilon\in(0,1)$. For $\alpha\geq 0$, $\sum_n P(|X_n|>\varepsilon)\geq\sum_{n=m}^\infty 1/n=\infty$, where $m$ is chosen so that $n^\alpha>\varepsilon$ for $n\geq m$. For $\alpha<0$, $\sum_n P(|X_n|>\varepsilon)<\infty$, since $n^{\alpha}<\varepsilon$, and hence $P(|X_n|>\varepsilon)=0$, for all sufficiently large $n$.
The first Borel–Cantelli lemma therefore gives $X_{n}\to 0$ a.s. when $\alpha<0$, while the second Borel–Cantelli lemma (applicable since the $X_n$ are independent) gives $|X_n|>\varepsilon$ infinitely often a.s. when $\alpha\geq 0$. Hence $X_n\to 0$ a.s. iff $\alpha<0$.
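Under my reading of the construction ($X_n = n^\alpha$ with probability $1/n$ and $0$ otherwise, independent), the Borel–Cantelli criterion can be checked numerically by accumulating the partial sums of $P(|X_n|>\varepsilon)$:

```python
import numpy as np

def partial_sums(alpha, eps, n_max):
    """Partial sums of P(|X_n| > eps), assuming X_n = n**alpha w.p. 1/n.

    P(|X_n| > eps) is 1/n when n**alpha > eps and 0 otherwise.
    """
    n = np.arange(1, n_max + 1)
    p = np.where(n.astype(float) ** alpha > eps, 1.0 / n, 0.0)
    return p.cumsum()

eps = 0.5
s_pos = partial_sums(0.5, eps, 10**6)   # alpha >= 0: a harmonic tail, diverges
s_neg = partial_sums(-0.5, eps, 10**6)  # alpha < 0: only finitely many nonzero terms
print(s_pos[-1], s_neg[-1])
```

For $\alpha=-1/2$ and $\varepsilon=1/2$ only $n=1,2,3$ contribute, so the sum stops at $1+\tfrac12+\tfrac13$, while for $\alpha=1/2$ the partial sums keep growing like $\log n$, matching the divergence/convergence dichotomy above.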