Your estimate gives that for each positive $\varepsilon$, we have
$$\sum_i\mathbb P\left(M_{2^{i}}>(1+\varepsilon)\sqrt{2(i+1)\log 2}\right)<\infty$$
(we use the fact that $\mathbb P(M_n>x)\leqslant n\,\mathbb P(X_1> x)$, and note that $\sqrt{2(i+1)\log 2}=\sqrt{2\log 2^{i+1}}$).
We thus deduce by the Borel–Cantelli lemma that $\limsup_iM_{2^{i}}/\sqrt{2(i+1)\log 2}\leqslant 1+\varepsilon$ almost surely. Taking $\varepsilon:=1/k$ and intersecting the corresponding almost-sure events over $k\in\mathbb N$, we get $\limsup_iM_{2^{i}}/\sqrt{2(i+1)\log 2}\leqslant 1$ almost surely. To conclude, notice that if $2^i\leqslant n\lt 2^{i+1}$, then $\log n\geqslant i\log 2$ and $M_n\leqslant M_{2^{i+1}}$, so
$$\frac{M_n}{\sqrt{2\log n}}\leqslant \frac{M_{2^{i+1}}}{\sqrt{2i\log 2}}=\frac{M_{2^{i+1}}}{\sqrt{2(i+2)\log 2}}\cdot\sqrt{\frac{i+2}{i}},$$
and the right-hand side has $\limsup$ at most $1$.
For the $\liminf$, define for a fixed positive $\varepsilon$, $$A_n:=\left\{\frac{M_n}{\sqrt{2\log n}}\lt 1-\varepsilon\right\}.$$
Since the $X_n$ are independent, $$\mathbb P(A_n)=\left(1-\mathbb P\left(X_1>(1-\varepsilon)\sqrt{2\log n}\right)\right)^n\leqslant \exp\left(-n\,\mathbb P\left(X_1>(1-\varepsilon)\sqrt{2\log n}\right)\right),$$ and the lower estimate on the tail of the normal distribution shows that $n\,\mathbb P\left(X_1>(1-\varepsilon)\sqrt{2\log n}\right)$ grows like a positive power of $n$ (up to logarithmic factors), so the series $\sum_n \mathbb P(A_n)$ is convergent. Hence by the Borel–Cantelli lemma, we have $\mathbb P\left(\limsup_n A_n\right)=0$. This means that for almost every $\omega$, we can find $N=N(\omega)$ such that if $n\geqslant N(\omega)$, then $$\frac{M_n(\omega)}{\sqrt{2\log n}}\geqslant 1-\varepsilon.$$
This implies that $$\liminf_{n\to \infty}\frac{M_n(\omega)}{\sqrt{2\log n}}\geqslant 1-\varepsilon \quad \mbox{a.e.}$$
Here again, taking $\varepsilon:=1/k$, we obtain that
$$\liminf_{n\to \infty}\frac{M_n(\omega)}{\sqrt{2\log n}}\geqslant 1\quad \mbox{a.e.}$$
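None of this is needed for the proof, but it is easy to watch the ratio $M_n/\sqrt{2\log n}$ settle near $1$ numerically (a quick sketch with NumPy; the sample sizes and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def max_ratio(n, rng):
    """M_n / sqrt(2 log n) for the maximum of n iid N(0,1) samples."""
    samples = rng.standard_normal(n)
    return samples.max() / np.sqrt(2.0 * np.log(n))

# The ratio creeps up towards 1 as n grows; convergence is slow,
# since the gap is roughly (log log n) / (4 log n).
for n in (10**3, 10**5, 10**6):
    print(n, max_ratio(n, rng))
```

The ratios approach $1$ from below rather than hitting it, which is consistent with the second-order correction to $M_n$ from extreme value theory.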
So to write your definition more precisely, I think that the book is saying the following:
Let $X_n$ be a sequence of random variables, and $X$ be another random variable on the same probability space $(\Omega, \mathcal F, \mathbb P)$. Let
$A = \{\omega \in \Omega : \text{ for all }\epsilon > 0, \text{ there exists } N(\omega, \epsilon) \in \mathbb N \text{ such that } n \geq N \Rightarrow \left|X(\omega) - X_n(\omega)\right| < \epsilon\}$.
Then we say that $X_n$ converges almost surely to $X$ if $\mathbb P[A] = 1$.
This is equivalent to the following statement: $X_n$ is said to converge to $X$ almost surely if there exists a set $E \in \mathcal F$ with $\mathbb P[E] = 0$ such that for all $\omega \in \Omega \setminus E$ and for all $\epsilon > 0$, there exists some $N(\omega, \epsilon)\in \mathbb N$ such that $n \geq N \Rightarrow \left|X(\omega) - X_n(\omega)\right| < \epsilon$.
The problem with the statement in your example is that you've gotten the order of the quantifiers mixed up. In the definition, $N$ depends on both $\omega$ and $\epsilon$. It is true that $\mathbb P\left(\{\omega \in \Omega : |X_N(\omega) - 0| \geq \epsilon\}\right) < 1$ for a fixed $N$, but this $N$ does not depend on $\omega$, which it needs to.
A quick bit about intuition: $X_n$ converges almost surely to $X$ if it converges pointwise everywhere except on a set of measure $0$. This convergence need not be uniform. Just as a sequence of real-valued functions $f_n:\mathbb R \rightarrow \mathbb R$ may converge at every $x \in \mathbb R$ while requiring a different $N$ for each $x$, a sequence $X_n$ that converges almost surely to $X$ may still fail to converge uniformly: for different $\omega$'s, you need a different $N$.
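To make the non-uniformity concrete, here is a toy example of my own (not from the book): take $\Omega = [0,1)$ with Lebesgue measure and $X_n(\omega) = \omega^n$. Then $X_n \to 0$ pointwise (hence almost surely), but the first good index $N(\omega, \epsilon)$ blows up as $\omega \to 1$:

```python
def first_good_index(omega, eps):
    """Smallest N with |X_n(omega)| = omega**n < eps for all n >= N.

    Since omega**n is decreasing in n for 0 <= omega < 1, it suffices
    to find the first n with omega**n < eps.
    """
    n = 1
    while omega**n >= eps:
        n += 1
    return n

# The closer omega is to 1, the longer you must wait.
for omega in (0.5, 0.9, 0.99):
    print(omega, first_good_index(omega, 0.1))
```

No single $N$ works for all $\omega$ at once, which is exactly the failure of uniformity described above.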
Hope this helps.
Best Answer
Without independence this is obviously wrong: if $X_n=X_1$ for all $n$, then $M_n=X_1$, so $M_n/\sqrt{2\log n}\to 0$ and the $\limsup$ is $0$, not $1$.
Assume independence. The proof for limsup does not require any probability theory.
If $\limsup \frac {a_n} {b_n}=1$ for some positive increasing sequence $b_n \to \infty$, then $\limsup \frac {c_n} {b_n}=1$, where $c_n =\max \{a_1,a_2,\dots,a_n\}$. Indeed, $c_n\geqslant a_n$ gives $\limsup c_n/b_n\geqslant 1$; conversely, given $\varepsilon>0$, pick $M$ with $a_m\leqslant (1+\varepsilon)b_m$ for all $m\geqslant M$; then for $n\geqslant M$ we have $c_n\leqslant \max\{c_M,(1+\varepsilon)b_n\}$ since $b_n$ is increasing, and $c_M/b_n\to 0$, so $\limsup c_n/b_n\leqslant 1+\varepsilon$.
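A quick numerical illustration of this deterministic fact, with toy sequences of my own choosing: take $b_n = \sqrt n$, and $a_n = b_n$ when $7 \mid n$, $a_n = 0$ otherwise, so that $\limsup a_n/b_n = 1$. The running maxima $c_n$ then have $c_n/b_n$ hitting $1$ at every multiple of $7$ and dipping only slightly in between, because $b_n$ increases to infinity:

```python
import math

N = 10_000
b = [math.sqrt(n) for n in range(1, N + 1)]
a = [b[i] if (i + 1) % 7 == 0 else 0.0 for i in range(N)]

# Running maxima c_n = max(a_1, ..., a_n).
c = []
best = 0.0
for x in a:
    best = max(best, x)
    c.append(best)

ratios = [c[i] / b[i] for i in range(N)]
print(max(ratios[N // 2 :]))  # equals 1: attained at every multiple of 7
print(min(ratios[N // 2 :]))  # dips stay close to 1 because b_n grows
```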