Your estimate gives that for each positive $\varepsilon$, we have
$$\sum_i\mathbb P\left(M_{2^{i}}>(1+\varepsilon)\sqrt{2(i+1)\log 2}\right)<\infty$$
(we use the fact that $\mathbb P(M_n>x)\leqslant n\mathbb P(X_1> x)$ and that $\log 2^{i+1}=(i+1)\log 2$).
By the Borel–Cantelli lemma, we thus deduce that $\limsup_iM_{2^{i}}/\sqrt{2(i+1)\log 2}\leqslant 1+\varepsilon$ almost surely. Taking $\varepsilon:=1/k$ and letting $k\to\infty$, we get $\limsup_iM_{2^{i}}/\sqrt{2(i+1)\log 2}\leqslant 1$ almost surely. To conclude, notice that if $2^i\leqslant n\lt 2^{i+1}$, then $M_n\leqslant M_{2^{i+1}}$ and $\log n\geqslant i\log 2$, hence
$$\frac{M_n}{\sqrt{2\log n}}\leqslant \frac{M_{2^{i+1}}}{\sqrt{2i\log 2}}=\frac{M_{2^{i+1}}}{\sqrt{2(i+2)\log 2}}\cdot\sqrt{\frac{i+2}{i}},$$
and the last factor tends to $1$.
For the $\liminf$, fix a positive $\varepsilon$ and define $$A_n:=\left\{\frac{M_n}{\sqrt{2\log n}}\lt 1-\varepsilon\right\}.$$
A use of the estimate on the tail of the normal distribution shows that the series $\sum_n \mathbb P(A_n)$ is convergent, hence by the Borel–Cantelli lemma, we have $\mathbb P\left(\limsup_n A_n\right)=0$. This means that for almost every $\omega$, we can find $N=N(\omega)$ such that if $n\geqslant N(\omega)$, then $$\frac{M_n(\omega)}{\sqrt{2\log n}}\geqslant 1-\varepsilon.$$
This implies that $$\liminf_{n\to \infty}\frac{M_n(\omega)}{\sqrt{2\log n}}\geqslant 1-\varepsilon \quad \mbox{a.s.}$$
Here again, taking $\varepsilon:=1/k$ and letting $k\to\infty$, we obtain
$$\liminf_{n\to \infty}\frac{M_n(\omega)}{\sqrt{2\log n}}\geqslant 1\quad \mbox{a.s.}$$
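As a quick numerical sanity check (not part of the proof), one can simulate the running maximum of i.i.d. standard normals and compare it with $\sqrt{2\log n}$. A minimal sketch in Python, assuming NumPy is available; the sample sizes and seed are my own choices:

```python
import numpy as np

# Simulate i.i.d. standard normal X_1, ..., X_N and track M_n = max(X_1, ..., X_n).
rng = np.random.default_rng(0)
N = 100_000
x = rng.standard_normal(N)
running_max = np.maximum.accumulate(x)

# Compare M_n with sqrt(2 log n) at a few checkpoints.
checkpoints = np.array([1_000, 10_000, 100_000])
ratios = running_max[checkpoints - 1] / np.sqrt(2 * np.log(checkpoints))
for n, r in zip(checkpoints, ratios):
    print(f"n = {n:>6}: M_n / sqrt(2 log n) = {r:.3f}")
```

At these sample sizes the ratio typically sits around $0.8$–$0.9$ rather than at $1$: the convergence is very slow, since the second-order correction to the maximum is of order $\log\log n/\sqrt{\log n}$.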
This can be proved by an application of the first Borel–Cantelli lemma, which states: if $(A_n)_{n\in\mathbb{N}}$ is a (not necessarily independent) sequence of events such that $\sum_{n\in\mathbb{N}}\mathbb{P}(A_n)<\infty$, then $\mathbb{P}(A_n\text{ happens infinitely often})=0$. We will write "$A_n$ i.o." to mean "$A_n$ happens infinitely often".
Now consider that
\begin{align}
\limsup_n\frac{\max\{X_1, \dots, X_n\}}{n}\leq 1\ \text{a.s.} & \Leftrightarrow \mathbb{P}\left(\limsup_n\frac{\max\{X_1, \dots, X_n\}}{n} > 1\right)=0 \\
& \Leftarrow \mathbb{P}\left(\forall m,\ \sup_{n>m}\frac{\max\{X_1, \dots, X_n\}}{n}>1\right)=0 \\
& \Leftrightarrow \mathbb{P}\left(\frac{\max\{X_1, \dots, X_n\}}{n}>1\ \text{ i.o.}\right)=0,
\end{align}
where the middle implication holds because $\left\{\limsup_n\frac{\max\{X_1, \dots, X_n\}}{n} > 1\right\}\subseteq\left\{\forall m,\ \sup_{n>m}\frac{\max\{X_1, \dots, X_n\}}{n}>1\right\}$.
The final equivalence can be seen from the fact that if $\frac{\max\{X_1, \dots, X_n\}}{n}>1$ happens for only finitely many $n$ (say the largest such $n$ is $n'$), then $\sup_{n>n'}\frac{\max\{X_1, \dots, X_n\}}{n}\leq 1$.
Hence, applying the Borel–Cantelli lemma, we are left to show that $\sum_{n\in\mathbb{N}}\mathbb{P}\left(\frac{\max\{X_1, \dots, X_n\}}{n}>1\right)<\infty$.
We can estimate, using the union bound:
\begin{align}
\mathbb{P}\left(\frac{\max\{X_1, \dots, X_n\}}{n}>1\right)&=\mathbb{P}(\max\{X_1, \dots, X_n\}>n)\\
&=\mathbb{P}(\{X_1>n\}\cup \{X_2>n\}\cup \dots \cup \{X_n>n\})\\
&\leq\mathbb{P}(X_1>n)+\mathbb{P}(X_2>n)+\dots+\mathbb{P}(X_n>n)\\
&=n\,\mathbb{P}(X_1>n),
\end{align}
since the $X_i$ are i.i.d.
And we see exactly why the conditions were given as they were: the assumption $\sum_{n\in\mathbb{N}}n\,\mathbb{P}(X_1>n)<\infty$ implies that $\sum_{n\in\mathbb{N}}\mathbb{P}\left(\frac{\max\{X_1, \dots, X_n\}}{n}>1\right)<\infty$, and so we conclude, by the Borel–Cantelli lemma, that $\limsup_n\frac{\max\{X_1, \dots, X_n\}}{n}\leq 1$ almost surely.
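To make the summability condition concrete, here is a sketch with a distribution of my own choosing (not from the thread): for $X_i\sim\mathrm{Exp}(1)$ we have $\mathbb{P}(X_1>n)=e^{-n}$, so $\sum_n n\,\mathbb{P}(X_1>n)=\sum_n n e^{-n}=e/(e-1)^2<\infty$ and the lemma applies. A simulation also shows $\max\{X_1,\dots,X_n\}/n$ far below $1$ for large $n$, as expected:

```python
import math
import random

# Exp(1) tail: P(X_1 > n) = e^{-n}. The series sum_n n*e^{-n} converges
# to e/(e-1)^2 (differentiate the geometric series), so the hypothesis
# sum_n n*P(X_1 > n) < infinity of the lemma holds.
tail_series = sum(n * math.exp(-n) for n in range(1, 60))
closed_form = math.e / (math.e - 1) ** 2
print(f"sum n*e^-n ~ {tail_series:.6f}, closed form {closed_form:.6f}")

# Monte Carlo: max{X_1, ..., X_n}/n is tiny for large n, since the max of
# n Exp(1) samples grows like log n, far slower than n.
random.seed(0)
n = 10_000
m = max(random.expovariate(1.0) for _ in range(n))
print(f"max/n = {m / n:.5f}")
```

The max of $10{,}000$ Exp(1) samples is near $\log 10^4\approx 9.2$, so the ratio is on the order of $10^{-3}$.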
Best Answer
If $F(x) = 1 - e^{-x}$ for $x > 0$ is the CDF of each $X_i$, the CDF of $M_n$ is $F_{M_n}(x) = F(x)^n = (1 - e^{-x})^n$ for $x > 0$. Note that $\ln(F_{M_n}(x)) = n \ln(1 - e^{-x})$, and since $- t - t^2 < \ln(1-t) < -t$ for $0 < t < 0.683$, for any $c>0$ we have $-n^{1-c} - n^{1-2c} \le \ln(F_{M_n}(c \ln n)) \le -n^{1-c}$ for $n$ large enough.

If $c < 1$, this says $P\left( \frac{M_n}{\ln n} \le c\right) \le e^{-n^{1-c}}$, and $\sum_n e^{-n^{1-c}} < \infty$, so almost surely $\frac{M_n}{\ln n} \le c$ for only finitely many $n$. If $c > 1$, $P\left( \frac{M_n}{\ln n} \le c \right) \ge e^{-n^{1-c} - n^{1-2c}} \to 1$ as $n \to \infty$, so almost surely $\frac{M_n}{\ln n} \le c$ for infinitely many $n$ (since $P(\limsup_n A_n)\ge \limsup_n P(A_n)=1$ by the reverse Fatou lemma).

Together these give $\liminf_n \frac{M_n}{\ln n} = 1$ almost surely: finitely many exceptions forces $\liminf \ge c$ for each $c<1$, while infinitely many occurrences forces $\liminf \le c$ for each $c>1$. However, it's not so clear to me that $\limsup_n \frac{M_n}{\ln n} = 1$ almost surely.
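The two-sided bound on $F_{M_n}(c\ln n)=(1-n^{-c})^n$ is easy to check numerically; a small sketch, with $n$ and the values of $c$ chosen by me for illustration:

```python
import math

def cdf_Mn(n: int, c: float) -> float:
    """CDF of M_n = max of n i.i.d. Exp(1) variables at x = c*ln(n):
    F_{M_n}(c ln n) = (1 - e^{-c ln n})^n = (1 - n^{-c})^n."""
    return (1.0 - n ** (-c)) ** n

# Check  exp(-n^{1-c} - n^{1-2c}) <= F_{M_n}(c ln n) <= exp(-n^{1-c})
# for one c < 1 and one c > 1 (valid here since t = n^{-c} < 0.683).
n = 100
for c in (0.5, 2.0):
    val = cdf_Mn(n, c)
    upper = math.exp(-(n ** (1 - c)))
    lower = math.exp(-(n ** (1 - c)) - n ** (1 - 2 * c))
    print(f"c = {c}: {lower:.3e} <= {val:.3e} <= {upper:.3e}")
```

For $c=0.5$ the CDF at $c\ln n$ is tiny ($(0.9)^{100}\approx e^{-10.5}$, squeezed between $e^{-11}$ and $e^{-10}$), while for $c=2$ it is already close to $1$, matching the two regimes used in the argument.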