Let $M_n=\max\{X_k;1\leqslant k\leqslant n\}$ and let us first recall how the first-order asymptotics of $M_n$ is obtained. For every $x$,
$$
P[M_n\leqslant x]=P[X_1\leqslant x]^n,
$$
and standard estimates of the Gaussian tail show that, as $x\to\infty$,
$$
P[X_1\gt x]=1/\theta(x),\qquad \theta(x)\sim x\sqrt{2\pi}\mathrm e^{x^2/2}.
$$
Thus, if $\theta(u_n)\ll n$, then $P[M_n\leqslant u_n]\to0$ while, if $\theta(v_n)\gg n$, then $P[M_n\leqslant v_n]\to1$. This holds with $u_n=(1-\varepsilon)\sqrt{2\log n}$ and $v_n=(1+\varepsilon)\sqrt{2\log n}$, for every positive $\varepsilon$, hence $M_n/\sqrt{2\log n}$ converges in probability to $1$.
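As a quick sanity check of this first-order result, here is a small simulation sketch (NumPy; the seed and sample sizes are arbitrary choices, not part of the argument):

```python
import numpy as np

# Sketch: compare M_n with sqrt(2 log n) for a few sample sizes.
rng = np.random.default_rng(0)
for n in [10**3, 10**5, 10**7]:
    m_n = rng.standard_normal(n).max()
    print(n, m_n / np.sqrt(2 * np.log(n)))
# The printed ratios should be close to 1, approaching it only slowly as n grows
# (the correction is of order log(log n) / log n, as the next computation shows).
```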
To go further, assume that $x_n=(1+z_n)\sqrt{2\log n}$, with $z_n\to0$. Then,
$$
n^{-1}\theta(x_n)\sim2\sqrt\pi\exp\left( (2z_n+z_n^2)\log n+\tfrac12\log\log n\right).
$$
In particular, if $2z_n\log n=t-\tfrac12\log\log n$ for some fixed $t$, then $z_n^2\log n\to0$, hence $n^{-1}\theta(x_n)\sim\sqrt{4\pi}\,\mathrm e^{t}$ and $P[M_n\leqslant x_n]\to\exp(-\mathrm e^{-t}/\sqrt{4\pi})$. This means that
$$
T_n=2\log n\left(\frac{M_n}{\sqrt{2\log n}}-1\right)+\frac12\log\log n+\frac12\log(4\pi)
$$
converges in distribution to a random variable $T$ such that, for every $t$,
$$
P[T\leqslant t]=\exp(-\mathrm e^{-t}).
$$
In particular, since the variable $U_n$ below can be written as $U_n=\dfrac{T_n-\frac12\log\log n-\frac12\log(4\pi)}{2\log\log n}$ and $(T_n)$ converges in distribution (hence is tight),
$$
U_n=\frac{\log n}{\log\log n}\left(\frac{M_n}{\sqrt{2\log n}}-1\right)\to-\frac14\ \text{in probability.}
$$
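To illustrate these two limits numerically, here is a hedged simulation sketch (NumPy; the value of $n$, the number of repetitions and the seed are arbitrary, the match with the Gumbel CDF is only rough at this $n$, and the convergence of $U_n$ is very slow):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10**5, 2000
L = np.log(n)

# reps independent copies of M_n, generated one at a time to keep memory small
m = np.array([rng.standard_normal(n).max() for _ in range(reps)])

t_n = 2 * L * (m / np.sqrt(2 * L) - 1) + 0.5 * np.log(L) + 0.5 * np.log(4 * np.pi)
u_n = (L / np.log(L)) * (m / np.sqrt(2 * L) - 1)

# Empirical CDF of T_n versus the Gumbel CDF exp(-exp(-t))
for t in [-1.0, 0.0, 1.0, 2.0]:
    print(t, (t_n <= t).mean(), np.exp(-np.exp(-t)))

# U_n tends to -1/4 only at rate 1/log(log n), so at this n
# its median is still noticeably below -1/4.
print(np.median(u_n))
```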
Edit: For every $n\geqslant2$, consider the random variable
$$
V_n=\frac{\log n}{\log\log n}\left(\frac{X_n}{\sqrt{2\log n}}-1\right).
$$
The Gaussian tail asymptotics used above show that, for every fixed $t$,
$$
P[V_n\geqslant t]\sim\frac1{2\sqrt\pi\cdot n\cdot(\log n)^{1/2+2t}}.
$$
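This equivalent can be checked against the exact Gaussian tail (a sketch using SciPy's `norm.sf`; the values of $n$ and $t$ are arbitrary, and the ratio converges only slowly):

```python
import numpy as np
from scipy.stats import norm

t = 0.25
for n in [10**3, 10**6, 10**12]:
    L = np.log(n)
    x = np.sqrt(2 * L) * (1 + t * np.log(L) / L)   # V_n >= t  iff  X_n >= x
    exact = norm.sf(x)                             # exact Gaussian tail P[X_n >= x]
    approx = 1 / (2 * np.sqrt(np.pi) * n * L ** (0.5 + 2 * t))
    print(n, exact / approx)
# The ratio tends to 1, but slowly (the error terms involve log(log n) / log n).
```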
If $t\lt1/4$, the series $\sum\limits_nP[V_n\geqslant t]$ diverges, hence the Borel–Cantelli lemma (the difficult part, which uses the independence of the $X_n$) shows that, almost surely, $V_n\geqslant t$ for infinitely many $n$. Since $M_n\geqslant X_n$ and hence $U_n\geqslant V_n$, almost surely $U_n\geqslant t$ for infinitely many $n$.
If $t\gt1/4$, the series $\sum\limits_nP[V_n\geqslant t]$ converges, hence the Borel–Cantelli lemma (the easy part) shows that, almost surely, $V_n\leqslant t$ for every $n$ large enough. By independence, and since $P[V_n\leqslant t]\gt0$ for every $n$, it follows that, with positive probability, $V_n\leqslant t$ for every $n\geqslant2$. Since the map $x\mapsto\sqrt{2\log x}\left(1+t\frac{\log\log x}{\log x}\right)$ is increasing for $x$ large enough, on this event $U_n\leqslant t$ for every $n$ large enough, hence $\limsup U_n\leqslant t$ with positive probability. Since $M_n\to\infty$ almost surely, $\limsup U_n$ does not depend on $(X_i)_{i\leqslant k}$, for any fixed $k$; it is therefore a tail random variable and, by Kolmogorov's zero–one law, $[\limsup U_n\leqslant t]$ has probability $0$ or $1$, hence probability $1$.
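The convergence/divergence dichotomy invoked in both cases is the usual criterion for Bertrand series, obtained by comparison with an integral:
$$
\sum_{n\geqslant2}\frac1{n(\log n)^{p}}<\infty\iff p>1,\qquad\text{since}\qquad\int_2^{\infty}\frac{\mathrm dx}{x(\log x)^{p}}=\int_{\log2}^{\infty}\frac{\mathrm du}{u^{p}},
$$
applied here with $p=\tfrac12+2t$, which is $\gt1$ exactly when $t\gt\tfrac14$.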
Finally,
$$
\limsup\limits_{n\to\infty}U_n=+\frac14\ \text{almost surely.}
$$
The solution was mostly covered by the comments, but here is a more complete answer:
Your solution does not show a.s. convergence, as convergence in distribution does not imply a.s. convergence. What needs to be shown is that (denoting $\max(X_1, \dots, X_n)$ by $M_n$)
$$
P(\lim_{n \rightarrow \infty} M_n = a) = 1
$$
This is equivalent to the statement that, for every $\epsilon>0$ (the equivalence is not directly evident but can be shown),
$$
P(| M_n -a | > \epsilon \text{ infinitely often as } n\rightarrow \infty) = 0
$$
By the Borel–Cantelli lemma, this holds if
$$
\sum^{\infty}_{n=1}P(| M_n -a | > \epsilon) <\infty
$$
As you showed yourself, $P(| M_n -a | > \epsilon) = \left(\frac{a-\epsilon}{a} \right)^n$ for $0<\epsilon<a$, so you have
$$
\sum^{\infty}_{n=1}P(| M_n -a | > \epsilon) = \frac{a-\epsilon}{\epsilon}<\infty
$$
and we are done.
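A quick numerical illustration (a NumPy sketch; it assumes, as the formula above suggests, that the $X_i$ are uniform on $(0,a)$, with $a=3$ chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(2)
a = 3.0
x = rng.uniform(0.0, a, size=10**6)
running_max = np.maximum.accumulate(x)   # M_n along a single sample path

for n in [10, 10**2, 10**4, 10**6]:
    print(n, running_max[n - 1])
# The values increase towards a = 3 along (almost) every sample path.
```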
Best Answer
For a standard exponential (writing $Y_n$ for the maximum of $n$ i.i.d. standard exponential variables), we know that $Z_n=Y_n-\log(n)$ converges in distribution to a nondegenerate limit (a Gumbel). This gives that $\frac{Z_n}{\log(n)}$ converges to zero in probability, and a Borel–Cantelli argument (similar in spirit to the one above) upgrades this to almost sure convergence; in turn, $\frac{Y_n}{\log(n)}$ converges almost surely to $1.$
So the almost-sure convergence of quantities like this comes out of the extreme-value analysis. In general, for a distribution in the Gumbel domain of attraction (roughly, an unbounded upper tail that decays faster than any power law), $\frac{Y_n-b_n}{a_n}$ converges in distribution to a Gumbel. For something like a Gamma, whose tail is exponential up to a polynomial factor, $\sim x^{\alpha-1}e^{-x/\theta}$, we can work out that $a_n=\theta$ and that, to leading order in $n$, $b_n=\theta\log(n).$ So for a Gamma with PDF $\frac{1}{\Gamma(\alpha) \theta^\alpha}x^{\alpha-1}e^{-x/\theta},$ $\frac{Y_n}{\log(n)}$ converges almost surely to $\theta.$
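A corresponding sketch for the Gamma case (NumPy; the shape and scale parameters are arbitrary, and the convergence is slow because of the lower-order $(\alpha-1)\log\log n$ contribution to $b_n$):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, theta = 2.5, 1.7
y = rng.gamma(alpha, theta, size=10**7)
running_max = np.maximum.accumulate(y)   # Y_n along a single sample path

for n in [10**3, 10**5, 10**7]:
    print(n, running_max[n - 1] / np.log(n))
# The ratios drift towards theta = 1.7, but very slowly: the next-order
# contribution to b_n is theta * (alpha - 1) * log(log n), which is still
# sizeable at n = 10**7.
```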