Let $M_n=\max\{X_k;\,1\leqslant k\leqslant n\}$, where the $X_k$ are i.i.d. standard Gaussian random variables, and let us first recall how the first-order asymptotics of $M_n$ is obtained. For every $x$,
$$
P[M_n\leqslant x]=P[X_1\leqslant x]^n,
$$
and standard estimates of the Gaussian tail show that, when $x\to\infty$,
$$
P[X_1> x]=1/\theta(x),\qquad \theta(x)\sim x\sqrt{2\pi}\,\mathrm e^{x^2/2}.
$$
Thus, since $P[M_n\leqslant x]=\left(1-1/\theta(x)\right)^n=\exp\left(-(1+o(1))\,n/\theta(x)\right)$ when $\theta(x)\to\infty$, if $\theta(u_n)\ll n$ then $P[M_n\leqslant u_n]\to0$ while, if $\theta(v_n)\gg n$, then $P[M_n\leqslant v_n]\to1$. This holds with $u_n=(1-\varepsilon)\sqrt{2\log n}$ and $v_n=(1+\varepsilon)\sqrt{2\log n}$, for every positive $\varepsilon$, hence $M_n/\sqrt{2\log n}$ converges in probability to $1$.
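As a sanity check (purely illustrative, not part of the argument), here is a small Monte Carlo sketch in Python; the sample sizes are arbitrary choices, and one should keep in mind that the convergence is only logarithmic in $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 100_000, 200  # hypothetical sizes, chosen only for speed

# reps independent copies of M_n, the max of n standard normals
M = np.array([rng.standard_normal(n).max() for _ in range(reps)])
ratio = M / np.sqrt(2 * np.log(n))

# The mean is near 1 but visibly below it: the -(1/4) log log n / log n
# correction identified below decays very slowly.
print(ratio.mean(), ratio.std())
```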
To go further, assume that $x_n=(1+z_n)\sqrt{2\log n}$, with $z_n\to0$. Then,
$$
n^{-1}\theta(x_n)\sim2\sqrt\pi\exp\left( (2z_n+z_n^2)\log n+\tfrac12\log\log n\right).
$$
In particular, if $2z_n\log n=t-\tfrac12\log\log n$ for some fixed $t$, then $z_n^2\log n\to0$, hence $n^{-1}\theta(x_n)\sim\sqrt{4\pi}\,\mathrm e^{t}$ and $P[M_n\leqslant x_n]\to\exp(-\mathrm e^{-t}/\sqrt{4\pi})$. This means that
$$
T_n=2\log n\left(\frac{M_n}{\sqrt{2\log n}}-1\right)+\frac12\log\log n+\frac12\log(4\pi)
$$
converges in distribution to a random variable $T$ such that, for every $t$,
$$
P[T\leqslant t]=\exp(-\mathrm e^{-t}).
$$
In particular, since $T_n$ is bounded in probability,
$$
U_n=\frac{\log n}{\log\log n}\left(\frac{M_n}{\sqrt{2\log n}}-1\right)=\frac{T_n-\frac12\log\log n-\frac12\log(4\pi)}{2\log\log n}\to-\frac14\ \text{in probability.}
$$
Edit: For every $n\geqslant2$, consider the random variable
$$
V_n=\frac{\log n}{\log\log n}\left(\frac{X_n}{\sqrt{2\log n}}-1\right).
$$
The Gaussian tail asymptotics used above show that, for every fixed $t$,
$$
P[V_n\geqslant t]\sim\frac1{2\sqrt\pi\cdot n\cdot(\log n)^{1/2+2t}}.
$$
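This equivalence can be sanity-checked against the exact Gaussian tail (a sketch using only the standard library; the choice $t=1/4$ and the values of $n$ are arbitrary, and the convergence of the ratio is slow):

```python
import math

def p_exact(n, t):
    # P[V_n >= t] = P[X >= (1 + t log log n / log n) sqrt(2 log n)],
    # with P[X >= x] = erfc(x / sqrt 2) / 2 for a standard normal.
    ln = math.log(n)
    x = (1 + t * math.log(ln) / ln) * math.sqrt(2 * ln)
    return 0.5 * math.erfc(x / math.sqrt(2))

def p_claimed(n, t):
    return 1 / (2 * math.sqrt(math.pi) * n * math.log(n) ** (0.5 + 2 * t))

for n in (10**4, 10**6, 10**8):
    print(n, p_exact(n, 0.25) / p_claimed(n, 0.25))  # ratio tends to 1, slowly
```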
If $t<1/4$, the series $\sum\limits_nP[V_n\geqslant t]$ diverges, hence the Borel–Cantelli lemma (the difficult part, which applies since the events are independent) shows that, almost surely, $V_n\geqslant t$ for infinitely many $n$. Since $U_n\geqslant V_n$, almost surely $U_n\geqslant t$ for infinitely many $n$.
If $t>1/4$, the series $\sum\limits_nP[V_n\geqslant t]$ converges, hence the Borel–Cantelli lemma (the easy part) shows that, almost surely, $V_n\leqslant t$ for every $n$ large enough. Thus, with positive probability $V_n\leqslant t$ for every $n$, hence with positive probability $U_n\leqslant t$ for every $n$ large enough. Since $M_n\to\infty$ almost surely, $U_n$ is asymptotically independent of $(X_i)_{i\leqslant k}$, for every $k$. Thus, $\limsup U_n$ is a tail random variable and, by Kolmogorov's zero–one law, $[\limsup U_n\leqslant t]$ has probability $0$ or $1$, hence probability $1$.
Finally,
$$
\limsup\limits_{n\to\infty}U_n=+\frac14\ \text{almost surely.}
$$
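For what it is worth, the dichotomy behind the two Borel–Cantelli steps can be seen numerically: the expected number of indices $n\leqslant N$ with $V_n\geqslant t$ is the partial sum of the series above, which keeps growing (very slowly, at iterated-logarithm speed) when $t<1/4$ and stalls when $t>1/4$. A sketch with arbitrary values of $t$ and $N$:

```python
import math

def p_exact(n, t):
    # exact P[V_n >= t], as in the previous snippet
    ln = math.log(n)
    x = (1 + t * math.log(ln) / ln) * math.sqrt(2 * ln)
    return 0.5 * math.erfc(x / math.sqrt(2))

checkpoints = {10**3, 10**4, 10**5, 10**6}
for t in (0.1, 0.4):  # one value on each side of the critical 1/4
    s = 0.0
    for n in range(2, 10**6 + 1):
        s += p_exact(n, t)
        if n in checkpoints:
            print(t, n, round(s, 3))  # still growing for t=0.1, stalled for t=0.4
```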
To solve problems about maxima of i.i.d. random variables, our internal autopilot should tell us to apply the identity:
$$\mathbb{P}\left( \max_{1 \leq i \leq n} |X_i| \geq x \right) = 1 - \left( \mathbb{P}( |X_1| < x) \right)^n.$$
Following this idea, we rewrite the quantity of interest, by integration by parts (the tail formula for the nonnegative random variable $\max_{1\leq i\leq n}|X_i|$), as
$$\mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] = \int_0^\infty \mathbb{P} \left( \max_{1 \leq i \leq n} |X_i| \geq x \right) dx.$$
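Both identities can be checked numerically before doing any estimation. A minimal sketch, assuming NumPy and with arbitrary parameters ($n=50$): the Monte Carlo estimate of the expectation should agree with the quadrature of the tail formula.

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n, reps = 50, 100_000  # hypothetical choices

# Left-hand side: Monte Carlo estimate of E[max |X_i|].
mc = np.abs(rng.standard_normal((reps, n))).max(axis=1).mean()

# Right-hand side: quadrature of P(max >= x) = 1 - P(|X_1| < x)^n,
# with P(|X_1| < x) = erf(x / sqrt 2); the tail is negligible past x = 10.
x = np.linspace(0.0, 10.0, 20_001)
tail = 1 - np.vectorize(math.erf)(x / math.sqrt(2)) ** n
quad = float(((tail[1:] + tail[:-1]) / 2 * np.diff(x)).sum())

print(mc, quad)  # the two estimates agree to about two decimals
```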
Before plugging in the autopilot identity, we should also make sure we know what happens to the term $\mathbb{P}(| X_1 |< x)= 1 - \mathbb{P}(|X_1| \geq x)$. Since the random variables are standard Gaussians:
$$\mathbb{P}(|X_1| \geq x) = \frac{2}{\sqrt{2\pi}}\int_0^{\infty} e^{-\frac{(y+x)^2}{2}}\, dy \leq e^{-x^2/2} \qquad (x \geq 0).$$
But this bound goes in the wrong direction (it would lead to an upper bound on the expectation, not a lower bound). Instead, we will use
$$\mathbb{P}(|X_1| \geq x) \geq \frac{2}{\sqrt{2\pi}}\int_0^{1} e^{-\frac{(y+x)^2}{2}}\, dy \geq c e^{- x^2},$$
for some $c>0$: indeed $(y+x)^2\leq 2y^2+2x^2\leq 2x^2+2$ for $0\leq y\leq 1$, so one may take $c=\frac{2}{e\sqrt{2\pi}}$. In this way we find that
\begin{align*}
\mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] & = \int_0^\infty 1 - \left( 1 - \mathbb{P}( |X_1| \geq x) \right)^n d x \\
& \geq \int_0^\infty 1 - \left( 1 - ce^{-|x|^2} \right)^n d x.
\end{align*}
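As an aside, the two tail bounds used here are easy to verify numerically (a sketch using only the standard library, with the explicit constant $c$ computed above):

```python
import math

c = 2 / (math.e * math.sqrt(2 * math.pi))  # ≈ 0.29, the constant derived above

for x in (0.5, 1.0, 2.0, 3.0, 4.0):
    p = math.erfc(x / math.sqrt(2))          # exact P(|X_1| >= x)
    lower, upper = c * math.exp(-x * x), math.exp(-x * x / 2)
    assert lower <= p <= upper
    print(x, lower, p, upper)
```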
We are almost done. We see that the integrand is close to $1$ for small $x$ and close to $0$ for large $x$. Bernoulli's formula for the exponential, $(1-a/n)^n\approx e^{-a}$, tells us that the location at which we pass from $1$ to $0$ (the front, so to speak) is at roughly $x\sim \sqrt{\log n}$. Of course we have no clue how steep the front is. Luckily, we do not need any particular understanding: a magical change of variables $x =\sqrt{\log n}\, u$ removes all problems.
\begin{align*} \int_0^\infty 1 - \left( 1 - ce^{-x^2} \right)^n d x & = \sqrt{\log n} \int_0^\infty 1 - \left( 1 - \frac{c}{n^{u^2}} \right)^n d u \\
& \geq \sqrt{\log n} \int_0^1 1 - \left( 1 - \frac{c}{n} \right)^n d u,
\end{align*}
since $u\mapsto n^{-u^2}$ is decreasing, so $n^{-u^2}\geq n^{-1}$ for $0<u\leq1$. Moreover, $\log(1-s)\leq -s$ gives $\left(1-\frac{c}{n}\right)^n\leq e^{-c}$ for every $n$. Overall:
\begin{align*} \mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] & \geq \left(1-e^{-c}\right)\sqrt{\log n},
\end{align*}
which proves the lower bound.
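To see the bound at work, one can evaluate the expectation by quadrature of the tail formula (a sketch with arbitrary truncation and step size); the ratio $\mathbb{E}[\max_i|X_i|]/\sqrt{\log n}$ stays bounded away from $0$, and the asymptotics recalled above suggest it slowly approaches $\sqrt2$.

```python
import math

def expected_max_abs(n, hi=12.0, steps=20_000):
    # E[max |X_i|] by trapezoidal quadrature of the tail formula,
    # truncated at x = hi, where the integrand is utterly negligible.
    h = hi / steps
    s = 0.0
    for k in range(steps + 1):
        x = k * h
        w = 0.5 if k in (0, steps) else 1.0
        s += w * (1 - math.erf(x / math.sqrt(2)) ** n)
    return s * h

for n in (10, 100, 1_000, 10_000):
    print(n, expected_max_abs(n) / math.sqrt(math.log(n)))
```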
Best Answer
$$ \Pr(\max>x) = 1-\Pr(\max\le x) = 1-\Pr(\text{for }i=1,\ldots, n,\quad X_i\le x) $$ $$ = 1-\Big(\Pr(X_1\le x)\Big)^n = 1 - \Big(\Phi(x)\Big)^n. $$
If you mean a computationally efficient lower bound, it might make sense to ask about computationally efficient upper bounds on $\Phi(x)$. Bounds of that kind are well known (for instance, the Gaussian tail estimates used above).
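For instance, assuming SciPy is available, $1-\Phi(x)^n$ can be computed without catastrophic cancellation by working with $\log\Phi(x)$ (scipy.special.log_ndtr) and expm1; a sketch:

```python
import numpy as np
from scipy.special import log_ndtr  # log Phi, accurate far in the tail

def p_max_exceeds(x, n):
    """P(max of n iid standard normals > x) = 1 - Phi(x)^n, computed stably."""
    # n * log Phi(x) is a tiny negative number; expm1 keeps full precision.
    return -np.expm1(n * log_ndtr(x))

print(p_max_exceeds(6.0, 10**6))   # ≈ 9.9e-4
print(p_max_exceeds(10.0, 10**9))  # ≈ 7.6e-15; naive 1 - Phi(10)**n returns 0.0
```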