A mostly-worked-out answer to the lower bound in part a:
$$E[\max_i X_i]=E[\max_i X_i 1_{\max_i X_i \geq 0}]+E[\max_i X_i 1_{\max_i X_i<0}].$$
We want to discard the negative piece. Intuitively, it is unlikely to be nonzero at all and it has bounded expectation. More rigorously, it is nonzero with probability $2^{-n}$ (all $n$ variables must be negative), and it is pointwise decreasing in magnitude as $n$ grows, so it is dominated by the integrable variable $|X_1|$; by dominated convergence its expectation is $o(1)$. Hence
$$E[\max_i X_i] = E[\max_i X_i 1_{\max_i X_i \geq 0}] + o(1) \\
= \int_0^\infty \left(1-\Phi(t)^n\right) dt + o(1),$$
using the hint and the standard tail formula $E[Y] = \int_0^\infty P(Y > t)\, dt$ for nonnegative random variables $Y$. Denote the integral by $I$.
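This chain of identities is easy to sanity-check numerically; here is a quick Monte Carlo sketch (the sample count, grid size, and cutoff are arbitrary choices):

```python
import math
import random

random.seed(0)

def norm_cdf(t):
    # standard normal CDF Phi(t), via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

n, trials = 10, 100_000

# Monte Carlo estimate of E[max_i X_i * 1{max_i X_i >= 0}]
acc = 0.0
for _ in range(trials):
    m = max(random.gauss(0.0, 1.0) for _ in range(n))
    if m >= 0.0:
        acc += m
mc = acc / trials

# Riemann-sum approximation of I = integral_0^infty (1 - Phi(t)^n) dt
dt, T = 1e-3, 10.0
I = sum((1.0 - norm_cdf(k * dt) ** n) * dt for k in range(int(T / dt)))

print(mc, I)
```

The two printed numbers should agree to roughly two decimal places.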
Next
$$I \geq \int_0^{\sqrt{\log(n)}} \left(1-\Phi(t)^n\right) dt$$
since the integrand is nonnegative, so truncating the domain of integration only decreases the integral.
On $[0,1]$ we have the simple bound $1-\Phi(t)^n \geq 1-\Phi(1)^n$. On $[1,\sqrt{\log(n)}]$ we have the standard tail bound (cf. https://mikespivey.wordpress.com/2011/10/21/normaltails/)
$$1-\Phi(t) \geq \frac{1}{\sqrt{2\pi}}\,\frac{t}{t^2+1}\,e^{-t^2/2} \geq \frac{1}{2\sqrt{2\pi}\, t}\,e^{-t^2/2},$$
where the last step uses $t \geq 1$. Hence
$$I \geq 1-\Phi(1)^n + \int_1^{\sqrt{\log(n)}} 1-\left ( 1-\frac{1}{2\sqrt{2\pi}\, t} e^{-t^2/2} \right )^n dt.$$
As for the remaining piece, the quantity $\frac{1}{2\sqrt{2\pi}\,t}e^{-t^2/2}$ is decreasing in $t$, so we get a lower bound by substituting in the upper limit $t = \sqrt{\log(n)}$, where $e^{-t^2/2} = n^{-1/2}$:
$$I \geq 1-\Phi(1)^n+\left(\sqrt{\log(n)}-1\right)\left(1-\left(1-\frac{a_n}{n}\right)^n\right), \qquad a_n := \frac{\sqrt{n}}{2\sqrt{2\pi}\sqrt{\log(n)}}.$$
Since $a_n \to \infty$ and $\left(1-\frac{a_n}{n}\right)^n \leq e^{-a_n} \to 0$, the factor $1-\left(1-\frac{a_n}{n}\right)^n$ is at least $1-\varepsilon =: C$ for large enough $n$ depending on $\varepsilon$. Then we get the bound
$$I \geq C\left(\sqrt{\log(n)}-1\right)+1-\Phi(1)^n.$$
Returning to the original problem we have
$$E[\max_i X_i] \geq C\left(\sqrt{\log(n)}-1\right)+1-\Phi(1)^n+o(1),$$
which gives the lower bound for part a for sufficiently large $n$. A finite collection of $n$ can always be handled (why?), so we are done.
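As a sanity check on the $\sqrt{\log n}$ growth rate, one can simulate the mean maximum and compare it to $\sqrt{2\log n}$ (a Monte Carlo sketch for standard normals; the sample sizes and trial count are arbitrary choices):

```python
import math
import random

random.seed(1)

def avg_max(n, trials=1000):
    # Monte Carlo estimate of E[max_{1<=i<=n} X_i] for iid standard normals
    return sum(max(random.gauss(0.0, 1.0) for _ in range(n))
               for _ in range(trials)) / trials

# ratio of the empirical mean maximum to sqrt(2 log n)
ratios = [avg_max(n) / math.sqrt(2.0 * math.log(n)) for n in (100, 1000, 5000)]
print(ratios)
```

The ratios sit below $1$ but well away from $0$, consistent with a lower bound of order $\sqrt{\log n}$.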
To solve part b we would need to be able to repeat the derivation to get $C=1$, and I'm not really sure how to do that. One idea would be to change variables to $u=\Phi(t)$, which would give
$$\int_0^\infty \left(1-\Phi(t)^n\right) dt = \int_{1/2}^1 (1-u^n)\frac{dt}{du}\, du,$$
where $\frac{dt}{du}$ is the reciprocal of the normal density, written as a function of the normal CDF itself. Perhaps it is possible to get an appropriate series expansion for this quantity to get the result.
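Numerically, the ratio $I/\sqrt{2\log n}$ does creep slowly toward $1$, which is consistent with the hoped-for constant (a sketch; the integration grid and cutoff are arbitrary choices):

```python
import math

def norm_cdf(t):
    # standard normal CDF Phi(t)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def I(n, T=8.0, dt=1e-3):
    # Riemann-sum approximation of integral_0^infty (1 - Phi(t)^n) dt
    return sum((1.0 - norm_cdf(k * dt) ** n) * dt for k in range(int(T / dt)))

# I(n) / sqrt(2 log n) for n = 10^2, ..., 10^6
ratios = [I(10 ** k) / math.sqrt(2.0 * math.log(10.0 ** k)) for k in range(2, 7)]
print(ratios)
```

The sequence increases monotonically and stays below $1$, so the convergence, if it holds, is slow.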
To solve problems involving maxima of iid random variables, our internal autopilot should tell us to apply the identity:
$$\mathbb{P}\left( \max_{1 \leq i \leq n} |X_i| \geq x \right) = 1 - \left( \mathbb{P}( |X_1| < x) \right)^n.$$
Following this idea, we rewrite the quantity of interest, by integration by parts, as
$$\mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] = \int_0^\infty \mathbb{P} \left( \max_{1 \leq i \leq n} |X_i| \geq x \right) dx.$$
Before plugging in the autopilot identity, we should also make sure to know what happens to the term $\mathbb{P}(| X_1 |< x)= 1 - \mathbb{P}(|X_1| \geq x)$. Since we know that the random variables are normalized Gaussians:
$$\mathbb{P}(|X_1| \geq x) = \frac{2}{\sqrt{2\pi}}\int_0^{\infty} e^{-\frac{(y+x)^2}{2}} dy \leq e^{- \frac{x^2}{2}},$$
since $(y+x)^2 \geq x^2 + y^2$ for $x, y \geq 0$.
But this bound goes in the wrong direction (it will lead us to an upper bound on the average, not a lower bound). Instead, we will use
$$\mathbb{P}(|X_1| \geq x) \geq \frac{2}{\sqrt{2\pi}}\int_0^{1} e^{-\frac{(y+x)^2}{2}} dy \geq c e^{- x^2},$$
for some $c>0$: for $y \in [0,1]$ we have $\frac{(y+x)^2}{2} \leq \frac{(x+1)^2}{2} \leq x^2+1$, so $c = \sqrt{2/\pi}\, e^{-1}$ works. In this way we find that
\begin{align*}
\mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] & = \int_0^\infty 1 - \left( 1 - \mathbb{P}( |X_1| \geq x) \right)^n d x \\
& \geq \int_0^\infty 1 - \left( 1 - ce^{-|x|^2} \right)^n d x.
\end{align*}
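The tail lower bound used here can be checked numerically: for a standard normal, $\mathbb{P}(|X_1| \geq x) = \operatorname{erfc}(x/\sqrt{2})$, and the argument only needs some $c>0$; the explicit value $c=\sqrt{2/\pi}\,e^{-1}$ below is one choice assumed for the sketch:

```python
import math

def tail(x):
    # P(|X_1| >= x) for X_1 ~ N(0,1)
    return math.erfc(x / math.sqrt(2.0))

# one explicit constant for the bound P(|X_1| >= x) >= c * exp(-x^2)
c = math.sqrt(2.0 / math.pi) * math.exp(-1.0)

# worst-case ratio of tail(x) to exp(-x^2) over a grid of x values
worst = min(tail(x) / math.exp(-x * x) for x in [k * 0.01 for k in range(1, 501)])
print(c, worst)
```

The worst ratio on the grid stays comfortably above $c$, as the derivation predicts.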
We are almost done. The integrand is close to $1$ for $0 < x \ll 1$ and close to $0$ for $x \gg 1$. The approximation $(1-p)^n \approx e^{-np}$ tells us that the location at which the integrand drops from $1$ to $0$ (the front, so to speak) is at roughly $x \sim \sqrt{\log(n)}$. Of course we have no clue how steep the front is. Luckily, we do not need any particular understanding: a magical change of variables $x = \sqrt{\log(n)}\, u$ removes all problems.
\begin{align*} \int_0^\infty 1 - \left( 1 - ce^{-x^2} \right)^n d x & = \sqrt{\log(n)} \int_0^\infty 1 - \left( 1 - \frac{c}{n^{u^2}} \right)^n d u \\
& \geq \sqrt{\log(n)} \int_0^1 1 - \left( 1 - \frac{c}{n^{u^2}} \right)^n d u.
\end{align*}
For $u \in (0,1]$ we have $n^{-u^2} \geq n^{-1}$, so $\left(1 - \frac{c}{n^{u^2}}\right)^n \leq \left(1-\frac{c}{n}\right)^n \leq e^{-c}$, and the integrand on $(0,1]$ is at least $1-e^{-c} > 0$. Overall:
\begin{align*} \mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] & \geq \left(1-e^{-c}\right)\sqrt{\log(n)},
\end{align*}
which proves the lower bound.
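As a numerical sanity check that $\int_0^\infty 1-(1-ce^{-x^2})^n\,dx$ really grows like $\sqrt{\log n}$ (a sketch; the value $c=0.29$ is an assumed stand-in for the constant $c$ above, and the grid is arbitrary):

```python
import math

c = 0.29  # assumed stand-in for the constant c from the tail bound above

def J(n, T=6.0, dx=1e-3):
    # Riemann-sum approximation of integral_0^infty (1 - (1 - c*exp(-x^2))^n) dx
    return sum((1.0 - (1.0 - c * math.exp(-(k * dx) ** 2)) ** n) * dx
               for k in range(int(T / dx)))

# the ratio J(n) / sqrt(log n) should stay bounded away from 0
ratios = [J(10 ** k) / math.sqrt(math.log(10.0 ** k)) for k in (2, 4, 6)]
print(ratios)
```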
Best Answer
Without loss of generality, we may assume $\sigma^2=1$ (just note that $Y_i := X_i/\sigma$ are independent standard Gaussian random variables); write $Z_n := \max_{1 \leq i \leq n} |X_i|$. By Bernoulli's inequality, we have
$$(1-\mathbb{P}(|X_1| \geq x))^n \geq 1-n \mathbb{P}(|X_1| \geq x).$$ Hence,
$$\begin{align*} \mathbb{E}(Z_n) &= \int_0^{\infty}(1-(1-\mathbb{P}(|X_1| \geq x))^n) \, dx \\ &\leq c + \int_c^{\infty} (1-(1-\mathbb{P}(|X_1| \geq x))^n) \, dx \\ &\leq c+ n \int_c^{\infty} \mathbb{P}(|X_1| \geq x) \, dx \end{align*}$$
for any constant $c>0$. Using the tail estimate for $X_1$, we find
$$\begin{align*} \mathbb{E}(Z_n) &\leq c+ n \sqrt{\frac{2}{\pi}} \int_c^{\infty} \frac{1}{x} \exp \left(- \frac{x^2}{2} \right) \, dx \\ &\leq c+\frac{n}{c} \sqrt{\frac{2}{\pi}} \int_c^{\infty} \exp \left(- \frac{x^2}{2} \right) \, dx. \end{align*}$$
If we choose $c:= \sqrt{2 \log n}$, then $c \geq 1$ for $n \geq 2$ and therefore
$$\begin{align*} \mathbb{E}(Z_n) &\leq c + \frac{n}{c} \sqrt{\frac{2}{\pi}} \int_c^{\infty}x \exp \left(- \frac{x^2}{2} \right) \, dx \\ &= c + \frac{n}{c} \sqrt{\frac{2}{\pi}} e^{-c^2/2} \\ &= \sqrt{2 \log n} + \sqrt{\frac{2}{\pi}} \frac{1}{\sqrt{2 \log n}}. \end{align*}$$
Since $\sqrt{\frac{2}{\pi}}<1<4$, this finishes the proof.
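The final bound is easy to test empirically (a Monte Carlo sketch with $\sigma = 1$; the trial count is an arbitrary choice):

```python
import math
import random

random.seed(2)

def avg_abs_max(n, trials=1000):
    # Monte Carlo estimate of E[Z_n] = E[max_{1<=i<=n} |X_i|]
    return sum(max(abs(random.gauss(0.0, 1.0)) for _ in range(n))
               for _ in range(trials)) / trials

# compare the empirical mean of Z_n with sqrt(2 log n) + sqrt(2/pi)/sqrt(2 log n)
results = [(n,
            avg_abs_max(n),
            math.sqrt(2.0 * math.log(n))
            + math.sqrt(2.0 / math.pi) / math.sqrt(2.0 * math.log(n)))
           for n in (10, 100, 1000)]
print(results)
```

In each case the empirical mean sits below the proved upper bound.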