Without loss of generality, we may assume $\sigma^2=1$ (just note that $Y_i := X_i/\sigma$ are independent standard Gaussian random variables). By Bernoulli's inequality, we have
$$(1-\mathbb{P}(|X_1| \geq x))^n \geq 1-n \mathbb{P}(|X_1| \geq x).$$
Hence,
$$\begin{align*} \mathbb{E}(Z_n) &= \int_0^{\infty}(1-(1-\mathbb{P}(|X_1| \geq x))^n) \, dx \\ &\leq c + \int_c^{\infty} (1-(1-\mathbb{P}(|X_1| \geq x))^n) \, dx \\ &\leq c+ n \int_c^{\infty} \mathbb{P}(|X_1| \geq x) \, dx \end{align*}$$
for any constant $c>0$. Using the tail estimate for $X_1$, we find
$$\begin{align*} \mathbb{E}(Z_n) &\leq c+ n \sqrt{\frac{2}{\pi}} \int_c^{\infty} \frac{1}{x} \exp \left(- \frac{x^2}{2} \right) \, dx \\ &\leq c+\frac{n}{c} \sqrt{\frac{2}{\pi}} \int_c^{\infty} \exp \left(- \frac{x^2}{2} \right) \, dx. \end{align*}$$
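For completeness, the tail estimate used here follows by inserting the factor $\frac{t}{x} \geq 1$ into the integrand: for $x>0$,
$$\mathbb{P}(|X_1| \geq x) = \sqrt{\frac{2}{\pi}} \int_x^{\infty} \exp \left(- \frac{t^2}{2} \right) dt \leq \sqrt{\frac{2}{\pi}} \, \frac{1}{x} \int_x^{\infty} t \exp \left(- \frac{t^2}{2} \right) dt = \sqrt{\frac{2}{\pi}} \, \frac{1}{x} \exp \left(- \frac{x^2}{2} \right).$$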
If we choose $c:= \sqrt{2 \log n}$, then $c \geq 1$ for $n \geq 2$; since $x \geq c \geq 1$ on the remaining domain of integration, we may insert a factor $x$ into the integrand and obtain
$$\begin{align*} \mathbb{E}(Z_n) &\leq c + \frac{n}{c} \sqrt{\frac{2}{\pi}} \int_c^{\infty}x \exp \left(- \frac{x^2}{2} \right) \, dx \\ &= c + \frac{n}{c} \sqrt{\frac{2}{\pi}} e^{-c^2/2} \\ &= \sqrt{2 \log n} + \sqrt{\frac{2}{\pi}} \frac{1}{\sqrt{2 \log n}}. \end{align*}$$
Since $\sqrt{\frac{2}{\pi}}<1$, this gives $\mathbb{E}(Z_n) \leq \sqrt{2 \log n} + \frac{1}{\sqrt{2 \log n}} \leq 2 \sqrt{2 \log n} \leq 4 \sqrt{\log n}$ for all $n \geq 2$, which finishes the proof.
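As a quick numerical sanity check (not part of the proof), one can compare a Monte Carlo estimate of $\mathbb{E}(Z_n)$ with the bound $\sqrt{2\log n}+\sqrt{2/\pi}/\sqrt{2\log n}$ derived above; the helper name `mean_max_abs` and the trial counts below are of course arbitrary:

```python
import math
import random

def mean_max_abs(n, trials=2000, seed=0):
    """Monte Carlo estimate of E[max_{1<=i<=n} |X_i|] for iid standard normals."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(abs(rng.gauss(0.0, 1.0)) for _ in range(n))
    return total / trials

for n in (10, 100, 1000):
    c = math.sqrt(2.0 * math.log(n))
    bound = c + math.sqrt(2.0 / math.pi) / c
    est = mean_max_abs(n)
    # The proven bound should dominate the empirical mean.
    print(f"n={n:4d}  estimate={est:.3f}  bound={bound:.3f}")
```

The bound is not tight for small $n$, but the ratio of estimate to bound approaches $1$ as $n$ grows, consistent with $\mathbb{E}(Z_n)\sim\sqrt{2\log n}$.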
General result: Let $G(y)=\mathbb{P}(|X|\le y)=\frac{2}{\sqrt{2\pi}}\int_0^y e^{-\frac{u^2}{2}}\,du$. Then $$G^2(y)=\frac{2}{\pi}\int_0^y\int_0^y e^{-\frac{u^2+v^2}{2}}\,du\,dv.$$
Switching to polar coordinates (see the note below), we get $G^2(y)\gt \int_0^y e^{-\frac{r^2}{2}}\,r\,dr=1-e^{-\frac{y^2}{2}}$, i.e. $G(y)\gt \left(1-e^{-\frac{y^2}{2}}\right)^{\frac{1}{2}}$.
In general, $\mathbb{P}\left(\max(|X_1|,|X_2|,\dots,|X_M|)\le y\right)=G^M(y)\gt \left(1-e^{-\frac{y^2}{2}}\right)^{\frac{M}{2}}$. Therefore, by Bernoulli's inequality (for $M\ge 2$), $$\mathbb{P}\left(\max(|X_1|,\dots,|X_M|)\gt y\right)\lt 1-\left(1-e^{-\frac{y^2}{2}}\right)^{\frac{M}{2}}\le \frac{M}{2}e^{-\frac{y^2}{2}}.$$
Note: The domain of integration is a square of side $y$. The switch to polar coordinates integrates over the largest sector contained in the square (radius $y$, angle $\frac{\pi}{2}$); since the integrand is positive, restricting to this sector gives the strict inequality.
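The bound $G(y)\gt(1-e^{-y^2/2})^{1/2}$ and the resulting tail estimate can be checked numerically via the identity $G(y)=\operatorname{erf}(y/\sqrt{2})$; a minimal sketch (the value $M=10$ is an arbitrary illustration):

```python
import math

def G(y):
    """G(y) = P(|X| <= y) for a standard normal X, via the error function."""
    return math.erf(y / math.sqrt(2.0))

M = 10  # number of iid variables; arbitrary illustration value
for y in (0.5, 1.0, 2.0, 3.0):
    polar_lower = math.sqrt(1.0 - math.exp(-y * y / 2.0))
    tail = 1.0 - G(y) ** M                     # P(max of M |X_i| > y)
    bernoulli_upper = (M / 2.0) * math.exp(-y * y / 2.0)
    # The polar-coordinate bound and the resulting tail estimate:
    assert G(y) > polar_lower
    assert tail < bernoulli_upper
    print(f"y={y}: G={G(y):.4f} > {polar_lower:.4f}, "
          f"tail={tail:.4f} < {bernoulli_upper:.4f}")
```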
Best Answer
To solve problems with maxima of iid random variables, the internal automatic pilot should tell us to apply the equality: $$\mathbb{P}\left( \max_{1 \leq i \leq n} |X_i| \geq x \right) = 1 - \left( \mathbb{P}( |X_1| < x) \right)^n.$$ Following this idea, we rewrite the quantity of interest, via the layer-cake formula (integration by parts), as $$\mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] = \int_0^\infty \mathbb{P} \left( \max_{1 \leq i \leq n} |X_i| \geq x \right) dx.$$ Before plugging in the autopilot identity, we should also make sure to understand the term $\mathbb{P}(| X_1 |< x)= 1 - \mathbb{P}(|X_1| \geq x)$. Since we know that the random variables are normalized Gaussians: $$\mathbb{P}(|X_1| \geq x) = \frac{2}{\sqrt{2\pi}}\int_0^{\infty} e^{-\frac{(y+x)^2}{2}} dy \leq e^{- \frac{x^2}{2}},$$ using $(y+x)^2 \geq y^2 + x^2$ for $x, y \geq 0$. But this bound goes in the wrong direction (it will lead us to an upper bound on the average, not a lower bound). Instead, we will use $$\mathbb{P}(|X_1| \geq x) \geq \frac{2}{\sqrt{2\pi}}\int_0^{1} e^{-\frac{(y+x)^2}{2}} dy \geq c e^{- x^2},$$ for some $c>0$; indeed, for $y \in [0,1]$ we have $(y+x)^2 \leq (1+x)^2 \leq 2x^2+2$, so the integrand is at least $e^{-x^2-1}$. In this way we find that \begin{align*} \mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] & = \int_0^\infty 1 - \left( 1 - \mathbb{P}( |X_1| \geq x) \right)^n d x \\ & \geq \int_0^\infty 1 - \left( 1 - ce^{-x^2} \right)^n d x. \end{align*} We are almost done. We see that the integrand is close to $1$ for $0 < x \ll 1$ and close to $0$ for $x \gg 1$. The exponential limit $(1-ce^{-x^2})^n \approx e^{-nce^{-x^2}}$ tells us that the location at which the integrand passes from $1$ to $0$ (the front, so to speak) is at roughly $x\sim \sqrt{\log(n)}$. Of course we have no clue how steep the front is. Luckily, we do not need any particular understanding: a magical change of variables $x =\sqrt{\log(n)}\, u$ removes all problems.
\begin{align*} \int_0^\infty 1 - \left( 1 - ce^{-x^2} \right)^n d x & = \sqrt{\log(n)} \int_0^\infty 1 - \left( 1 - \frac{c}{n^{u^2}} \right)^n d u \\ & \geq\sqrt{\log(n)} \int_0^\infty 1- \alpha^{\frac{1}{u^2}} du, \end{align*} where $\alpha \in (0, 1)$ depends on $c$, but not on $n$ or $u$. Now, for every $u \geq 1$, there exists a $\beta > 0$, depending only on $\alpha$, such that $\alpha^{\frac{1}{u^2}} \leq 1 - \beta \frac{1}{u^2}$ (a Taylor expansion of $e^{\log(\alpha)/u^2}$). Dropping the nonnegative contribution of $u \in (0,1)$, we obtain \begin{align*} \mathbb{E} \left[ \max_{1 \leq i \leq n} |X_i| \right] & \geq \sqrt{\log(n)} \int_1^\infty \frac{\beta}{u^2} du = \beta \sqrt{\log(n)}, \end{align*} which proves the lower bound.
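The Taylor-type inequality $\alpha^{1/u^2} \leq 1 - \beta/u^2$ can also be checked numerically. Assuming $\alpha = 1/2$ for concreteness, the explicit choice $\beta = \frac{1}{2}\log(1/\alpha)$ works once $u^2 \geq \log(1/\alpha)$ (it follows from $e^t \leq 1 + t/2$ on $[-1,0]$); a small sketch:

```python
import math

alpha = 0.5                          # assumed value of alpha, for illustration
beta = 0.5 * math.log(1.0 / alpha)   # explicit beta from e^t <= 1 + t/2 on [-1, 0]

u = 1.0
while u <= 50.0:
    if u * u >= math.log(1.0 / alpha):   # region where this explicit beta is valid
        assert alpha ** (1.0 / (u * u)) <= 1.0 - beta / (u * u)
    u += 0.25
print("alpha**(1/u^2) <= 1 - beta/u^2 verified on the sampled grid")
```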