A mostly-worked-out answer to the lower bound in part a:
$$E[\max_i X_i]=E[\max_i X_i 1_{\max_i X_i \geq 0}]+E[\max_i X_i 1_{\max_i X_i<0}].$$
We want to throw out the negative piece. Intuitively, it is unlikely to be nonzero at all, and it has bounded expectation. More rigorously, it is nonzero with probability $2^{-n}$ (all $n$ variables must be negative), and its magnitude is pointwise decreasing in $n$, so its expectation tends to zero by dominated convergence. Hence
$$E[\max_i X_i] \geq E[\max_i X_i 1_{\max_i X_i \geq 0}] + o(1) \\
=\int_0^\infty 1-\Phi(t)^n \, dt + o(1),$$
using the hint and the standard fact that $E[Y]=\int_0^\infty P(Y>t)\,dt$ for a nonnegative random variable $Y$. Denote the integral by $I$.
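As a sanity check (an illustration, not part of the proof), the identity $E[\max_i X_i 1_{\max_i X_i \geq 0}]=\int_0^\infty 1-\Phi(t)^n\,dt$ can be verified numerically with the standard library alone, using `math.erf` for $\Phi$ and a simple Riemann sum against a Monte Carlo estimate:

```python
import math
import random

def Phi(t):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def integral_formula(n, upper=10.0, steps=100_000):
    # midpoint Riemann sum of 1 - Phi(t)^n over [0, upper];
    # the tail beyond upper = 10 is negligible for moderate n
    h = upper / steps
    return sum((1.0 - Phi((k + 0.5) * h) ** n) * h for k in range(steps))

def monte_carlo(n, trials=200_000, seed=0):
    # estimate E[max(X_1..X_n) 1{max >= 0}] for iid standard normals
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        m = max(rng.gauss(0.0, 1.0) for _ in range(n))
        if m >= 0.0:
            total += m
    return total / trials

n = 10
print(integral_formula(n), monte_carlo(n))  # the two values should be close
```

For $n=1$ the integral reduces to $E[X^+]=1/\sqrt{2\pi}$, which gives a quick deterministic check of the implementation.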
Next
$$I \geq \int_0^{\sqrt{2 \log(n)}} 1-\Phi(t)^n dt$$
since the integrand is nonnegative, so throwing away part of the domain of integration only decreases the integral.
On $[0,1]$ we have the simple bound $1-\Phi(t)^n \geq 1-\Phi(1)^n$. For $t \geq 1$ we have the tail bound $\Phi(t) \leq 1-\frac{1}{2\sqrt{2\pi}\,t} e^{-t^2/2}$, which follows from the standard estimate $1-\Phi(t) \geq \frac{t}{1+t^2} \cdot \frac{1}{\sqrt{2\pi}} e^{-t^2/2}$ (cf. https://mikespivey.wordpress.com/2011/10/21/normaltails/) together with $\frac{t}{1+t^2} \geq \frac{1}{2t}$ for $t \geq 1$. (Note that the cruder claim $1-\Phi(t) \geq \frac{1}{\sqrt{2\pi}} e^{-t^2/2}$ fails for $t \geq 1$, so the factor $\frac{1}{2t}$ cannot be dropped.) Set $T_n:=\sqrt{2\log(n)-\log\log(n)}$, so that $e^{-T_n^2/2}=\sqrt{\log(n)}/n$, and throw away $[T_n,\sqrt{2\log(n)}]$ as well. Hence
$$I \geq 1-\Phi(1)^n + \int_1^{T_n} 1-\left ( 1-\frac{1}{2\sqrt{2\pi}\,t} e^{-t^2/2} \right )^n dt.$$
As for the remaining piece, $\frac{1}{t}e^{-t^2/2}$ is decreasing, so the integrand is decreasing in $t$, and we get a lower bound by substituting in the upper limit:
$$I \geq 1-\Phi(1)^n+(T_n-1)\left [ 1-\left ( 1-\frac{\sqrt{\log(n)}}{2\sqrt{2\pi}\,T_n\,n} \right )^n \right ].$$
Since $T_n \sim \sqrt{2\log(n)}$, the quantity $n \cdot \frac{\sqrt{\log(n)}}{2\sqrt{2\pi}\,T_n\,n}$ converges to $\frac{1}{4\sqrt{\pi}}$, so the bracketed factor converges to $1-e^{-\frac{1}{4\sqrt{\pi}}}>0$ and is therefore bounded below by $1-e^{-\frac{1}{4\sqrt{\pi}}}-\varepsilon=:C$ for large enough $n$ depending on $\varepsilon$. Then we get the bound
$$I \geq C(T_n-1)+1-\Phi(1)^n.$$
Returning to the original problem, and using $T_n=(1-o(1))\sqrt{2\log(n)}$, we have
$$E[\max_i X_i] \geq C(1-o(1))\sqrt{2\log(n)}+1-\Phi(1)^n+o(1),$$
which gives the lower bound for part a for sufficiently large $n$. A finite collection of $n$ can always be handled (why?) so we are done.
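As an aside, the Gaussian tail estimates in play here are easy to check numerically. The sketch below (standard library only) verifies the valid lower bound $1-\Phi(t)\geq \frac{t}{1+t^2}\cdot\frac{1}{\sqrt{2\pi}}e^{-t^2/2}$ on a grid, and also confirms that $\frac{1}{\sqrt{2\pi}}e^{-t^2/2}$ alone is *not* a lower bound on the tail once $t \geq 1$:

```python
import math

def Phi(t):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def phi(t):
    # standard normal density
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

for t in [x / 10.0 for x in range(10, 61, 5)]:  # t = 1.0, 1.5, ..., 6.0
    tail = 1.0 - Phi(t)
    lower = t / (1.0 + t * t) * phi(t)  # valid lower bound on the tail
    naive = phi(t)                      # NOT a valid lower bound for t >= 1
    assert lower <= tail
    assert tail < naive  # the naive "bound" exceeds the true tail here
print("tail bounds verified on the sampled grid")
```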
To solve part b we would need to be able to repeat the derivation to get $C=1$, and I'm not really sure how to do that. One idea would be to change variables to $u=\Phi(t)$, which would give
$$\int_0^\infty 1-\Phi(t)^n \, dt = \int_{1/2}^1 (1-u^n)\frac{dt}{du}\, du,$$
where $\frac{dt}{du}=1/\varphi(\Phi^{-1}(u))$ is the reciprocal of the normal density, written as a function of the normal CDF itself. Perhaps it is possible to get an appropriate series expansion for this quantity to get the result.
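The change of variables itself can be sanity-checked numerically; this sketch (my addition, using the standard library's `statistics.NormalDist` for $\Phi$, $\Phi^{-1}$, and $\varphi$) compares midpoint Riemann sums of both sides:

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal; .cdf, .inv_cdf, .pdf are in the stdlib

def lhs(n, upper=10.0, steps=100_000):
    # integral of 1 - Phi(t)^n over t in [0, upper]; tail beyond 10 is tiny
    h = upper / steps
    return sum((1.0 - nd.cdf((k + 0.5) * h) ** n) * h for k in range(steps))

def rhs(n, steps=100_000):
    # integral of (1 - u^n) / phi(Phi^{-1}(u)) over u in (1/2, 1);
    # dt/du = 1/phi(Phi^{-1}(u)) is the reciprocal density from the text
    a, b = 0.5, 1.0
    h = (b - a) / steps
    total = 0.0
    for k in range(steps):
        u = a + (k + 0.5) * h
        total += (1.0 - u ** n) / nd.pdf(nd.inv_cdf(u)) * h
    return total

n = 5
print(lhs(n), rhs(n))  # the two integrals should agree closely
```

Note that the transformed integrand stays bounded as $u \to 1$, since $1-u^n \approx n(1-u)$ shrinks faster than the density in the denominator.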
The definition from Wikipedia is about bounds for sets.
Here you do not have a set but just an inequality which depends on many parameters ($n$, $A_1,\ldots,A_n$, $P$).
There are choices of the parameters for which the inequality even becomes an equality (for instance $A_1=\cdots=A_n$ for the first three inequalities, or $n=1$ for all inequalities), so in one sense the inequality is “tight”. But there are also other choices of parameters for which the inequality is strict, so in another sense it is not “tight”.
Best Answer
I can give an upper bound of $O\left(\frac{\log n}{n}\right)$.
For $t\in(0,1)$ let $N_t$ be the number of samples among $X_1,\dots,X_n$ that are at most $t$. Then $\mathbb E[N_t]=nt$, and pairwise independence implies that $\operatorname{Var}N_t=nt(1-t)$, just as in the fully independent case, since the variance of a sum of pairwise independent variables is the sum of the variances. Then by the second moment method $$\mathbb P(\min X_i\leq t)=\mathbb P(N_t>0)\geq\frac{\mathbb E[N_t]^2}{\mathbb E[N_t^2]}=\frac{(nt)^2}{(nt)^2+nt(1-t)}=\frac{nt}{nt+(1-t)}.$$ So we have $$\mathbb E[\min X_i]=\int_0^1\mathbb P(\min X_i>t)\mathop{}\!\mathrm{d}t\leq\int_0^1\frac{1-t}{(n-1)t+1}\mathop{}\!\mathrm{d}t=\frac{n\log n}{(n-1)^2}-\frac{1}{n-1},$$ establishing the upper bound.
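As a quick sanity check (my addition, not part of the answer), the closed form of the last integral can be confirmed by numerical integration:

```python
import math

def integrand(t, n):
    # (1 - t) / ((n - 1) t + 1), the bound on P(min X_i > t)
    return (1.0 - t) / ((n - 1) * t + 1.0)

def numeric(n, steps=400_000):
    # midpoint Riemann sum over [0, 1]
    h = 1.0 / steps
    return sum(integrand((k + 0.5) * h, n) * h for k in range(steps))

def closed_form(n):
    # n log n / (n - 1)^2 - 1 / (n - 1), valid for n >= 2
    return n * math.log(n) / (n - 1) ** 2 - 1.0 / (n - 1)

for n in [2, 5, 50]:
    assert abs(numeric(n) - closed_form(n)) < 1e-6
print("closed form matches numeric integration")
```

For $n=2$ the integral is elementary, $\int_0^1 \frac{1-t}{t+1}\,dt = 2\log 2 - 1$, which agrees with the closed form.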