For a distribution of continuous type, we can neglect the possibility that two of the samples are equal, since this has probability $0$. Thus the probability density for the second smallest item being $y_2$ is the probability of getting one sample below $y_2$ times the probability density at $y_2$ times the probability of getting $n-2$ samples above $y_2$, times a factor accounting for the permutations. All $(n-2)!$ permutations of the $n-2$ samples are already being counted, so we just have to account for the $n(n-1)$ ordered choices of the smallest and second smallest item. Thus the probability density is
$$n(n-1)F(y_2)f(y_2)(1-F(y_2))^{n-2}\;.$$
We can integrate this to get rid of the derivative $f=F'$:
$$
\begin{eqnarray}
P(Y_2\gt y_2)
&=&
\int_{y_2}^\infty n(n-1)F(y)f(y)(1-F(y))^{n-2}\mathrm dy
\\
&=&
\int_{y_2}^\infty n(n-1)F(y)(1-F(y))^{n-2}\frac{\mathrm dF(y)}{\mathrm dy}\mathrm dy
\\
&=&
\int_{F(y_2)}^1n(n-1)F(1-F)^{n-2}\mathrm dF
\\
&=&
\int_0^{1-F(y_2)}n(n-1)(1-u)u^{n-2}\mathrm du
\\
&=&
n(1-F(y_2))^{n-1}-(n-1)(1-F(y_2))^n\;.
\end{eqnarray}
$$
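As a quick sanity check of the survival formula for the second smallest sample, here is a short Monte Carlo sketch (Python assumed, with a Uniform(0,1) parent so that $F(y)=y$; the sample size $n=10$ and threshold $y_2=0.15$ are arbitrary choices for illustration):

```python
import random

random.seed(0)
n, y2, trials = 10, 0.15, 200_000

hits = 0
for _ in range(trials):
    sample = sorted(random.random() for _ in range(n))
    hits += sample[1] > y2            # sample[1] is the second smallest

empirical = hits / trials
F = y2                                # Uniform(0,1): F(y) = y
exact = n * (1 - F)**(n - 1) - (n - 1) * (1 - F)**n
print(empirical, exact)               # both should be close to 0.544
```

The empirical frequency and the closed-form survival probability agree to within Monte Carlo noise.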
With $W_n=nF(Y_2)$, this becomes
$$
\begin{eqnarray}
P(W_n\gt w)
&=&
n\left(1-\frac wn\right)^{n-1}-(n-1)\left(1-\frac wn\right)^n
\\
&=&
\left(1-\frac wn\right)^{n-1}\left(n-(n-1)\left(1-\frac wn\right)\right)
\\
&=&
\left(1-\frac wn\right)^{n-1}\left(1+w-\frac wn\right)\;.
\end{eqnarray}
$$
Thus the limit distribution is
$$
\lim_{n\to\infty}P(W_n\gt w)=(1+w)\mathrm e^{-w}\;,
$$
or
$$
\lim_{n\to\infty}P(W_n\lt w)=1-(1+w)\mathrm e^{-w}=\frac12w^2+O\left(w^3\right)\;.
$$
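The finite-$n$ survival function approaches this limit quickly; a small deterministic check (Python sketch, with $w=1.5$ chosen arbitrarily):

```python
import math

w = 1.5
limit = (1 + w) * math.exp(-w)                  # limiting survival function
for n in (10, 100, 10_000):
    exact = (1 - w/n)**(n - 1) * (1 + w - w/n)  # P(W_n > w) at finite n
    print(n, exact, limit)
```

Already at $n=100$ the two values agree to about two decimal places.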
First, if $X_1, X_2, \ldots, X_n$ are IID random variables with common distribution $X$ and moment generating function $M_X(t) = \operatorname{E}[e^{tX}]$, then $$S_n = \sum_{i=1}^n X_i$$ has moment generating function $$M_{S_n}(t) = \operatorname{E}[e^{tS_n}] = \operatorname{E}[e^{t\sum_{i=1}^n X_i}] = \operatorname{E}\left[\prod_{i=1}^n e^{tX_i}\right] \overset{\text{ind}}{=} \prod_{i=1}^n \operatorname{E}[e^{tX_i}] = \prod_{i=1}^n M_X(t) = (M_X(t))^n.$$ That is to say, the MGF of a sum of $n$ IID random variables is equal to the MGF of one such random variable raised to the $n^{\rm th}$ power.
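This product rule is easy to verify numerically. A minimal sketch (Python assumed) uses a Bernoulli($p$) summand, for which $M_X(t) = pe^t + 1 - p$ and $S_n \sim \operatorname{Binomial}(n,p)$, so $\operatorname{E}[e^{tS_n}]$ can be computed exactly from the pmf:

```python
import math

p, t, n = 0.5, 0.4, 10
mx = p * math.exp(t) + (1 - p)                    # M_X(t) for Bernoulli(p)

# E[e^{t S_n}] computed directly from the Binomial(n, p) pmf
msn = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
          for k in range(n + 1))

print(msn, mx**n)                                 # agree to floating-point precision
```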
It follows from this that if $X \sim \operatorname{Exponential}(\lambda)$, where $\lambda$ is a rate parameter, then $$M_X(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda.$$ Your problem is the special case $\lambda = 1$. Therefore, $$M_{S_n}(t) = \left(\frac{\lambda}{\lambda - t}\right)^n,$$ where we have defined $S_n$ as the sample total as described above. How does this help us with the MGF of $Y_n$? Well, we know $$Y_n = \sqrt{n}(\bar X_n - 1).$$ We also know that $$\bar X_n = S_n/n,$$ that is to say, the sample mean is simply the sample total divided by the sample size $n$. Thus, $$M_{Y_n}(t) = \operatorname{E}[e^{t\sqrt{n}(\bar X_n - 1)}] = \operatorname{E}[e^{(t/\sqrt{n})S_n - t\sqrt{n}}] = e^{-t\sqrt{n}} \operatorname{E}[e^{(t/\sqrt{n}) S_n}] = e^{-t\sqrt{n}} M_{S_n}(t/\sqrt{n}).$$ The first equality is the definition of the MGF. The second follows from algebraic manipulation of the exponent. The third factors out $e^{-t\sqrt{n}}$, which is not a function of any random variable, and the fourth is the definition of the MGF again. Finally, we use the earlier result for $M_{S_n}(t)$ to get $$M_{Y_n}(t) = e^{-t\sqrt{n}} \left(\frac{1}{1 - t/\sqrt{n}}\right)^n.$$
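A Monte Carlo sanity check of this closed form (Python sketch, assuming rate-$1$ exponentials; small $n$ and $t$ are used to keep the simulation variance manageable):

```python
import math
import random

random.seed(0)
n, t, trials = 4, 0.5, 200_000

total = 0.0
for _ in range(trials):
    s = sum(random.expovariate(1.0) for _ in range(n))  # S_n, sum of Exp(1)
    y = math.sqrt(n) * (s / n - 1)                      # Y_n = sqrt(n)(X-bar - 1)
    total += math.exp(t * y)

mc = total / trials                                     # estimate of E[e^{t Y_n}]
closed_form = math.exp(-t * math.sqrt(n)) / (1 - t / math.sqrt(n))**n
print(mc, closed_form)
```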
To find the limiting distribution of $Y_n$, you need to evaluate $$\lim_{n \to \infty} M_{Y_n}(t).$$ This should give you $e^{t^2/2}$, but I have left the proof as an exercise. What familiar distribution has such an MGF? Why does this make sense?
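The limit can also be checked numerically; working in log space avoids overflow at large $n$ (Python sketch, with $t=0.7$ chosen arbitrarily):

```python
import math

t = 0.7
target = math.exp(t**2 / 2)        # the claimed limit
for n in (10**2, 10**4, 10**6):
    # log M_{Y_n}(t) = -t*sqrt(n) - n*log(1 - t/sqrt(n))
    log_m = -t * math.sqrt(n) - n * math.log1p(-t / math.sqrt(n))
    print(n, math.exp(log_m), target)
```

The printed values converge to the target as $n$ grows.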
I'm going to denote your "maximum" $Y_n$ as $Y_{(n)}$, so that $Z_n = n[1 - F(Y_{(n)})]$. Observe that \begin{align*} F_{Z_n}(z) &= P(Z_n \le z) \\ &= P([1 - F(Y_{(n)})] \le z/n ) \\ &= P (F(Y_{(n)}) \ge 1-z/n) \\ &= P(Y_{(n)} \ge F^{-1}(1-z/n)) \\ &= 1- P(Y_{(n)} \le F^{-1}(1-z/n)) \quad \text{(continuous dist.)} \\ \end{align*}
Suppose our random sample was $Y_1,\dots,Y_n$ from a distribution with cdf $F$. Then $$ P(Y_{(n)} \le c) = P((Y_1 \le c) \ \cap \cdots \cap (Y_n \le c)) \stackrel{ind.}{=} P(Y_1 \le c) \cdots P(Y_n \le c) \stackrel{i.d.}{=} [P(Y_1 \le c)]^n = [F(c)]^n $$
Returning to the main problem and using this result, we have $$ F_{Z_n}(z) = 1 - [F \left( F^{-1}(1-z/n) \right)]^n = 1- [1-z/n]^n \xrightarrow{n \to \infty} 1- e^{- z}, $$ which we recognize as the CDF of the Exponential distribution with rate $1$.
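As a simulation sketch of this convergence (Python assumed, with a Uniform(0,1) parent so that $F(y)=y$ and $Z_n = n(1 - Y_{(n)})$):

```python
import math
import random

random.seed(1)
n, trials, z = 50, 100_000, 1.0

hits = 0
for _ in range(trials):
    y_max = max(random.random() for _ in range(n))
    hits += n * (1 - y_max) <= z          # Z_n = n(1 - F(Y_(n)))

empirical = hits / trials
finite_n = 1 - (1 - z/n)**n               # exact finite-n CDF
limit = 1 - math.exp(-z)                  # Exponential(1) CDF
print(empirical, finite_n, limit)
```

Even at $n=50$ the empirical CDF of $Z_n$ at $z=1$ is within about $0.004$ of the Exponential(1) value.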