Here is a partial answer when both $X_k$ and $Y_k$ are uniformly distributed on $[1, 2]$. I suspect it generalizes to a broader class of distributions without much hassle.
For each $s, t \geq 0$, define
\newcommand{\Area}{\operatorname{Area}}
\begin{align*}
\mathcal{A}_n(s) &= \Bigl\{ (x, y) \in [1, 2]^2 : x < 1 + \frac{s}{n} \Bigr\}, \\
\mathcal{B}_n(t) &= \Bigl\{ (x, y) \in [1, 2]^2 : xy < 1 + \sqrt{\frac{2}{n}} \, t \Bigr\}.
\end{align*}
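For small thresholds the region $\mathcal{B}_n(t)$ is essentially a thin triangle below the hyperbola; explicitly, writing $\delta = \sqrt{2/n}\, t$ (so that $1 + \delta \leq 2$ for $n$ large),
\begin{align*}
\Area(\mathcal{B}_n(t))
= \int_1^{1+\delta} \Bigl( \frac{1+\delta}{x} - 1 \Bigr) \, dx
= (1+\delta)\log(1+\delta) - \delta
= \frac{\delta^2}{2} + \mathcal{O}(\delta^3)
= \frac{t^2}{n} + \mathcal{O}(n^{-3/2}).
\end{align*}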
Then it follows that $\Area(\mathcal{A}_n(s)) \sim \frac{s}{n}$ and $\Area(\mathcal{B}_n(t)) \sim \frac{t^2}{n}$ as $n \to \infty$, and so we get
\begin{align*}
\mathbb{P} \biggl( \frac{\min\{X_1,\cdots,X_n\}-1}{1/n} \geq s \biggr)
&= \mathbb{P}\bigl((X_k, Y_k) \notin \mathcal{A}_n(s) \text{ for all } k = 1, \cdots, n\bigr) \\
&= \biggl( 1 - \Area(\mathcal{A}_n(s)) \biggr)^n
\xrightarrow[n\to\infty]{} e^{-s} = 1 - F(s)
\end{align*}
and similarly
\begin{align*}
\mathbb{P} \biggl( \frac{\min\{X_1 Y_1,\cdots,X_n Y_n\}-1}{\sqrt{2/n}} \geq t \biggr)
&= \mathbb{P}\bigl((X_k, Y_k) \notin \mathcal{B}_n(t) \text{ for all } k = 1, \cdots, n\bigr) \\
&= \biggl( 1 - \Area(\mathcal{B}_n(t)) \biggr)^n
\xrightarrow[n\to\infty]{} e^{-t^2} = 1 - G(t).
\end{align*}
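These two marginal limits are easy to sanity-check numerically. Here is a small Monte Carlo sketch (the function name and the sample sizes are my own choices, picked only for speed, so expect the estimates to match the limits only up to a few percent):

```python
import math
import random

def simulate_marginals(n=500, trials=2000, s=1.0, t=1.0, seed=0):
    """Estimate the two rescaled-minimum tail probabilities
    P(n*(min X_k - 1) >= s) and P((min X_k*Y_k - 1)/sqrt(2/n) >= t)
    for X_k, Y_k i.i.d. Uniform[1, 2]; the limits are e^{-s} and e^{-t^2}."""
    rng = random.Random(seed)
    hits_x = hits_xy = 0
    for _ in range(trials):
        min_x = min_xy = float("inf")
        for _ in range(n):
            x = rng.uniform(1.0, 2.0)
            y = rng.uniform(1.0, 2.0)
            min_x = min(min_x, x)
            min_xy = min(min_xy, x * y)
        hits_x += n * (min_x - 1.0) >= s
        hits_xy += (min_xy - 1.0) / math.sqrt(2.0 / n) >= t
    return hits_x / trials, hits_xy / trials
```

With $s = t = 1$, both estimates should land near $e^{-1} \approx 0.368$.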
Finally, it follows that the probability of the joint event is
\begin{align*}
&\mathbb{P} \biggl( \biggl\{ \frac{\min\{X_1,\cdots,X_n\}-1}{1/n} \geq s \biggr\} \cap \biggl\{ \frac{\min\{X_1 Y_1,\cdots,X_n Y_n\}-1}{\sqrt{2/n}} \geq t \biggr\} \biggr) \\
&= \mathbb{P}\bigl((X_k, Y_k) \notin \mathcal{A}_n(s) \cup \mathcal{B}_n(t) \text{ for all } k = 1, \cdots, n\bigr) \\
&= \biggl( 1 - \Area(\mathcal{A}_n(s)) - \Area(\mathcal{B}_n(t)) + \Area(\mathcal{A}_n(s) \cap \mathcal{B}_n(t)) \biggr)^n.
\end{align*}
But it is easy to check that $\Area(\mathcal{A}_n(s) \cap \mathcal{B}_n(t)) = \mathcal{O}(n^{-3/2})$, hence the above equals
\begin{align*}
= \biggl( 1 - \frac{s+o(1)}{n} - \frac{t^2+o(1)}{n} + \mathcal{O}(n^{-3/2}) \biggr)^n
\xrightarrow[n\to\infty]{} e^{-s-t^2} = (1 - F(s))(1 - G(t)).
\end{align*}
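The $\mathcal{O}(n^{-3/2})$ bound on the overlap can be seen directly: on the strip $1 \leq x < 1 + \frac{s}{n}$ the vertical extent of $\mathcal{B}_n(t)$ is at most $\delta = \sqrt{2/n}\, t$, so
\begin{align*}
\Area(\mathcal{A}_n(s) \cap \mathcal{B}_n(t))
\leq \frac{s}{n} \cdot \delta
= \frac{\sqrt{2}\, s t}{n^{3/2}}.
\end{align*}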
Therefore the limiting joint distribution factors, i.e., the two rescaled minima are asymptotically independent.
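The factorization can also be checked by simulation. A sketch under the same conventions as above (`joint_tail` and its default parameters are my own, and the agreement is only up to Monte Carlo error):

```python
import math
import random

def joint_tail(n=1000, trials=2000, s=1.0, t=1.0, seed=1):
    """Estimate the joint tail probability
    P(n*(min X_k - 1) >= s  and  (min X_k*Y_k - 1)/sqrt(2/n) >= t)
    for X_k, Y_k i.i.d. Uniform[1, 2]; the claimed limit is e^{-s} * e^{-t^2}."""
    rng = random.Random(seed)
    thr_x = 1.0 + s / n
    thr_xy = 1.0 + math.sqrt(2.0 / n) * t
    hits = 0
    for _ in range(trials):
        for _ in range(n):
            x = rng.uniform(1.0, 2.0)
            y = rng.uniform(1.0, 2.0)
            # the joint event fails as soon as one point lands in A_n(s) or B_n(t)
            if x < thr_x or x * y < thr_xy:
                break
        else:
            hits += 1
    return hits / trials
```

With $s = t = 1$ the estimate should be close to $e^{-2} \approx 0.135$, i.e., the product of the two marginal limits.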
Regarding the first question,
$$F_Y(y)=1-(1-F_X(y))^n=1-(1-y)^n$$
Regarding the second question, let $Y = -\log X_1$ with $X_1 \sim \operatorname{Uniform}(0, 1)$. Then
$$x_1 = e^{-y}, \qquad \biggl| \frac{dx_1}{dy} \biggr| = e^{-y},$$
so by the change-of-variables formula
$$f_Y(y) = f_{X_1}(e^{-y}) \, e^{-y} = e^{-y}, \qquad y > 0.$$
That means $Y \sim \operatorname{Exp}(1)$.
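This is easy to verify empirically; a minimal sketch (the function name and evaluation points are mine) comparing the empirical CDF of $-\log X_1$ with $1 - e^{-y}$:

```python
import math
import random

def exp_check(samples=200_000, seed=2):
    """Draw Y = -log(X) for X ~ Uniform(0, 1) and compare the
    empirical CDF with the Exp(1) CDF F(y) = 1 - e^{-y}."""
    rng = random.Random(seed)
    # 1 - random() lies in (0, 1], so the logarithm is always finite
    ys = [-math.log(1.0 - rng.random()) for _ in range(samples)]
    return {y: (sum(v <= y for v in ys) / samples, 1.0 - math.exp(-y))
            for y in (0.5, 1.0, 2.0)}
```

At each evaluation point the empirical and theoretical CDF values should agree to within roughly $1/\sqrt{\text{samples}}$.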
Best Answer
You are correct about the distribution function:
$$F_Y(y)=P(Y<y)=1-\prod_i \left(1-F_i(y)\right)$$
The derivative of this is more complicated. You get:
$$\begin{align}f_Y(y)&=-\frac{d}{dy}\prod_i \left(1-F_i(y)\right)\\ &=\sum_{j} \left[f_j(y)\prod_{i\neq j}\left(1-F_i(y)\right)\right] \end{align}$$
Each term in the sum is the probability density of $X_j$ at $y$ times the probability that each of the other variables is greater than $y$.
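As a quick numerical illustration of this formula (the helper function and the choice of distributions are mine): with $X_1 \sim \operatorname{Uniform}(0, 1)$ and $X_2 \sim \operatorname{Exp}(1)$, the sum collapses to the closed form $(2 - y) e^{-y}$ on $[0, 1]$.

```python
import math

def min_density(y, pdfs, cdfs):
    """Density of Y = min(X_1, ..., X_m) for independent X_i:
    f_Y(y) = sum_j f_j(y) * prod_{i != j} (1 - F_i(y))."""
    total = 0.0
    for j, f_j in enumerate(pdfs):
        term = f_j(y)
        for i, F_i in enumerate(cdfs):
            if i != j:
                term *= 1.0 - F_i(y)  # P(X_i > y)
        total += term
    return total

# X1 ~ Uniform(0, 1), X2 ~ Exp(1); on [0, 1] the density is (2 - y) e^{-y}
pdfs = [lambda y: 1.0, lambda y: math.exp(-y)]
cdfs = [lambda y: y, lambda y: 1.0 - math.exp(-y)]
```

Evaluating `min_density` at any $y \in [0, 1]$ matches $(2 - y) e^{-y}$ exactly, since the sum is $1 \cdot e^{-y} + e^{-y}(1 - y)$.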