First recall the most important result in order statistics:
For every random variable $Z$ with continuous CDF $H$, $H(Z)$ is uniform on $(0,1)$.
Thus, if $(X_i)_{1\leqslant i\leqslant n}$ is an i.i.d. sample with continuous CDF $F$ and $(U_i)_{1\leqslant i\leqslant n}$ is an i.i.d. sample uniform on $(0,1)$, then, for every $1\leqslant i\leqslant n$ and every $x$ in $(0,1)$,
$$
P(F(X_i)\leqslant x)=P(U_1\leqslant x)=x.
$$
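This probability integral transform is easy to sanity-check numerically. Below is a minimal sketch (not part of the proof), using the exponential distribution as an example; all names are mine.

```python
# Sanity check of the probability integral transform (a sketch, not part of
# the proof): if Z ~ Exp(1) with CDF H(z) = 1 - exp(-z), then H(Z) should be
# uniform on (0, 1).
import math
import random

random.seed(0)
n_samples = 100_000
z = [random.expovariate(1.0) for _ in range(n_samples)]
u = [1.0 - math.exp(-zi) for zi in z]  # H(Z)

# Largest gap between the empirical CDF of H(Z) and the uniform CDF
# at a handful of evaluation points.
max_gap = max(abs(sum(ui <= x for ui in u) / n_samples - x)
              for x in (0.1, 0.25, 0.5, 0.75, 0.9))
```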
Since $(F(X_i))_{1\leqslant i\leqslant n}$ is distributed as $(U_i)_{1\leqslant i\leqslant n}$, so is its vector of order statistics, hence the CDF $G_i$ of $F(X_{(i)})$ is such that, for every $x$ in $(0,1)$,
$$
G_i(x)=P(F(X_{(i)})\leqslant x)=P(U_{(i)}\leqslant x)=\sum_{k=i}^n{n\choose k}x^k(1-x)^{n-k}.
$$
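The binomial-tail formula for the CDF of $U_{(i)}$ can be spot-checked by simulation; a minimal sketch (the values of $n$, $i$, $x$ are arbitrary choices of mine):

```python
# Monte Carlo sketch of the binomial-tail formula for the CDF of U_(i).
import random
from math import comb

random.seed(1)
n, i, x = 5, 2, 0.4
trials = 100_000
# Frequency of the event {U_(i) <= x} over the simulated samples.
hits = sum(sorted(random.random() for _ in range(n))[i - 1] <= x
           for _ in range(trials))
empirical = hits / trials
exact = sum(comb(n, k) * x**k * (1 - x) ** (n - k) for k in range(i, n + 1))
```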
Recall a most useful result to compute expectations:
For every $(0,1)$-valued random variable $Z$ with CDF $H$,
$E[Z]=\displaystyle\int_0^1(1-H)=1-\int_0^1H$.
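As a concrete instance of this identity (a sketch, not part of the argument): take $Z=U^2$ with $U$ uniform on $(0,1)$, so that $H(x)=\sqrt{x}$ and $E[Z]=1/3$.

```python
# Check E[Z] = integral_0^1 (1 - H) on the example Z = U^2, H(x) = sqrt(x),
# for which E[Z] = 1/3.
import math

steps = 100_000
# Midpoint rule for integral_0^1 (1 - sqrt(x)) dx.
integral = sum(1.0 - math.sqrt((j + 0.5) / steps) for j in range(steps)) / steps
```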
Hence,
$$
E[U_{(i)}]=1-\sum_{k=i}^n{n\choose k}\int_0^1t^k(1-t)^{n-k}\mathrm dt.
$$
Recall now that:
For every $k\leqslant n$,
$\displaystyle\int_0^1t^k(1-t)^{n-k}\mathrm dt=\frac1{n+1}{n\choose k}^{-1}$.
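This beta integral can be verified exactly in a few lines: expand $(1-t)^{n-k}$ by the binomial theorem and integrate term by term, in rational arithmetic. A sketch (the function name is mine):

```python
# Exact verification of integral_0^1 t^k (1-t)^(n-k) dt = 1/((n+1) C(n,k)).
from fractions import Fraction
from math import comb

def beta_integral(n, k):
    # Termwise integration after the binomial expansion of (1 - t)^(n - k).
    return sum(Fraction(comb(n - k, j) * (-1) ** j, k + j + 1)
               for j in range(n - k + 1))

identity_ok = all(beta_integral(n, k) == Fraction(1, (n + 1) * comb(n, k))
                  for n in range(1, 10) for k in range(n + 1))
```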
Hence,
$$
E[U_{(i)}]=1-\sum_{k=i}^n\frac1{n+1}=\frac{i}{n+1}.
$$
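A quick Monte Carlo sketch of $E[U_{(i)}]=i/(n+1)$ (the sample size and the number of trials are arbitrary choices of mine):

```python
# Monte Carlo sketch of E[U_(i)] = i/(n+1) for all i simultaneously.
import random

random.seed(2)
n, trials = 4, 100_000
sums = [0.0] * n
for _ in range(trials):
    for idx, u in enumerate(sorted(random.random() for _ in range(n))):
        sums[idx] += u
max_err = max(abs(sums[i - 1] / trials - i / (n + 1)) for i in range(1, n + 1))
```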
Recall finally the analogue for second moments of our result for expectations:
For every $(0,1)$-valued random variable $Z$ with CDF $H$,
$\displaystyle E[Z^2]=\int_0^12x(1-H(x))\mathrm dx$, that is, $\displaystyle E[Z^2]=1-2\int_0^1xH(x)\mathrm dx.$
Hence,
$$
E[U_{(i)}^2]=1-2\sum_{k=i}^n{n\choose k}\int_0^1t^{k+1}(1-t)^{n-k}\mathrm dt=1-2\sum_{k=i}^n{n\choose k}\frac1{n+2}{n+1\choose k+1}^{-1},
$$
that is,
$$
E[U_{(i)}^2]=1-\frac2{(n+2)(n+1)}\sum_{k=i}^n(k+1)=\frac{i(i+1)}{(n+1)(n+2)},
$$
from which you can probably guess the expression $E[U_{(i)}^k]=\prod_{j=0}^{k-1}\frac{i+j}{n+1+j}$, valid for every nonnegative integer $k$, and from which, independently, one deduces that
$$
\mathrm{var}(F(X_{(i)}))=\mathrm{var}(U_{(i)})=\frac{i(i+1)}{(n+1)(n+2)}-\left(\frac{i}{n+1}\right)^2=\frac{i(n+1-i)}{(n+1)^2(n+2)}.
$$
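Both the second-moment formula and the variance formula can be spot-checked by simulation; a minimal sketch (parameters are arbitrary choices of mine):

```python
# Monte Carlo sketch of E[U_(i)^2] = i(i+1)/((n+1)(n+2)) and of
# var(U_(i)) = i(n+1-i)/((n+1)^2 (n+2)), for all i simultaneously.
import random

random.seed(3)
n, trials = 5, 100_000
sums = [0.0] * n
sq_sums = [0.0] * n
for _ in range(trials):
    for idx, u in enumerate(sorted(random.random() for _ in range(n))):
        sums[idx] += u
        sq_sums[idx] += u * u
m2_err = max(abs(sq_sums[i - 1] / trials - i * (i + 1) / ((n + 1) * (n + 2)))
             for i in range(1, n + 1))
var_err = max(abs(sq_sums[i - 1] / trials - (sums[i - 1] / trials) ** 2
                  - i * (n + 1 - i) / ((n + 1) ** 2 * (n + 2)))
              for i in range(1, n + 1))
```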
The answer is
$$
n!\prod_{i=1}^n g(x_i,y_i)\mathbf{1}(x_1<x_2<\ldots<x_n).
$$
To see this, first observe that
$$
\mathbb{P}(X_{(i)}\leq x_i, Y_{(i)}\leq y_i, 1 \leq i \leq n)=n!\mathbb{P}(X_1<X_2<\ldots<X_n \text{ and } X_i \leq x_i, Y_i \leq y_i, 1 \leq i \leq n).
$$
The term on the right hand side can be written as
$$
\int_{-\infty}^{x_n}\int_{-\infty}^{y_n} \left\lbrace \ldots \left[
\int_{-\infty}^{\tilde{x}_3 \wedge x_2}\int_{-\infty}^{y_2}G(x_1 \wedge \tilde{x}_2,y_1)g(\tilde{x}_2,\tilde{y}_2)d\tilde{x}_2d\tilde{y}_2
\right] \ldots\right\rbrace g(\tilde{x}_n, \tilde{y}_n)d\tilde{x}_n d \tilde{y}_n\\
=\int_{-\infty}^{x_n}\int_{-\infty}^{y_n} \left\lbrace \ldots \left[
\int_{x_1}^{\tilde{x}_3 \wedge x_2}\int_{-\infty}^{y_2}G(x_1,y_1)g(\tilde{x}_2,\tilde{y}_2)d\tilde{x}_2d\tilde{y}_2
\right] \ldots\right\rbrace g(\tilde{x}_n, \tilde{y}_n)d\tilde{x}_n d \tilde{y}_n
+R_k(\mathbf{x}_{-2}, \mathbf{y})
$$
where $R_k(\mathbf{x}_{-2}, \mathbf{y})$ is a remainder term depending only on $\mathbf{x}_{-2}=(x_1, x_3, \ldots,x_n)$ and $\mathbf{y}=(y_1, \ldots,y_n)$. The leading term on the right-hand side can be further re-expressed as
$$
\int_{-\infty}^{x_n}\int_{-\infty}^{y_n} \left\lbrace \ldots \left[
\int_{x_2}^{\tilde{x}_4 \wedge x_3}\int_{-\infty}^{y_3}G(x_1,y_1)G(x_2,y_2)g(\tilde{x}_3,\tilde{y}_3)d\tilde{x}_3d\tilde{y}_3
\right] \ldots\right\rbrace g(\tilde{x}_n, \tilde{y}_n)d\tilde{x}_n d \tilde{y}_n
+R_k'(\mathbf{x}_{-2}, \mathbf{y})+R_k''(\mathbf{x}_{-3}, \mathbf{y})
$$
where the remainder terms $R_k'(\mathbf{x}_{-2}, \mathbf{y})$ and $R_k''(\mathbf{x}_{-3}, \mathbf{y})$ do not depend on $x_2$ and $x_3$, respectively. Iterating the procedure, we finally obtain that
$$
n!\mathbb{P}(X_1<X_2<\ldots<X_n \text{ and } X_i \leq x_i, Y_i \leq y_i, 1 \leq i \leq n)=n! \prod_{i=1}^nG(x_i,y_i) + R_n'''(\mathbf{x}_{-n},\mathbf{y})
$$
where the remainder term $R_n'''(\mathbf{x}_{-n},\mathbf{y})$ does not depend on $x_n$ and accounts for all the remainder terms produced along the iteration. Therefore, differentiating with respect to $x_1, \ldots, x_n, y_1, \ldots,y_n$, we are left with the expression in the first display.
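The density claim can be spot-checked by simulation in the smallest case $n=2$. Below is a sketch (not part of the argument) taking $G$ to be the CDF of a pair of independent uniforms on $(0,1)$, so that $g\equiv 1$ on the unit square; the claimed joint density $2\cdot\mathbf{1}(x_1<x_2)$ then gives, for $a\leq c$, $\mathbb{P}(X_{(1)}\leq a, Y_{(1)}\leq b, X_{(2)}\leq c, Y_{(2)}\leq d)=2bd(ac-a^2/2)$. All parameter values are arbitrary choices of mine.

```python
# Monte Carlo sketch of the claimed joint density for n = 2 with g = 1 on the
# unit square: compare the empirical probability of a box event with the
# integral of 2 * 1(x1 < x2) over that box, namely 2*b*d*(a*c - a*a/2).
import random

random.seed(4)
a, b, c, d = 0.3, 0.5, 0.8, 0.7
trials = 200_000
hits = 0
for _ in range(trials):
    pairs = sorted((random.random(), random.random()) for _ in range(2))
    (x1, y1), (x2, y2) = pairs  # pairs ordered by their x-coordinate
    if x1 <= a and y1 <= b and x2 <= c and y2 <= d:
        hits += 1
empirical = hits / trials
exact = 2 * b * d * (a * c - a * a / 2)
```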
Partial answer (without the intuition part): since $G_k(x)=\sum_{j=k}^n{n\choose j}F(x)^j(1-F(x))^{n-j}$, swapping the order of summation gives
$$
\begin{aligned}
\sum_{k=1}^n G_k(x) &= \sum_{j=1}^n j{n \choose j} F(x)^{j} (1-F(x))^{n-j}
= \sum_{i=0}^{n-1} (n-i){n \choose i} F(x)^{n-i} (1-F(x))^i \\
&= \sum_{i=0}^{n-1} n{n-1 \choose i} F(x)^{n-i} (1-F(x))^i
= nF(x)\sum_{i=0}^{n-1} {n-1 \choose i} F(x)^{n-1-i} (1-F(x))^i \\
&= nF(x)\,(F(x)+1-F(x))^{n-1}
= nF(x).
\end{aligned}
$$
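The identity $\sum_{k=1}^n G_k(x)=nF(x)$ can also be checked exactly in a few lines: with $p=F(x)$ rational, compare the double sum with $np$ in exact arithmetic (a sketch; names and parameter ranges are mine):

```python
# Exact check that sum_{k=1}^{n} sum_{j=k}^{n} C(n,j) p^j (1-p)^(n-j) = n*p.
from fractions import Fraction
from math import comb

def cdf_sum(n, p):
    # Double sum over k and j, i.e. the sum of the n order-statistic CDFs.
    return sum(comb(n, j) * p**j * (1 - p) ** (n - j)
               for k in range(1, n + 1) for j in range(k, n + 1))

identity_holds = all(cdf_sum(n, Fraction(m, 7)) == n * Fraction(m, 7)
                     for n in range(1, 8) for m in range(8))
```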