Chapter 2 of the Matrix Cookbook gives a nice review of matrix calculus, with a lot of useful identities for problems one encounters in probability and statistics, including rules for differentiating the multivariate Gaussian likelihood.
If you have a random vector ${\boldsymbol y}$ that is multivariate normal with mean vector ${\boldsymbol \mu}$ and covariance matrix ${\boldsymbol \Sigma}$, then equation (86) in the Matrix Cookbook shows that the gradient of the log-likelihood ${\bf L}$ with respect to ${\boldsymbol \mu}$ is
$$\begin{align}
\frac{ \partial {\bf L} }{ \partial {\boldsymbol \mu}}
&= -\frac{1}{2} \left(
\frac{\partial \left( {\boldsymbol y} - {\boldsymbol \mu} \right)'
{\boldsymbol \Sigma}^{-1} \left( {\boldsymbol y} - {\boldsymbol \mu}\right)
}{\partial {\boldsymbol \mu}} \right) \nonumber \\
&= -\frac{1}{2}
\left( -2 {\boldsymbol \Sigma}^{-1} \left( {\boldsymbol y} - {\boldsymbol \mu}\right) \right) \nonumber \\
&= {\boldsymbol \Sigma}^{-1} \left( {\boldsymbol y} - {\boldsymbol \mu} \right)
\end{align}$$
I'll leave it to you to differentiate this again with respect to ${\boldsymbol \mu}$ and find the second derivative to be $-{\boldsymbol \Sigma}^{-1}$.
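As a quick numerical sanity check, here is a small sketch (the dimension, seed, and values are made up for illustration) comparing the analytic gradient ${\boldsymbol \Sigma}^{-1}({\boldsymbol y} - {\boldsymbol \mu})$ against central finite differences of the log-likelihood:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Made-up example: check dL/dmu = Sigma^{-1} (y - mu) by finite differences.
rng = np.random.default_rng(0)
d = 3
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)   # a positive-definite covariance
mu = rng.standard_normal(d)
y = rng.standard_normal(d)

def loglik(m):
    return multivariate_normal.logpdf(y, mean=m, cov=Sigma)

analytic = np.linalg.solve(Sigma, y - mu)   # Sigma^{-1} (y - mu)

eps = 1e-6
numeric = np.array([(loglik(mu + eps * e) - loglik(mu - eps * e)) / (2 * eps)
                    for e in np.eye(d)])
print(np.max(np.abs(analytic - numeric)))   # close to zero
```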
As "extra credit", use equations (57) and (61) to find that the gradient with respect to ${\boldsymbol \Sigma}$ is
$$
\begin{align}
\frac{ \partial {\bf L} }{ \partial {\boldsymbol \Sigma}}
&= -\frac{1}{2} \left( \frac{ \partial \log(|{\boldsymbol \Sigma}|)}{\partial{\boldsymbol \Sigma}}
+ \frac{\partial \left( {\boldsymbol y} - {\boldsymbol \mu}\right)'
{\boldsymbol \Sigma}^{-1} \left( {\boldsymbol y}- {\boldsymbol \mu}\right)
}{\partial {\boldsymbol \Sigma}} \right)\\
&= -\frac{1}{2} \left( {\boldsymbol \Sigma}^{-1} -
{\boldsymbol \Sigma}^{-1}
\left( {\boldsymbol y} - {\boldsymbol \mu} \right)
\left( {\boldsymbol y} - {\boldsymbol \mu} \right)'
{\boldsymbol \Sigma}^{-1} \right)
\end{align}
$$
I've left out a lot of the steps, but I made this derivation using only the identities found in the Matrix Cookbook, so I'll leave it to you to fill in the gaps.
I've used these score equations for maximum likelihood estimation, so I know they are correct :)
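The score with respect to ${\boldsymbol \Sigma}$ can be checked the same way; a sketch (again with made-up values) that finite-differences each entry of ${\boldsymbol \Sigma}$ separately, writing the log-likelihood by hand so the perturbed matrix is allowed to be slightly non-symmetric during differencing:

```python
import numpy as np

# Made-up example: check
#   dL/dSigma = -1/2 (Sigma^{-1} - Sigma^{-1}(y-mu)(y-mu)' Sigma^{-1})
rng = np.random.default_rng(0)
d = 3
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)   # a positive-definite covariance
mu = rng.standard_normal(d)
y = rng.standard_normal(d)
r = y - mu

def loglik(S):
    # log N(y; mu, S), written out so S need not be exactly symmetric
    sign, logdet = np.linalg.slogdet(S)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + r @ np.linalg.solve(S, r))

Si = np.linalg.inv(Sigma)
analytic = -0.5 * (Si - Si @ np.outer(r, r) @ Si)

eps = 1e-6
numeric = np.zeros((d, d))
for k in range(d):
    for l in range(d):
        E = np.zeros((d, d))
        E[k, l] = eps
        numeric[k, l] = (loglik(Sigma + E) - loglik(Sigma - E)) / (2 * eps)
print(np.max(np.abs(analytic - numeric)))   # close to zero
```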
Since the Gaussian copula results from taking a multivariate normal and transforming the margins to uniformity, a multivariate distribution with Gaussian copula and normal margins is multivariate normal.
Transforming the margins to normality merely undoes the original transform to uniform margins to obtain the copula.
See the second sentence of the Gaussian copula section of the Wikipedia article on copulas for confirmation.
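To see this concretely, here is a tiny sketch (the correlation 0.7 and sample size are arbitrary choices): pushing multivariate-normal draws through $\Phi$ gives Gaussian-copula samples with uniform margins, and pushing those through $\Phi^{-1}$ recovers the original multivariate-normal draws:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal([0.0, 0.0], cov, size=1000)  # multivariate normal
u = norm.cdf(z)   # uniform margins: a sample from the Gaussian copula
x = norm.ppf(u)   # transform the margins back to normal

print(np.allclose(x, z))   # True: we are back at the multivariate normal
```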
Best Answer
I have found the answer to this.
There is a result,
$\frac{\partial}{\partial\rho_{ij}}f(x;0,\Sigma)=\frac{\partial^2}{\partial x_i\partial x_j}f(x;0,\Sigma)$ ("A reduction formula for normal multivariate integrals", Plackett 1954).
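Plackett's identity is easy to verify numerically in the bivariate case; a sketch (the evaluation point and $\rho$ are arbitrary) comparing a central difference in $\rho$ with a mixed central difference in $x_1, x_2$:

```python
import numpy as np
from scipy.stats import multivariate_normal

x1, x2, rho = 0.3, -0.5, 0.6

def pdf(a, b, r):
    # bivariate standard normal density with correlation r
    cov = [[1.0, r], [r, 1.0]]
    return multivariate_normal.pdf([a, b], mean=[0.0, 0.0], cov=cov)

h = 1e-4
# d f / d rho
d_rho = (pdf(x1, x2, rho + h) - pdf(x1, x2, rho - h)) / (2 * h)
# d^2 f / (d x1 d x2), mixed central difference
d_x1x2 = (pdf(x1 + h, x2 + h, rho) - pdf(x1 + h, x2 - h, rho)
          - pdf(x1 - h, x2 + h, rho) + pdf(x1 - h, x2 - h, rho)) / (4 * h * h)
print(d_rho, d_x1x2)   # the two numbers agree
```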
So, using this result and exchanging the derivative and the integral, we just need to be able to differentiate a normal CDF with respect to two of its variables. We can do this by first applying the fundamental theorem of calculus and then conditioning on those two variables,
$\frac{\partial}{\partial\rho_{ij}}F(x;0,\Sigma)=f(x_i,x_j;0,\Sigma_{ij})\,F\left(x_{-\{i,j\}}\mid x_i,x_j\right),$ where $x_{-\{i,j\}}$ denotes the components of $x$ other than $x_i$ and $x_j$, and $\Sigma_{ij}$ is the $2\times 2$ covariance block of $(x_i,x_j)$.
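In the bivariate case the conditional factor is empty, so the formula reduces to $\partial F(x_1,x_2)/\partial\rho = f(x_1,x_2)$. A sketch of a numerical check (the point and $\rho$ are arbitrary; the CDF is computed here by deterministic quadrature, so agreement is only approximate):

```python
import numpy as np
from scipy import integrate
from scipy.stats import multivariate_normal

x1, x2, rho = 0.3, -0.5, 0.6

def F(r):
    # P(X1 <= x1, X2 <= x2) by numerical integration of the density;
    # the lower limit -8 truncates a negligible amount of tail mass
    pdf = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]]).pdf
    val, _ = integrate.dblquad(lambda b, a: pdf([a, b]), -8.0, x1, -8.0, x2)
    return val

h = 1e-4
numeric = (F(rho + h) - F(rho - h)) / (2 * h)   # dF/drho by central difference
density = multivariate_normal(mean=[0.0, 0.0],
                              cov=[[1.0, rho], [rho, 1.0]]).pdf([x1, x2])
print(numeric, density)   # approximately equal
```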