I don't have a combinatorial interpretation, but you can think of it as follows.
$(X,Y)$ is the result of applying an affine transformation to a pair
$(W,Z)$ of independent standard normal random variables. Many such
transformations exist, and one in particular is
$$\begin{align*}
X &= \mu_x + \sigma_x W\\
Y &= \mu_y + \rho \sigma_y W + \sqrt{1-\rho^2} \sigma_y Z
\end{align*}$$
The contours of the joint density (points at
equal height above the $x$-$y$ plane) are ellipses centered at $(\mu_x,\mu_y)$.
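If a concrete check helps, here is a minimal sampling sketch of that construction (the parameter values are arbitrary, chosen only for illustration):

```python
import numpy as np

# Arbitrary illustrative parameters (not from the question)
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y, rho = 2.0, 0.5, 0.7

rng = np.random.default_rng(0)
n = 1_000_000

# Independent standard normals W, Z
W = rng.standard_normal(n)
Z = rng.standard_normal(n)

# The affine transformation from the text
X = mu_x + sigma_x * W
Y = mu_y + rho * sigma_y * W + np.sqrt(1 - rho**2) * sigma_y * Z

# Empirical correlation of (X, Y) should be close to rho
print(np.corrcoef(X, Y)[0, 1])  # ~0.7
```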
The multivariate normal distribution of a vector $x\in \mathbf{R}^n$ has density $$f(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} e^{-\frac{1}{2}(x-\mu)^T\Sigma^{-1}(x-\mu)}$$
where $\Sigma\in \mathbf{R}^{n\times n}$ is the covariance matrix, with $\Sigma_{ij} = \mathbf{Cov}(x_i,x_j)$ (the special case $i=j$ is just $\mathbf{var}(x_i)$), $|\Sigma|$ is its determinant, and $\mu\in\mathbf{R}^n$ is the mean vector.
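As a sanity check on this formula, one can evaluate it directly and compare against scipy's implementation (a sketch with arbitrary parameters; scipy is used only as a reference):

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -2.0])
Sigma = np.array([[4.0, 0.7],
                  [0.7, 0.25]])   # must be symmetric positive definite
x = np.array([0.5, -1.5])

# Density formula written out directly
n = len(mu)
d = x - mu
f = np.exp(-0.5 * d @ np.linalg.solve(Sigma, d)) / \
    np.sqrt((2 * np.pi) ** n * np.linalg.det(Sigma))

# Reference implementation
print(f, multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # should agree
```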
With all this out of the way, your second question about how to represent the distribution in $XY$ rather than $UV$ is already answered by the definition of the normal distribution. Specifically,
$$\mu = [\mu_x, \mu_y]^T$$
$$\Sigma = \begin{bmatrix}\sigma_x^2 & \rho\sigma_x\sigma_y\\ \rho\sigma_x\sigma_y & \sigma_y^2\end{bmatrix}$$
$$|\Sigma| = \sigma_x^2\sigma_y^2(1-\rho^2)$$
where the off-diagonal entries are the covariance $\mathbf{Cov}(X,Y) = \rho\sigma_x\sigma_y$.
Plug in $x = [X,Y]^T$ and that is the distribution.
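For instance, a quick numerical check of this covariance matrix and its determinant (arbitrary values again):

```python
import numpy as np

# Arbitrary illustrative values
sigma_x, sigma_y, rho = 2.0, 0.5, 0.7

Sigma = np.array([[sigma_x**2,              rho * sigma_x * sigma_y],
                  [rho * sigma_x * sigma_y, sigma_y**2]])

# Determinant should equal sigma_x^2 * sigma_y^2 * (1 - rho^2)
print(np.linalg.det(Sigma))
print(sigma_x**2 * sigma_y**2 * (1 - rho**2))
```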
For your first question, notice that we can relate $UV$ and $XY$ by a linear (technically affine) transformation
$$\begin{bmatrix}
U\\
V
\end{bmatrix} =
\begin{bmatrix}
1/\sigma_x & 0\\
0 & 1/\sigma_y
\end{bmatrix}
\begin{bmatrix}
X\\
Y
\end{bmatrix}
+ \begin{bmatrix}
-\mu_x /\sigma_x\\
-\mu_y / \sigma_y
\end{bmatrix}
$$
More succinctly, we have $u = Ax + b$. It just so happens that an affine transformation of jointly Gaussian random variables is again Gaussian (this is not obvious). And since we know the Gaussian PDF is determined by the two parameters $\mu$ and $\Sigma$, if we can find those, then we are done.
It should be clear from linearity of expectation that the mean of $u\in \mathbf{R}^2$ is just $A\mu + b$. For the covariance we have the identity $\mathbf{cov}(Az + b) = A\mathbf{cov}(z)A^T$ (this also more or less falls out from linearity of expectation, and you can find numerous proofs online).
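That identity is easy to check empirically; here is a sketch with arbitrary $A$, $b$, and covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary A, b and a correlated Gaussian sample z
A = np.array([[0.5, 0.0],
              [0.0, 2.0]])
b = np.array([-0.5, -4.0])
z = rng.multivariate_normal(mean=[1.0, -2.0],
                            cov=[[4.0, 0.7], [0.7, 0.25]],
                            size=1_000_000)

u = z @ A.T + b  # u = Az + b, applied row by row

# Empirical covariance of u vs. the identity A cov(z) A^T
print(np.cov(u.T))
print(A @ np.cov(z.T) @ A.T)
```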
Doing everything out we find that the distribution of $UV$ is Gaussian with mean vector
$$\mu_{UV} =
\begin{bmatrix}
1/\sigma_x & 0\\
0 & 1/\sigma_y
\end{bmatrix}
\begin{bmatrix}
\mu_x\\
\mu_y
\end{bmatrix} +
\begin{bmatrix}
-\mu_x /\sigma_x\\
-\mu_y / \sigma_y
\end{bmatrix} =
\begin{bmatrix}
0\\
0
\end{bmatrix}
$$
$$\Sigma_{UV} =
\begin{bmatrix}
1/\sigma_x & 0\\
0 & 1/\sigma_y
\end{bmatrix}
\begin{bmatrix}\sigma_x^2 & \rho\sigma_x\sigma_y\\
\rho\sigma_x\sigma_y & \sigma_y^2
\end{bmatrix}
\begin{bmatrix}
1/\sigma_x & 0\\
0 & 1/\sigma_y
\end{bmatrix} =
\begin{bmatrix}
1 & \rho \\
\rho & 1
\end{bmatrix}
$$
You can go ahead and compute the inverse and determinant and see that you get the joint PDF you have described.
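For the record, that computation gives
$$\Sigma_{UV}^{-1} = \frac{1}{1-\rho^2}\begin{bmatrix}1 & -\rho\\ -\rho & 1\end{bmatrix}, \qquad |\Sigma_{UV}| = 1-\rho^2,$$
which, substituted into the density formula above, yields the standard bivariate normal PDF with correlation $\rho$.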
Best Answer
Let $X$ be normal, mean $\mu$, variance $\sigma^2$, and let $Y$ be normal, mean $\nu$, variance $\tau^2$.
Since $X$ and $Y$ are independent, $aX+bY$ is normal, with mean $a\mu+b\nu$ and variance $a^2\sigma^2+b^2\tau^2$. This fact is quite useful, and will help you compute the moment-generating function $M_Z(t)$ of $Z$.
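A quick Monte Carlo sanity check of that mean and variance (a sketch; the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Arbitrary illustrative parameters
mu, sigma = 1.0, 2.0    # X ~ N(mu, sigma^2)
nu, tau = -2.0, 0.5     # Y ~ N(nu, tau^2)
a, b = 3.0, -1.0

X = rng.normal(mu, sigma, size=1_000_000)
Y = rng.normal(nu, tau, size=1_000_000)
S = a * X + b * Y

print(S.mean(), a * mu + b * nu)                 # mean: a*mu + b*nu
print(S.var(), a**2 * sigma**2 + b**2 * tau**2)  # variance: a^2 sigma^2 + b^2 tau^2
```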
Now we find a formula for the moment generating function of $X$. We want to find the expectation of $e^{tX}$ (we are computing this one, instead of doing the essentially identical calculation of $E(e^{tY})$, because of a preference for the letter $X$.) We have $$E(e^{tX})= \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\sigma}e^{tx}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx.$$ It remains to evaluate this integral. We do the calculation in two steps, though it could be done in one. Make the substitution $w=\dfrac{x-\mu}{\sigma}$. Routine use of the substitution process shows that our integral is $$\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{t(\mu+\sigma w)}e^{-\frac{w^2}{2}}dw.$$ We can simplify this to $$e^{t\mu}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{w^2}{2} +t\sigma w}dw.$$ Look at the exponent in the above integral. It is $-\frac{w^2}{2}+t\sigma w$, which is $-\frac{1}{2}(w^2-2t\sigma w)$. Completing the square, we find that the exponent is $$-\frac{1}{2}(w-t\sigma)^2+\frac{t^2\sigma^2}{2}.$$ We can bring the
$e^{\frac{t^2\sigma^2}{2}}$ part "out" of the integral, and find that our integral is $$e^{t\mu+\frac{t^2\sigma^2}{2}}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{(w-t\sigma)^2}{2}}dw.$$ It's almost over! Make the substitution $z=w-t\sigma$. We get that our integral is $$e^{t\mu+\frac{t^2\sigma^2}{2}}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-\frac{z^2}{2}}dz.$$ The remaining integral has value $1$, since it is the area under the (standard) normal curve. We conclude that $$E(e^{tX})=e^{t\mu+\frac{t^2\sigma^2}{2}}.$$
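The closed form is easy to spot-check numerically (a sketch with arbitrary $\mu$, $\sigma$, $t$):

```python
import numpy as np

rng = np.random.default_rng(3)

# Arbitrary illustrative parameters; keep t*sigma modest so the
# Monte Carlo estimate of E(e^{tX}) has reasonable variance
mu, sigma, t = 1.0, 0.8, 0.5

X = rng.normal(mu, sigma, size=1_000_000)
print(np.exp(t * X).mean())                  # Monte Carlo estimate of E(e^{tX})
print(np.exp(t * mu + t**2 * sigma**2 / 2))  # closed form derived above
```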
You will not have to sweat like this to find $E(e^{tZ})$. You can probably just borrow the result we have just proved, and replace $\mu$ by $a\mu+b\nu$, and replace $\sigma^2$ by $a^2\sigma^2+b^2\tau^2$.
But maybe not. It all depends on whether it has been proved, or you can take for granted, that a linear combination of independent normals is normal.
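If that fact is available to you, the substitution gives, explicitly,
$$M_Z(t) = e^{t(a\mu+b\nu)+\frac{t^2\left(a^2\sigma^2+b^2\tau^2\right)}{2}}.$$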