$\DeclareMathOperator{\Var}{Var}$
$\DeclareMathOperator{\Cov}{Cov}$
$\DeclareMathOperator{\tr}{tr}$
The variance of $z$ can be computed as follows. (Suppose $A$ is symmetric, which is the standard setting when quadratic forms are discussed; for the non-symmetric case, rewrite $z$ as $z = x'Bx + b'x$, where $B = (A + A')/2$, and replace $A$ below with the symmetric matrix $B$. This works because $x'Ax = x'Bx$ for every $x$.)
In this thread, it is shown that when $x \sim N(\mu, \Sigma)$ and $A$ is symmetric,
\begin{align}
\Var(x'Ax) = 2\tr((A\Sigma)^2) + 4\mu'A\Sigma A\mu.
\end{align}
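As a quick numerical sanity check (not needed for the derivation), the identity can be verified by simulation; the matrices $A$, $\Sigma$ and the mean $\mu$ below are arbitrary illustrative choices:
```python
# Monte Carlo check of Var(x'Ax) = 2 tr((A Sigma)^2) + 4 mu' A Sigma A mu
# for x ~ N(mu, Sigma) with symmetric A; test values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n = 3
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                    # symmetric A
L = rng.standard_normal((n, n))
Sigma = L @ L.T + n * np.eye(n)      # positive definite covariance
mu = rng.standard_normal(n)

x = rng.multivariate_normal(mu, Sigma, size=1_000_000)
quad = np.einsum('ij,jk,ik->i', x, A, x)  # x'Ax, one value per sample

theory = 2 * np.trace(A @ Sigma @ A @ Sigma) + 4 * mu @ A @ Sigma @ A @ mu
print(quad.var(), theory)  # should agree up to Monte Carlo error
```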
Therefore, to evaluate
\begin{align}
\Var(z) = \Var(x'Ax + b'x) = \Var(x'Ax) + \Var(b'x) + 2\Cov(x'Ax, b'x), \tag{1}
\end{align}
it suffices to evaluate
\begin{align}
\Cov(x'Ax, b'x) = E[x'Axb'x] - E[x'Ax]E[b'x].
\end{align}
Since $E[x'Ax] = \mu'A\mu + \tr(A\Sigma)$ and $E[b'x] = b'\mu$, the only difficult part left is $E[x'Axb'x]$. To this end, write $x = \mu + y$, where $y \sim N(0, \Sigma)$; then
\begin{align}
& x'Axb'x = (\mu + y)'A(\mu + y)b'(\mu + y) \\
=& (\mu'A\mu + 2\mu'Ay + y'Ay)(b'\mu + b'y) \\
=& \mu'A\mu b'\mu + \mu'A\mu b'y + 2b'\mu\mu'Ay + 2\mu'Ayb'y + b'\mu y'Ay + y'Ayb'y.
\end{align}
In my answer to this thread, it is argued that for $y \sim N(0, \Sigma)$, we have
\begin{align}
E[y'Ayb'y] = 0.
\end{align}
In addition,
\begin{align}
& E[\mu'Ayb'y] = E[y'b\mu'Ay] = \tr(b\mu'A\Sigma) = \mu'A\Sigma b, \\
& E[b'\mu y'Ay] = b'\mu\tr(A\Sigma).
\end{align}
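Both identities are easy to confirm numerically for $y \sim N(0, \Sigma)$; the test values below are again arbitrary:
```python
# Checks E[y'Ay b'y] = 0 and E[mu'Ay b'y] = mu' A Sigma b for y ~ N(0, Sigma).
import numpy as np

rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                    # symmetric A
L = rng.standard_normal((n, n))
Sigma = L @ L.T + n * np.eye(n)      # positive definite covariance
mu = rng.standard_normal(n)
b = rng.standard_normal(n)

y = rng.multivariate_normal(np.zeros(n), Sigma, size=1_000_000)
yAy = np.einsum('ij,jk,ik->i', y, A, y)   # y'Ay per sample
by = y @ b                                # b'y per sample
muAy = y @ (A @ mu)                       # mu'Ay per sample

print((yAy * by).mean())                       # approximately 0
print((muAy * by).mean(), mu @ A @ Sigma @ b)  # should match
```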
It thus follows that
\begin{align}
& E[x'Axb'x] = \mu'A\mu b'\mu + 2\mu'A\Sigma b + b'\mu\tr(A\Sigma), \\
& \Cov(x'Ax, b'x) = \mu'A\mu b'\mu + 2\mu'A\Sigma b + b'\mu\tr(A\Sigma) -
(\mu'A\mu + \tr(A\Sigma))b'\mu \\
&\phantom{\Cov(x'Ax, b'x)} = 2\mu'A\Sigma b.
\end{align}
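A direct simulation of the covariance agrees with $2\mu'A\Sigma b$ (arbitrary test values again):
```python
# Monte Carlo check of Cov(x'Ax, b'x) = 2 mu' A Sigma b.
import numpy as np

rng = np.random.default_rng(2)
n = 3
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
L = rng.standard_normal((n, n))
Sigma = L @ L.T + n * np.eye(n)
mu = rng.standard_normal(n)
b = rng.standard_normal(n)

x = rng.multivariate_normal(mu, Sigma, size=1_000_000)
quad = np.einsum('ij,jk,ik->i', x, A, x)  # x'Ax per sample
lin = x @ b                               # b'x per sample

print(np.cov(quad, lin)[0, 1], 2 * mu @ A @ Sigma @ b)  # should agree
```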
Substituting
\begin{align}
& \Var(x'Ax) = 2\tr((A\Sigma)^2) + 4\mu'A\Sigma A\mu, \\
& \Var(b'x) = b'\Sigma b, \\
& \Cov(x'Ax, b'x) = 2\mu'A\Sigma b
\end{align}
into $(1)$, we get
\begin{align}
\Var(z) = 2\tr((A\Sigma)^2) + 4\mu'A\Sigma A\mu + b'\Sigma b + 4\mu'A\Sigma b.
\end{align}
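For completeness, here is an end-to-end Monte Carlo check of the final formula, once more with arbitrary test values:
```python
# Checks Var(z) = 2 tr((A Sigma)^2) + 4 mu'A Sigma A mu + b'Sigma b + 4 mu'A Sigma b
# for z = x'Ax + b'x with x ~ N(mu, Sigma) and symmetric A.
import numpy as np

rng = np.random.default_rng(3)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
L = rng.standard_normal((n, n))
Sigma = L @ L.T + n * np.eye(n)
mu = rng.standard_normal(n)
b = rng.standard_normal(n)

x = rng.multivariate_normal(mu, Sigma, size=1_000_000)
z = np.einsum('ij,jk,ik->i', x, A, x) + x @ b

theory = (2 * np.trace(A @ Sigma @ A @ Sigma)
          + 4 * mu @ A @ Sigma @ A @ mu
          + b @ Sigma @ b
          + 4 * mu @ A @ Sigma @ b)
print(z.var(), theory)  # should agree up to simulation noise
```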
Best Answer
One important question is whether you want a metric that says, for example, that a Binomial(20, 0.5) distribution is close to a Normal (because, by the central limit theorem, it visibly is) or far from a Normal (because it is discrete).
Another question is whether you want a metric for all distributions, or just one for empirical distributions of datasets (so everything is discrete, with probability masses that are multiples of $1/n$).
Yet another question is how much you care about ease of computation, and in how many dimensions.
For a metric on all distributions that says the Normal and Binomial are close, you want something that metrizes convergence in distribution, such as the Lévy-Prokhorov metric or the Wasserstein distance. (The total variation and Hellinger distances do not work here: both are maximal between a discrete and a continuous distribution, since the two are mutually singular.)
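As a quick illustration (not part of the original answer), the one-dimensional Wasserstein distance available in scipy indeed treats Binomial(20, 0.5) and its Normal approximation $N(10, 5)$ as close; the sample size here is an arbitrary choice:
```python
# Wasserstein distance between Binomial(20, 0.5) samples and samples
# from its Normal approximation N(10, 5).
import numpy as np
from scipy.stats import binom, norm, wasserstein_distance

rng = np.random.default_rng(0)
xb = binom(20, 0.5).rvs(size=100_000, random_state=rng)
xn = norm(10, np.sqrt(5)).rvs(size=100_000, random_state=rng)

# Small compared with the spread (sd ~ 2.24): the distributions are "close".
print(wasserstein_distance(xb, xn))
```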
For a metric on all distributions that says the Normal and Binomial are not close, you want something based on the likelihood ratio. If it doesn't have to be literally a metric in the topological sense, the Kullback-Leibler divergence would do. If it does, note that even the symmetrised Kullback-Leibler divergence fails the triangle inequality; the square root of the Jensen-Shannon divergence, however, is a genuine metric.
For something that's easy to compute on datasets (but hard to work with mathematically) you could use a nearest-neighbour distance such as $\max_y\min_x d(x,y)$, the largest distance from a point in one dataset to its nearest neighbour in the other. That quantity is not symmetric, but you can add the same thing with $x$ and $y$ switched to make it symmetric. Or replace the $\max_y$ by a mean over $y$. Both variants are sketched below.
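Here is a minimal sketch of that nearest-neighbour distance, assuming two numeric datasets `X` and `Y` as arrays of shape (points, dimensions); the `reduce` argument switches between the max and mean variants, and the symmetrisation-by-adding follows the recipe above:
```python
# Symmetrised nearest-neighbour distance between two datasets,
# computed with scipy's KD-tree for fast nearest-neighbour queries.
import numpy as np
from scipy.spatial import cKDTree

def nn_distance(X, Y, reduce=np.max):
    """reduce=np.max gives the Hausdorff-style version; np.mean averages."""
    d_xy = cKDTree(Y).query(X)[0]  # each point of X to its nearest point of Y
    d_yx = cKDTree(X).query(Y)[0]  # each point of Y to its nearest point of X
    return reduce(d_xy) + reduce(d_yx)  # symmetrised by adding both directions

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
Y = rng.normal(size=(500, 2))
print(nn_distance(X, Y))             # max-based (Hausdorff-style) variant
print(nn_distance(X, Y, np.mean))    # mean-based variant
```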