NB: Throughout what follows, we denote the standard normal density function by $\varphi(x) = (2\pi)^{-1/2} \exp(-\frac{1}{2} x^2)$ and its cumulative distribution function by $\Phi(x)$.
Method 1: Avoid calculus.
This is a consequence of a much more general result and, in fact, has nothing in particular to do with the normal distribution at all. Here is a slightly simplified version.
Let $F$ be a strictly increasing distribution function on some interval $(a,b)$ such that $F(a) = 0$ and $F(b) = 1$. We allow $a = -\infty$ and $b=\infty$, so that we can handle the case where $F$ is defined on the entire real line, as in your example.
Suppose $X$ is a random variable with distribution function $F$. Then, $Y = F(X)$ has a uniform distribution on $(0,1)$. The proof is simple. For $y \in (0,1)$,
$$
\renewcommand{\Pr}{\mathbb{P}}
\Pr(Y \leq y) = \Pr( F(X) \leq y) = \Pr(X \leq F^{-1}(y)) = F( F^{-1}(y) ) = y \> ,
$$
where the inverse $F^{-1}$ exists by the hypothesis that $F$ is strictly increasing on $(a,b)$.
Hence, $Y$ is distributed uniformly on $(0,1)$ and, as a consequence, $\newcommand{\e}{\mathbb{E}}\e Y = 1/2$.
Note that for your particular case, you start with $X \sim \mathcal{N}(a, \sigma^2)$ and so $Y = \Phi((X-a)/\sigma)$. Your problem can easily be seen to be equivalent to asking for $\e (1 - Y) = 1 - \e Y = 1/2$.
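If you want to convince yourself numerically, here is a quick Monte Carlo sketch of Method 1 (the particular values of $a$ and $\sigma$ below are arbitrary choices, not from the problem):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
a, sigma = 1.3, 0.7                       # arbitrary parameters for illustration
x = rng.normal(a, sigma, size=1_000_000)  # draws of X ~ N(a, sigma^2)

# Y = Phi((X - a)/sigma) should be uniform on (0, 1), hence have mean 1/2.
y = norm.cdf((x - a) / sigma)
print(y.mean())                                    # ~ 0.5
print(np.histogram(y, bins=10, range=(0, 1))[0])   # roughly equal bin counts
```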
Method 2: Hammer and tongs (i.e., use calculus).
Note that, because $\varphi$ is symmetric about zero, we have $\Phi(-(x-a)/\sigma) = \int_{(x-a)/\sigma}^{\infty} \varphi(y) \,\mathrm{d}y$, and so your integral can be written as
$$
\newcommand{\rd}{\mathrm{d}}
\int_{-\infty}^\infty \int_{(x-a)/\sigma}^{\infty} \varphi(y) \frac{1}{\sigma} \varphi((x-a)/\sigma) \,\rd y \, \rd x = \int_{-\infty}^\infty \int_{-\infty}^{\sigma y + a} \varphi(y) \frac{1}{\sigma} \varphi((x-a)/\sigma) \,\rd x \, \rd y \> ,
$$
where the right-hand side follows because the region $\{y \geq (x-a)/\sigma\}$ is exactly the region $\{x \leq \sigma y + a\}$.
Now, change variables using $u = (x-a)/\sigma$, which gives the integral
$$
\int_{-\infty}^\infty \int_{-\infty}^y \varphi(u) \varphi(y) \, \rd u \,\rd y \>.
$$
Exchanging the order of the iterated integrals, we get
$$
\int_{-\infty}^\infty \int_{-\infty}^y \varphi(u) \varphi(y) \, \rd u \,\rd y = \int_{-\infty}^\infty \int_u^\infty \varphi(u) \varphi(y) \, \rd y \,\rd u \>.
$$
But, since $\varphi$ is a probability density function,
$$
\int_{-\infty}^\infty \int_{-\infty}^y \varphi(u) \varphi(y) \, \rd u \,\rd y + \int_{-\infty}^\infty \int_y^\infty \varphi(u) \varphi(y) \, \rd u \,\rd y = 1 \>,
$$
and renaming the variables ($u \leftrightarrow y$) in the second term turns it into the right-hand side of the previous display, i.e., into the first term. Hence the two terms are equal, and the integral must be $1/2$.
(All exchanges of order of integration are valid by Fubini's theorem.)
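As a numerical cross-check of Method 2, one can also hand the original integral to a quadrature routine (again, the values of $a$ and $\sigma$ below are arbitrary):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

a, sigma = 1.3, 0.7  # arbitrary parameters for illustration

# Integrand: Phi(-(x - a)/sigma) * (1/sigma) * phi((x - a)/sigma)
def integrand(x):
    return norm.cdf(-(x - a) / sigma) * norm.pdf((x - a) / sigma) / sigma

val, err = quad(integrand, -np.inf, np.inf)
print(val)  # ~ 0.5, independent of a and sigma
```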
This is known as the (indeterminate) moment problem and was first considered by Stieltjes and Hamburger. In general, the answer to your question is no: distributions are not uniquely determined by their moments.
The standard counterexample is the following (see, e.g., Rick Durrett, Probability: Theory and Examples): the lognormal density
$$p(x) := \frac{1}{x\sqrt{2\pi}} \exp \left(- \frac{(\log x)^2}{2} \right), \qquad x > 0,$$
and the "perturbed" lognormal density
$$q(x) := p(x) \left(1+ \sin(2\pi \log x)\right)$$
have the same moments. (To see this, substitute $x = e^s$ in $\int_0^\infty x^k p(x) \sin(2\pi \log x) \,\mathrm{d}x$ and complete the square; the integral becomes $e^{k^2/2} \int_{-\infty}^\infty \varphi(s-k) \sin(2\pi s) \,\mathrm{d}s$, which vanishes for every integer $k \geq 0$, since shifting $s$ by $k$ leaves $\sin(2\pi s)$ unchanged and $\varphi(s)\sin(2\pi s)$ is odd.)
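One can also verify the moment identity numerically. The sketch below computes the first few moments of $p$ and $q$ after the substitution $x = e^s$ (my choice, since this form is better behaved for quadrature than integrating over $(0,\infty)$ directly):

```python
import numpy as np
from scipy.integrate import quad

# Standard normal density (the phi of this thread).
phi = lambda s: np.exp(-0.5 * s**2) / np.sqrt(2 * np.pi)

for k in range(5):
    # k-th moment of p: integral of exp(k*s) * phi(s) ds, after x = exp(s)
    mp, _ = quad(lambda s: np.exp(k * s) * phi(s), -np.inf, np.inf)
    # k-th moment of q: same integrand times (1 + sin(2*pi*s))
    mq, _ = quad(lambda s: np.exp(k * s) * phi(s) * (1 + np.sin(2 * np.pi * s)),
                 -np.inf, np.inf)
    print(k, mp, mq)  # the two columns agree; both equal exp(k**2 / 2)
```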
Much more interesting is the question of which additional assumptions make the moments determining. @StefanHansen already mentioned the existence of exponential moments, but obviously that is a strong condition. Some years ago Christian Berg showed that the so-called Hankel matrices $\mathcal{H}_N = (m_{i+j})_{0 \leq i,j \leq N}$, built from the moments $m_k$, are strongly related to this problem; in fact, one can show that the moment problem is determinate if and only if the smallest eigenvalue of $\mathcal{H}_N$ converges to $0$ as $N \to \infty$. For a more detailed discussion see e.g. this introduction or Christian Berg's paper.
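To illustrate the criterion, here is a small sketch (my own construction, not taken from Berg's paper) that builds the Hankel matrices from the moments of the standard normal distribution; since the normal distribution is moment-determinate, the smallest eigenvalue should drift toward $0$ as $N$ grows:

```python
import math
import numpy as np
from scipy.linalg import hankel

def normal_moment(k):
    """E[Z^k] for Z ~ N(0, 1): (k - 1)!! for even k, 0 for odd k."""
    return math.prod(range(1, k, 2)) if k % 2 == 0 else 0

for N in range(1, 8):
    m = [normal_moment(k) for k in range(2 * N + 1)]
    H = hankel(m[:N + 1], m[N:])           # Hankel matrix with H[i, j] = m[i + j]
    print(N, np.linalg.eigvalsh(H).min())  # decreases toward 0
```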
Best Answer
I don't know if this particularly helps (since it does not relate directly to the moments of the distributions $F_1$ and $F_2$, and relies on elementary calculations), but maybe it will foster further discussion. Let $f=\overline{F}_1$ and $g=\overline{F}_2$ denote the survival functions, write $\|f\|^2=\int f^2(x)\,\mathrm{d}x$, and assume that $F_1, F_2$ have the same mean and that $\|f\|, \|g\| < \infty$. Then $G'-G\ge 0$ if and only if
$$ 2\int (f^2-g^2)+\int(f-g)^2\ge 0. $$
This shows immediately that if $\|f\|>\|g\|$ then $G'-G\ge 0$. So assume $\|f\|<\|g\|$ and write $z=\|g\|/\|f\| >1$. Further transformations give another, possibly useful, equivalent condition:
$$ \|f\| \|g\| \left( \frac{3}{z}-z-2\frac{\langle f,g\rangle}{\|f\|\,\|g\|}\right)\ge 0. $$
The term $\langle f,g\rangle/(\|f\|\,\|g\|)$ is the 'angle' or 'correlation' between $f$ and $g$ in $L^2$ (not to be confused with the correlation between random variables with distribution functions $F_1$ and $F_2$), and hence takes values in $[-1,1]$. As a result, $G'-G<0$ if $z>3$, while for $z\in(1,3)$ it can go either way (depending on this 'correlation').
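For concreteness, here is a numerical sketch of these quantities for one equal-mean pair of my own choosing (an Exp(1) versus a Uniform(0, 2), both with mean 1); $G$ and $G'$ are as in the question above, so the snippet only evaluates $z$ and the $L^2$ 'correlation' entering the sign condition:

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)                # survival function of Exp(1), mean 1
g = lambda x: np.clip(1 - x / 2, 0, 1)  # survival function of Uniform(0, 2), mean 1

norm_f = np.sqrt(quad(lambda x: f(x) ** 2, 0, np.inf)[0])  # ||f|| = sqrt(1/2)
norm_g = np.sqrt(quad(lambda x: g(x) ** 2, 0, 2)[0])       # ||g|| = sqrt(2/3)
inner, _ = quad(lambda x: f(x) * g(x), 0, 2)               # <f, g>

z = norm_g / norm_f
rho = inner / (norm_f * norm_g)
print(z, rho)               # z ~ 1.15 lies in (1, 3): z alone does not decide the sign
print(3 / z - z - 2 * rho)  # negative here, so by the equivalence above G' - G < 0
```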