NB: Throughout what follows, we denote the standard normal density function by $\varphi(x) = (2\pi)^{-1/2} \exp(-\frac{1}{2} x^2)$ and its cumulative distribution function by $\Phi(x)$.
Method 1: Avoid calculus.
This is a consequence of a much more general result and, in fact, has nothing in particular to do with the normal distribution at all. Here is a slightly simplified version.
Let $F$ be a strictly increasing distribution function on some interval $(a,b)$ such that $F(a) = 0$ and $F(b) = 1$. We allow $a = -\infty$ and $b=\infty$, so that we can handle the case where $F$ is defined on the entire real line, as in your example.
Suppose $X$ is a random variable with distribution function $F$. Then, $Y = F(X)$ has a uniform distribution on $(0,1)$. The proof is simple. For $y \in (0,1)$,
$$
\renewcommand{\Pr}{\mathbb{P}}
\Pr(Y \leq y) = \Pr( F(X) \leq y) = \Pr(X \leq F^{-1}(y)) = F( F^{-1}(y) ) = y \> ,
$$
where the inverse $F^{-1}$ exists by the hypothesis that $F$ is strictly increasing on $(a,b)$.
Hence, $Y$ is distributed uniformly on $(0,1)$ and, as a consequence, $\newcommand{\e}{\mathbb{E}}\e Y = 1/2$.
Note that for your particular case, you start with $X \sim \mathcal{N}(a, \sigma^2)$, and so $Y = \Phi((X-a)/\sigma)$. Your problem is then easily seen to be equivalent to computing $\e (1 - Y) = 1 - \e Y = 1/2$.
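Method 1 is easy to sanity-check by simulation: transform draws from $\mathcal{N}(a, \sigma^2)$ through $\Phi$ and verify that the sample mean is near $1/2$. A minimal standard-library sketch (the values $a = 1.5$, $\sigma = 2$ and the seed are arbitrary illustrative choices):

```python
import math
import random

def Phi(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(0)
a, sigma = 1.5, 2.0        # arbitrary illustrative parameters
n = 200_000

# Draw X ~ N(a, sigma^2) and apply the probability integral transform.
ys = [Phi((random.gauss(a, sigma) - a) / sigma) for _ in range(n)]

mean_y = sum(ys) / n
print(mean_y)  # close to E[Y] = 1/2
```

Any other strictly increasing distribution function in place of $\Phi$ would work just as well, which is the point of Method 1.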
Method 2: Hammer and tongs (i.e., use calculus).
Note that your integral can be written as
$$
\int_{-\infty}^\infty \int_{-\infty}^{- (x-a)/\sigma} \varphi(y) \frac{1}{\sigma} \varphi((x-a)/\sigma) \newcommand{\rd}{\mathrm{d}} \,\rd y \, \rd x = \int_{-\infty}^\infty \int_{-\infty}^{\sigma y + a} \varphi(y) \frac{1}{\sigma} \varphi((x-a)/\sigma) \,\rd x \, \rd y \> .
$$
where the second expression follows by exchanging the order of integration (which gives the inner limits $x \le a - \sigma y$) and then replacing $y$ by $-y$, using the symmetry $\varphi(-y) = \varphi(y)$. Now, change variables using $u = (x-a)/\sigma$, which gives the integral
$$
\int_{-\infty}^\infty \int_{-\infty}^y \varphi(u) \varphi(y) \, \rd u \,\rd y \>.
$$
Exchanging the order of the iterated integrals, we get
$$
\int_{-\infty}^\infty \int_{-\infty}^y \varphi(u) \varphi(y) \, \rd u \,\rd y = \int_{-\infty}^\infty \int_u^\infty \varphi(u) \varphi(y) \, \rd y \,\rd u \>.
$$
But, since $\varphi$ is a probability density function,
$$
\int_{-\infty}^\infty \int_{-\infty}^y \varphi(u) \varphi(y) \, \rd u \,\rd y + \int_{-\infty}^\infty \int_y^\infty \varphi(u) \varphi(y) \, \rd u \,\rd y = 1 \>,
$$
and so, by symmetry (relabel $u \leftrightarrow y$ in the exchanged integral to see that the two summands are equal), the integral must be $1/2$.
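Since the inner integral $\int_{-\infty}^y \varphi(u)\,\mathrm{d}u$ is just $\Phi(y)$, the value $1/2$ can also be checked numerically by integrating $\Phi(y)\varphi(y)$ over the real line. A standard-library sketch (the truncation at $\pm 8$ and the step count are arbitrary choices; the tails beyond $\pm 8$ are negligible):

```python
import math

def varphi(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# The inner integral over u equals Phi(y), so the double integral
# collapses to the single integral of Phi(y) * varphi(y).
# Trapezoid rule on [-8, 8].
n, lo, hi = 4000, -8.0, 8.0
h = (hi - lo) / n
vals = [Phi(lo + i * h) * varphi(lo + i * h) for i in range(n + 1)]
integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
print(integral)  # close to 1/2
```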
(All exchanges of order of integration are valid by Fubini's theorem.)
Best Answer
For any $a, b, \theta \in \mathbb{R}$ and $c > 0$, the following identity holds: $$ F(\theta) := \int_{ - \infty }^\infty {\Phi (\theta - a - bx)\,\varphi (cx - \theta )\,{\rm d}x} = \frac{1}{c}\Phi \bigg(\frac{{(c - b)\theta - ac}}{{\sqrt {c^2 + b^2 } }}\bigg). $$ (I confirmed the result numerically.)
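The numerical confirmation can be reproduced with a short standard-library script. This is a sketch: the test point $(\theta, a, b, c) = (0.7,\, 0.3,\, -1.2,\, 2.0)$ and the trapezoid-rule settings are arbitrary choices, and the integration window $[-10, 10]$ assumes $c$ is not too small.

```python
import math

def varphi(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lhs(theta, a, b, c, n=4000, lo=-10.0, hi=10.0):
    """Trapezoid-rule approximation of the integral F(theta)."""
    h = (hi - lo) / n
    vals = [Phi(theta - a - b * x) * varphi(c * x - theta)
            for x in (lo + i * h for i in range(n + 1))]
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

def rhs(theta, a, b, c):
    """Closed form: Phi(((c - b)*theta - a*c) / sqrt(c^2 + b^2)) / c."""
    return Phi(((c - b) * theta - a * c) / math.sqrt(c * c + b * b)) / c

theta, a, b, c = 0.7, 0.3, -1.2, 2.0   # arbitrary test point, c > 0
print(lhs(theta, a, b, c), rhs(theta, a, b, c))  # the two values agree closely
```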
Proof. First note that
$$
\frac{1}{c}\Phi \bigg(\frac{{(c - b)\theta - ac}}{{\sqrt {c^2 + b^2 } }}\bigg) = \frac{1}{c}{\rm P}\bigg[Z \le \frac{{(c - b)\theta - ac}}{{\sqrt {c^2 + b^2 } }}\bigg] = \frac{1}{c}{\rm P}\big[cX + bY \le (c - b)\theta - ac\big],
$$
where $X$, $Y$, and $Z$ are independent ${\rm N}(0,1)$ random variables (note that $\sqrt {c^2 + b^2 }\, Z$ and $cX+bY$ are identically distributed). By the law of total probability, conditioning on $Y$, we thus get
$$
\frac{1}{c}\Phi \bigg(\frac{{(c - b)\theta - ac}}{{\sqrt {c^2 + b^2 } }}\bigg) = \frac{1}{c}\int_{ - \infty }^\infty {{\rm P}\big[cX + by \le (c - b)\theta - ac\big]\varphi (y)\,{\rm d}y}.
$$
The change of variable $y = cx - \theta$ then gives
$$
\frac{1}{c}\Phi \bigg(\frac{{(c - b)\theta - ac}}{{\sqrt {c^2 + b^2 } }}\bigg) = \frac{1}{c}\int_{ - \infty }^\infty {{\rm P}\big[cX + b(cx - \theta ) \le (c - b)\theta - ac\big]\varphi (cx - \theta )\,c\,{\rm d}x} .
$$
Since $c > 0$, rearranging the inequality $cX + b(cx - \theta) \le (c - b)\theta - ac$ and dividing through by $c$ gives $X \le \theta - a - bx$, so the expression on the right is equal to
$$
\int_{ - \infty }^\infty {{\rm P}\big[X \le \theta - a - bx \big]\varphi (cx - \theta )\,{\rm d}x} = \int_{ - \infty }^\infty {\Phi (\theta - a - bx)\varphi (cx - \theta )\,{\rm d}x},
$$
which is the claimed identity.