You are almost there; just carry out your last step:
$$\begin{align} E[X] &= \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} x e^{-x^{2}/2}\,\mathrm{d}x \\ &= -\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-x^2/2}\,\mathrm{d}\!\left(-\frac{x^2}{2}\right) \\ &= -\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\,\Big|_{-\infty}^{\infty} \\ &= 0. \end{align}$$
Or you can directly use the fact that $xe^{-x^2/2}$ is an odd function and the limits of the integral are symmetric about $x=0$.
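As a quick numerical sanity check (a sketch using SciPy's `quad`, not part of the original derivation), integrating $x$ against the standard normal density over the whole real line does give zero:

```python
import numpy as np
from scipy.integrate import quad

# Integrand: x times the standard normal density (an odd function)
integrand = lambda x: x * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Numerically integrate over (-inf, inf); zero by odd symmetry
value, abs_err = quad(integrand, -np.inf, np.inf)
print(value)  # ≈ 0 up to quadrature error
```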
When $F$ is the CDF of a random variable $X$ and $g$ is a (measurable) function, the expectation of $g(X)$ can be found as a Riemann-Stieltjes integral
$$\mathbb{E}(g(X)) = \int_{-\infty}^\infty g(x) dF(x).$$
This expresses the Law of the Unconscious Statistician.
If $g$ is also differentiable, write $dF = -d(1-F)$ and integrate by parts to give
$$\mathbb{E}(g(X)) = -g(x)(1-F(x)){\big|}_{-\infty}^\infty + \int_{-\infty}^\infty (1-F(x)) g^\prime(x)\, \text{d}x$$
provided both addends converge. This means several things, which may be simply expressed by breaking the integral at some definite finite value such as $0$:
- ${\lim}_{x\to -\infty} g(x)(1-F(x))$ and ${\lim}_{x\to \infty} g(x)(1-F(x))$ exist and are finite. If so, the first addend is the difference of these two: because of the leading minus sign, it equals the limit at $-\infty$ minus the limit at $\infty$.
- $\lim_{t\to -\infty} \int_t^0 (1-F(x))g^\prime(x)\,\text{d}x$ and $\lim_{t\to \infty} \int_0^t (1-F(x))g^\prime(x)\,\text{d}x$ exist and are finite. If so, the second addend is the sum of these two.
A good place to break the integral is at any zero of $g$, because, provided $g(x)(1-F(x))$ eventually decreases fast enough for large $|x|$, that choice makes the first addend vanish, leaving only the integral of $g^\prime$ against the survival function $1-F$.
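To see the identity in action, here is a small numerical sketch (my own illustration, not from the original answer) for $X \sim \mathrm{Exp}(1)$, which has survival function $1-F(x)=e^{-x}$, and $g(x)=x^2$, which vanishes at the break point $0$. The closed form gives $\mathbb{E}[X^2]=2$, and the survival-function integral reproduces it:

```python
import numpy as np
from scipy.integrate import quad

# Exp(1): survival function 1 - F(x) = e^{-x}
survival = lambda x: np.exp(-x)
g_prime = lambda x: 2 * x  # derivative of g(x) = x^2, and g(0) = 0 kills the boundary term

lhs = 2.0  # E[X^2] for Exp(1), known in closed form
rhs, _ = quad(lambda x: survival(x) * g_prime(x), 0, np.inf)
print(rhs)  # ≈ 2.0, matching E[X^2]
```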
Example
The expectation of a non-negative variable $X$ is obtained by applying the formula to the identity function $g(x)=x$ for which $g^\prime(x)=1$ and utilizing the fact that the integration may begin at zero:
$$\mathbb{E}(X) = -x(1-F(x))\big|_{0}^\infty + \int_{0}^\infty (1-F(x))\,\text{d}x.$$
Provided $\lim_{x\to\infty} x (1-F(x)) = 0$ (that is, the survival function does not have an overly heavy tail), the upper limit of the first term vanishes. Its lower limit vanishes because $-x(1-F(x))$ equals $0$ at $x=0$. We are left only with the integral, giving the expression in the question.
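A one-line numerical check of this formula (my own sketch, using the Uniform$(0,1)$ distribution as a test case): the survival function is $1-x$ on $[0,1]$ and $0$ beyond, so the integral is $1/2$, the mean of the distribution.

```python
from scipy.integrate import quad

# Uniform(0,1): 1 - F(x) = 1 - x on [0, 1], zero afterwards
survival = lambda x: max(1.0 - x, 0.0)

# Integrating the survival function recovers E[X] = 1/2
mean, _ = quad(survival, 0, 1)
print(mean)  # ≈ 0.5
```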
Best Answer
I would like to add something to the answer by @Thomas Lumley.

One can derive the following:
$$\begin{align} E[\max(X,a)]&=P(X\geq a)\cdot E[\max(X,a)|X\geq a]+P(X<a)\cdot E[\max(X,a)|X<a]\\ &=P(X\geq a)\cdot E[X|X\geq a]+P(X<a)\cdot a\\ &=P(X\geq a)\cdot E[X|X\geq a]+(1-P(X\geq a))\cdot a\\ &=P(X\geq a)\cdot (E[X|X\geq a]-a)+a \end{align}$$
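The conditioning decomposition above holds exactly on any sample as well, which makes it easy to verify empirically. The sketch below (my own check, using a standard normal sample and $a=1$) computes $E[\max(X,a)]$ directly and via the last line of the derivation:

```python
import numpy as np

# Sample-level check of E[max(X, a)] = P(X >= a) * (E[X | X >= a] - a) + a
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
a = 1.0

direct = np.maximum(x, a).mean()                     # left-hand side
tail = x >= a
decomposed = tail.mean() * (x[tail].mean() - a) + a  # right-hand side

# The two agree up to floating-point rounding, since the identity
# is an exact rearrangement of the sample average
print(direct, decomposed)
```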
Combining this with the previous answer, we get:
$$\int_a^\infty(1-F(x))dx=E[\max(X,a)]-a=E[\max(X-a,0)]$$
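This identity can also be checked in closed form for $X \sim \mathrm{Exp}(1)$ (a test case I am adding for illustration): the left side is $\int_a^\infty e^{-x}\,dx = e^{-a}$, and by memorylessness $E[\max(X-a,0)] = P(X \geq a)\,E[X] = e^{-a}$ as well.

```python
import numpy as np
from scipy.integrate import quad

# Exp(1): 1 - F(x) = e^{-x}; the tail integral from a equals e^{-a},
# which is E[max(X - a, 0)] by memorylessness (= P(X >= a) * E[X])
a = 0.7
integral, _ = quad(lambda x: np.exp(-x), a, np.inf)
print(integral, np.exp(-a))  # both ≈ e^{-0.7}
```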
Edit: As @Ben added in his comment, it's worth noting that in the special case $a=0$, you recover the usual expected value rule for non-negative random variables:
$$\int_0^\infty(1-F(x))dx=E[\max(X,0)]=E[X]$$