As described below, this question is very closely related to the question Meaning of non-existence of expectation?.
In Lebesgue (but not Riemann) integration, there is a difference between existence of the integral and integrability.
A measurable function $f$ (taking values in $[-\infty,\infty]$, on some measure space with measure $\mu$) is said to be Lebesgue integrable if $\int {|f|d\mu } < \infty $. As usual, let $f^+$ and $f^-$ be the (nonnegative) functions defined by $f^ + (x) = \max \{ f(x),0\}$ and $f^ - (x) = -\min \{ f(x),0\}$, so that $|f|=f^+ + f^-$. Then, $f$ is integrable if and only if $\int {f^ + d\mu } < \infty$ and $\int {f^ - d\mu } < \infty$; in this case,
$$
\int {fd\mu } : = \int {f^ + d\mu } - \int {f^ - d\mu }
$$
(in particular, the integral of an integrable function is finite). However, $f$ need not be integrable in order that the integral $\int {fd\mu }$ be defined. Indeed, the definition of $\int {fd\mu }$ for nonnegative measurable $f$ allows it to take the value $\infty$.
With the conventions $\infty - c = \infty$ and $c - \infty = -\infty$ for finite $c$, the above definition of $\int {fd\mu }$ applies whenever at least one of $\int {f^ + d\mu }$ and $\int {f^ - d\mu }$ is finite, that is, whenever $\int {f^ + d\mu } \leq \infty$ and $\int {f^ - d\mu } < \infty$, or $\int {f^ + d\mu } < \infty$ and $\int {f^ - d\mu } \leq \infty$. The integral $\int {fd\mu }$ does not exist if and only if $\int {f^ + d\mu } = \infty$ and $\int {f^ - d\mu } = \infty$ (since $\infty - \infty$ is not defined).
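The case analysis above can be mirrored in a few lines of code. Here is a minimal sketch (the helper name `classify_integral` is my own; its two arguments stand for the values of $\int {f^ + d\mu }$ and $\int {f^ - d\mu }$, each a nonnegative float or `math.inf`):

```python
import math

def classify_integral(pos_part, neg_part):
    """Classify the Lebesgue integral of f from the values of the
    integrals of f+ and f- (each a nonnegative number or math.inf)."""
    if pos_part < math.inf and neg_part < math.inf:
        # both parts finite: f is integrable and the integral is finite
        return ("integrable", pos_part - neg_part)
    if pos_part == math.inf and neg_part == math.inf:
        # infinity minus infinity: the integral is undefined
        return ("undefined", None)
    # exactly one part is infinite: the integral exists but equals +/- infinity
    return ("exists, not integrable", pos_part - neg_part)

print(classify_integral(2.0, 0.5))           # finite integral
print(classify_integral(math.inf, 0.5))      # integral exists, equals +inf
print(classify_integral(math.inf, math.inf)) # undefined
```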
Examples: $\int_{(0,1]} {x^{ - 1} dx} = \infty$ (where $dx$ stands for Lebesgue measure), so the integral exists but $x^{-1}$ is not integrable on $(0,1]$. On the other hand, the integral $\int_{(0,1]} {\frac{{\sin (1/x)}}{x}dx}$ does not exist, since $\int_{(0,1]} {\big[\frac{{\sin (1/x)}}{x}\big]^ + }dx = \infty $ and $\int_{(0,1]} {\big[\frac{{\sin (1/x)}}{x}\big]^ - }dx = \infty $ (in particular, $\frac{{\sin (1/x)}}{x}$ is not integrable on $(0,1]$).
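Both examples can be checked numerically, at least in spirit. The sketch below uses a plain midpoint rule (all names are my own). For the first example, $\int_{(\varepsilon,1]} x^{-1}dx = \ln(1/\varepsilon)$ blows up as $\varepsilon \to 0$. For the second, the substitution $t=1/x$ turns $\int_{(\varepsilon,1]} \big[\frac{\sin(1/x)}{x}\big]^{\pm} dx$ into $\int_1^{1/\varepsilon} \frac{[\sin t]^{\pm}}{t}dt$, and both of these grow without bound (roughly like $\pi^{-1}\ln(1/\varepsilon)$):

```python
import math

def midpoint(f, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# 1/x on (eps, 1]: the integral equals ln(1/eps), unbounded as eps -> 0
for eps in (1e-1, 1e-2, 1e-3):
    print(eps, midpoint(lambda x: 1 / x, eps, 1.0))

# positive and negative parts of sin(1/x)/x on (1/T, 1], rewritten via
# t = 1/x as integrals of [sin t]^+/t and [sin t]^-/t over [1, T]:
# both keep growing as T increases
pos = lambda t: max(math.sin(t), 0.0) / t
neg = lambda t: max(-math.sin(t), 0.0) / t
for T in (10.0, 100.0, 1000.0):
    print(T, midpoint(pos, 1.0, T), midpoint(neg, 1.0, T))
```

Of course, a finite quadrature can only exhibit the growth trend, not the divergence itself.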
Relation to expectations of random variables.
Let $(\Omega,\mathcal{F},P)$ be a probability space, that is, a measure space with $P(\Omega)=1$.
A random variable is a measurable function $X:\Omega \to \mathbb{R}$. So, $X(\omega)$ ($\omega \in \Omega$) and $P$ play the same role as $f(x)$ and $\mu$ above, respectively. The expectation of $X$ is defined by
${\rm E}(X) = \int_\Omega {XdP} $, which is just a special case of $\int {fd\mu }$ above. Accordingly, $X$ is said to be integrable if ${\rm E}|X|:=\int_\Omega {|X|dP} < \infty$, and $X$ has an expectation if at least one of ${\rm E}(X^+) := \int_\Omega {X^ + dP}$ and ${\rm E}(X^-) := \int_\Omega {X^ - dP}$ is finite (in which case ${\rm E}(X)= {\rm E}(X^+) - {\rm E}(X^-)$); $X$ does not admit an expectation if and only if ${\rm E}(X^+) = \infty$ and ${\rm E}(X^-) = \infty$.
It is instructive to consider the above examples in the setting of expectations. Let the probability space $(\Omega,\mathcal{F},P)$ be defined by $\Omega = (0,1]$, $\mathcal{F} = \mathcal{B}((0,1])$, and $P$ Lebesgue measure on $(0,1]$ (thus $dP(\omega)=d\omega$). Note that the random variable $X$ defined by $X(\omega)=\omega$ is a uniform$(0,1]$ random variable. Define $X_1$ by $X_1 = 1/X$, that is $X_1 (\omega) = 1/\omega$. Then, ${\rm E}(X_1) =
\int_{(0,1]} {\frac{1}{\omega }d\omega } = \infty$; so, $X_1$ has (infinite) expectation but is not integrable. On the other hand, the random variable $X_2$ defined by $X_2 = \frac{{\sin (1/X)}}{X}$, that is $X_2 (\omega) = \frac{{\sin (1/\omega)}}{\omega}$, does not admit an expectation, since the corresponding integral, $\int_{(0,1]} {\frac{{\sin (1/\omega )}}{\omega }d\omega }$, does not exist (as noted above).
Finally, it is interesting to consider the above examples in connection with the strong law of large numbers (SLLN). Let $X_1^1,X_2^1,\ldots$ be a sequence of i.i.d. random variables distributed as $X_1$, and let $S_n^1 = X_1^1 + \cdots + X_n^1$. Then, almost surely, $n^{-1}S_n^1 \to {\rm E}(X_1) = \infty$ (this follows from the standard SLLN applied to the truncated variables $\min \{ X_i^1, M\}$, letting $M \to \infty$ by the monotone convergence theorem). So, the existence of ${\rm E}(X_1) :=
\int_{(0,1]} {\frac{1}{\omega }d\omega } = \infty$ agrees with the SLLN. As for the second example, let $X_1^2,X_2^2,\ldots$ be a sequence of i.i.d. random variables distributed as $X_2$, and let $S_n^2 = X_1^2 + \cdots + X_n^2$. Since $X_2$ does not admit an expectation, with $X_2^+$ and $X_2^-$ behaving the same, one may expect that, almost surely, $\limsup_n n^{ - 1} S_n^2 = \infty $ and $\liminf_n n^{ - 1} S_n^2 = -\infty$; see Theorem 1 in the paper The strong law of large numbers when the mean is undefined, by K. Bruce Erickson. Here, it is important to note that $\frac{{\sin (1/x)}}{x}$ is (improperly) Riemann integrable on $(0,1]$; so the SLLN behavior corroborates the Lebesgue notion of non-integrability rather than the finite improper Riemann integral.
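As a purely illustrative sanity check, one can simulate both sequences. The sketch below (variable names are my own) uses a fixed seed for reproducibility and draws $U$ uniform on $(0,1]$ via `1 - random()` to avoid a possible zero. The running mean for $X_1 = 1/U$ drifts upward (roughly like $\ln n$); for $X_2$, a single run can only show that the mean fails to settle — the full $\limsup/\liminf$ behavior is an almost-sure statement about the whole sequence, not visible in one finite run:

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

n = 200_000
u = [1.0 - random.random() for _ in range(n)]  # uniform(0,1] samples
x1 = [1.0 / v for v in u]                      # distributed as X_1 = 1/U
x2 = [math.sin(1.0 / v) / v for v in u]        # distributed as X_2

# running means S_m / m at a few checkpoints
for m in (10**2, 10**3, 10**4, 10**5):
    print(m, sum(x1[:m]) / m, sum(x2[:m]) / m)
```

Note that $1/U \geq 1$ pointwise, so the running mean for $X_1$ is at least $1$ deterministically, while its typical size grows with $n$.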
Best Answer
If $f$ is continuous, then $\int_a^b {f(x)dF(x)} = \int_a^b {f(x)d\mu (x)} $, for any $-\infty < a < b < \infty$, where $F$ and $\mu$ are the distribution function and probability distribution of $X$, respectively (they are related by $\mu((s,t])=F(t)-F(s)$, for any $-\infty < s < t < \infty$).
A drawback of the Riemann-Stieltjes integral is illustrated in the following simple example. Suppose that $X=0$ almost surely. Then, $\int {F(x)dF(x)} $ is not defined, since the integrand $F$ and the integrator $F$ share a jump discontinuity at $0$, whereas $\int {F(x)d\mu (x)} = F(0) = 1$. Of course, ${\rm E}[F(X)] = {\rm E}[F(0)]= 1$.
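The failure of the Riemann-Stieltjes integral here is easy to see with explicit tagged sums: only the cell containing the jump at $0$ contributes, and its contribution is $F(\text{tag})$, which is $0$ for a tag left of $0$ and $1$ for a tag at or right of $0$, no matter how fine the partition. A small sketch (function names are my own):

```python
def F(x):
    """Distribution function of X = 0 almost surely: a unit step at 0."""
    return 1.0 if x >= 0 else 0.0

def rs_sum(f, g, points, tags):
    """Riemann-Stieltjes sum: sum of f(tag_i) * (g(x_i) - g(x_{i-1}))."""
    return sum(f(t) * (g(b) - g(a))
               for a, b, t in zip(points, points[1:], tags))

pts = [-1.0, -0.5, 0.5, 1.0]                  # middle cell contains the jump at 0
left = rs_sum(F, F, pts, [-1.0, -0.5, 0.5])   # left-endpoint tags  -> 0.0
right = rs_sum(F, F, pts, [-0.5, 0.5, 1.0])   # right-endpoint tags -> 1.0
print(left, right)
```

Since the two tag choices give $0$ and $1$ on arbitrarily fine partitions, the Riemann-Stieltjes sums have no common limit, while the Lebesgue-Stieltjes integral is unambiguously $F(0)\,\mu(\{0\}) = 1$.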
Another (more significant) drawback is indicated in GWu's answer.