With the usual notation, decompose $X$ as $X = X^+ - X^-$ (and note that $|X| = X^+ + X^-$). $X$ is said to have finite expectation (or to be integrable) if both ${\rm E}(X^+)$ and ${\rm E}(X^-)$ are finite, in which case ${\rm E}(X) = {\rm E}(X^+) - {\rm E}(X^-)$. Moreover, if ${\rm E}(X^+) = +\infty$ and ${\rm E}(X^-) < \infty$, then ${\rm E}(X) = +\infty$; symmetrically, if ${\rm E}(X^-) = +\infty$ and ${\rm E}(X^+) < \infty$, then ${\rm E}(X) = -\infty$. So $X$ is allowed to have infinite expectation.
Whenever ${\rm E}(X)$ exists (finite or infinite), the strong law of large numbers holds. That is, if $X_1, X_2, \ldots$ is a sequence of i.i.d. random variables with finite or infinite expectation and $S_n = X_1 + \cdots + X_n$, then $n^{-1} S_n \to {\rm E}(X_1)$ almost surely. The infinite-expectation case follows from the finite case by truncation and the monotone convergence theorem.
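As a quick illustration (a simulation sketch, not part of the original argument): take $X = 1/U$ with $U \sim {\rm uniform}(0,1)$, so that ${\rm E}(X) = +\infty$; by the strong law above, the running means drift to infinity.

```python
import random

random.seed(0)

def sample_mean(n):
    # X = 1/U with U ~ uniform(0,1) has E(X) = int_0^1 (1/u) du = +infinity.
    # 1.0 - random.random() lies in (0, 1], which avoids division by zero.
    return sum(1.0 / (1.0 - random.random()) for _ in range(n)) / n

# By the SLLN with infinite mean, n^{-1} S_n -> +infinity almost surely,
# so the sample mean is eventually large (it grows slowly, roughly like log n).
for n in (10**2, 10**4, 10**6):
    print(n, sample_mean(n))
```

Each summand exceeds $1$, so the sample mean always does too; the interesting behavior is its unbounded (if slow) growth as $n$ increases.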
If, on the other hand, ${\rm E}(X^+) = +\infty $ and ${\rm E}(X^-) = +\infty $, then $X$ does not admit an expectation.
In this case, exactly one of the following must occur (a result due to Kesten; see Theorem 1 of the paper "The strong law of large numbers when the mean is undefined" by K. Bruce Erickson):
1) Almost surely, $n^{-1}S_n \to +\infty$;

2) Almost surely, $n^{-1}S_n \to -\infty$;

3) Almost surely, $\limsup_{n \to \infty} n^{-1} S_n = +\infty$ and $\liminf_{n \to \infty} n^{-1} S_n = -\infty$.
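For instance (a hypothetical simulation, not from the original answer): standard Cauchy increments fall under case 3, so the running means $n^{-1}S_n$ keep oscillating instead of settling down.

```python
import math
import random

random.seed(1)

def cauchy():
    # Standard Cauchy via inverse-CDF sampling: F^{-1}(u) = tan(pi * (u - 1/2))
    return math.tan(math.pi * (random.random() - 0.5))

# For a Cauchy variable E(X^+) = E(X^-) = +infinity, so case 3 of the
# trichotomy applies: limsup n^{-1} S_n = +inf and liminf n^{-1} S_n = -inf a.s.
s = 0.0
running_means = []
for n in range(1, 100001):
    s += cauchy()
    running_means.append(s / n)

# The spread between the smallest and largest running mean stays large,
# reflecting the almost-sure oscillation between -infinity and +infinity.
print(min(running_means), max(running_means))
```

In a plot of $n^{-1}S_n$ against $n$, occasional huge Cauchy samples repeatedly yank the running mean away from wherever it seemed to be heading.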
EDIT: Since you mentioned the recent post "Are there any random variables so that ${\rm E}[X]$ and ${\rm E}[Y]$ exist but ${\rm E}[XY]$ doesn't?", it is worth stressing the difference between "$X$ has expectation" and "$X$ is integrable".
By definition, $X$ is integrable if $|X|$ has finite expectation (recall that $|X|=X^+ + X^-$). So, for example, the random variable $X=1/U$, where $U \sim {\rm uniform}(0,1)$, is not integrable, yet has (infinite) expectation (indeed, $\int_0^1 {x^{ - 1} \,{\rm d}x} = \infty $). Further, it is worth noting the following. A random variable $X$ is integrable (i.e., ${\rm E}|X|<\infty$) if and only if
$$
\int_\Omega {|X|\,{\rm dP}} = \int_{ - \infty }^\infty {|x|\,{\rm d}F(x)} < \infty .
$$
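The divergence claimed for the $X = 1/U$ example above can be checked numerically: the truncated integrals $\int_\varepsilon^1 x^{-1}\,{\rm d}x = -\log \varepsilon$ blow up as $\varepsilon \downarrow 0$ (a small illustrative check, not part of the original answer).

```python
import math

# Truncated version of int_0^1 x^{-1} dx: int_eps^1 (1/x) dx = -log(eps)
def truncated_integral(eps):
    return -math.log(eps)

# The truncations grow without bound as eps -> 0+, so E(1/U) = +infinity.
for eps in (1e-2, 1e-6, 1e-12):
    print(eps, truncated_integral(eps))
```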
A random variable has expectation if and only if
$$
\int_\Omega {X^ + \,{\rm dP}} = \int_{ - \infty }^\infty {\max \{ x,0\} \,{\rm d}F(x)} = \int_0^\infty {x\,{\rm d}F(x)} < \infty
$$
or
$$
\int_\Omega {X^ - \,{\rm dP}} = \int_{ - \infty }^\infty {-\min \{ x,0\} \,{\rm d}F(x)} = \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} < \infty.
$$
In any of these cases, the expectation of $X$ is given by
$$
{\rm E}(X) = \int_0^\infty {x\,{\rm d}F(x)} - \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} \in [-\infty,\infty].
$$
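As a concrete sanity check of this formula (an illustrative example not in the original text): for $X \sim {\rm uniform}(-1,2)$, with density $1/3$ on $(-1,2)$, the two integrals are $2/3$ and $1/6$, giving ${\rm E}(X) = 1/2$, in agreement with the midpoint of the interval.

```python
from fractions import Fraction

# X ~ uniform(-1, 2), density f(x) = 1/3 on (-1, 2):
# positive part: int_0^2 x * (1/3) dx = (1/3) * 2^2 / 2 = 2/3
# negative part: int_{-1}^0 |x| * (1/3) dx = (1/3) * 1^2 / 2 = 1/6
pos_part = Fraction(1, 3) * Fraction(2 * 2, 2)
neg_part = Fraction(1, 3) * Fraction(1 * 1, 2)
mean = pos_part - neg_part
print(mean)  # E(X) = 2/3 - 1/6 = 1/2, matching (-1 + 2)/2
```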
Finally, $X$ does not admit an expectation if and only if both $\int_\Omega {X^ + \,{\rm dP}} = \int_0^\infty {x\,{\rm d}F(x)}$ and $\int_\Omega {X^ - \,{\rm dP}} = \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} $ are infinite. Thus, for example, a Cauchy random variable with density function $f(x) = \frac{1}{{\pi (1 + x^2 )}}$, $x \in \mathbb{R}$, though symmetric, does not admit an expectation, since both $\int_0^\infty {xf(x)\,{\rm d}x}$ and $\int_{ - \infty }^0 {|x|f(x)\,{\rm d}x}$ are infinite.
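The Cauchy computation can be made explicit (a numerical sketch, not part of the original answer): the truncated tail integral has the closed form $\int_0^M x f(x)\,{\rm d}x = \log(1+M^2)/(2\pi)$, which grows without bound as $M \to \infty$.

```python
import math

# Closed form of the truncated Cauchy tail integral:
# int_0^M x / (pi * (1 + x^2)) dx = log(1 + M^2) / (2 * pi)
def truncated_tail(M):
    return math.log(1 + M * M) / (2 * math.pi)

# By symmetry the left tail int_{-M}^0 |x| f(x) dx is identical, and both
# diverge as M -> infinity, so neither E(X^+) nor E(X^-) is finite.
for M in (10.0, 1e4, 1e8):
    print(M, truncated_tail(M))
```

So symmetry alone does not rescue the Cauchy distribution: both one-sided integrals are infinite, and the expectation is undefined rather than zero.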
Best Answer
A random variable $X$ is integrable if and only if $$\mathbb E[|X|] = \int_\Omega |X(\omega)|\,\mathsf d\mathbb P(\omega) < \infty. $$ If $X\geqslant 0$ almost surely, i.e. $\mathbb P(X<0) = 0$, then this is equivalent to $\mathbb E[X]<\infty$, so in this case "integrable" just means that $X$ has a finite mean. However, the inequality would still hold even if $\mathbb E[X]=\infty$, since $\mathbb P(X\leqslant a)\leqslant 1 < \infty$ in any case.