With the usual notation, decompose $X$ as $X=X^+ - X^-$ (note also that $|X|=X^+ + X^-$). $X$ is said to have finite expectation (or to be integrable) if both ${\rm E}(X^+)$ and ${\rm E}(X^-)$ are finite. In this case ${\rm E}(X) = {\rm E}(X^+) - {\rm E}(X^-)$. Moreover, if ${\rm E}(X^+) = +\infty$ (respectively, ${\rm E}(X^-) = +\infty$) and ${\rm E}(X^-)<\infty$ (respectively, ${\rm E}(X^+)<\infty$), then ${\rm E}(X) = +\infty$ (respectively, ${\rm E}(X) = -\infty$). So, $X$ is allowed to have infinite expectation.
Whenever ${\rm E}(X)$ exists (finite or infinite), the strong law of large numbers holds. That is, if $X_1,X_2,\ldots$ is a sequence of i.i.d. random variables with finite or infinite expectation, and $S_n = X_1+\cdots + X_n$, then $n^{-1}S_n \to {\rm E}(X_1)$ almost surely. The infinite-expectation case follows from the finite case by truncation and the monotone convergence theorem.
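To see the infinite-expectation case numerically, here is a small sketch (the choice $X = 1/U$ with $U \sim {\rm uniform}(0,1)$, the seed, and the sample sizes are illustrative assumptions, not part of the argument above):

```python
import numpy as np

# Sketch: X = 1/U with U ~ Uniform(0,1) has E(X) = int_0^1 (1/u) du = +inf,
# so by the extended SLLN the running means S_n / n -> +inf almost surely.
rng = np.random.default_rng(0)          # fixed seed, arbitrary choice
u = rng.uniform(size=10**5)
x = 1.0 / u                             # i.i.d. samples of X = 1/U (each > 1)
means = [x[:n].mean() for n in (10**3, 10**4, 10**5)]
print(means)                            # the running means tend to drift upward
```

For this distribution the truncated mean grows like $\log n$, so the drift to $+\infty$ is slow but visible already at these sample sizes.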
If, on the other hand, ${\rm E}(X^+) = +\infty $ and ${\rm E}(X^-) = +\infty $, then $X$ does not admit an expectation.
In this case, exactly one of the following must occur (a result of Kesten; see Theorem 1 of the paper "The strong law of large numbers when the mean is undefined" by K. Bruce Erickson):
1) Almost surely, $n^{-1}S_n \to +\infty$;

2) Almost surely, $n^{-1}S_n \to -\infty$;

3) Almost surely, $\limsup_n n^{-1} S_n = + \infty$ and $\liminf_n n^{-1} S_n = - \infty$.
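Case 3) can be illustrated with a short simulation (the standard Cauchy distribution, for which ${\rm E}(X^+) = {\rm E}(X^-) = +\infty$ and case 3 holds by symmetry, and the seed are assumptions made here for illustration):

```python
import numpy as np

# Sketch: for standard Cauchy samples the mean is undefined, and by symmetry
# case 3 of the trichotomy holds: limsup S_n/n = +inf, liminf S_n/n = -inf a.s.
rng = np.random.default_rng(1)                      # fixed seed, arbitrary
x = rng.standard_cauchy(size=10**5)
running = np.cumsum(x) / np.arange(1, x.size + 1)   # the sequence S_n / n
print(running.max(), running.min())                 # extremes of the running means
```

Over longer and longer runs the running means keep taking values of both signs and arbitrarily large magnitude; a finite simulation can only hint at this, since the extremes grow without bound only in the limit.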
EDIT: Since you mentioned the recent post "Are there any random variables so that ${\rm E}[X]$ and ${\rm E}[Y]$ exist but ${\rm E}[XY]$ doesn't?", it is worth stressing the difference between "$X$ has expectation" and "$X$ is integrable".
By definition, $X$ is integrable if $|X|$ has finite expectation (recall that $|X|=X^+ + X^-$). So, for example, the random variable $X=1/U$, where $U \sim {\rm uniform}(0,1)$, is not integrable, yet has (infinite) expectation (indeed, $\int_0^1 {x^{ - 1} \,{\rm d}x} = \infty $). Further, it is worth noting the following. A random variable $X$ is integrable (i.e., ${\rm E}|X|<\infty$) if and only if
$$
\int_\Omega {|X|\,{\rm dP}} = \int_{ - \infty }^\infty {|x|\,{\rm d}F(x)} < \infty .
$$
A random variable has expectation if and only if
$$
\int_\Omega {X^ + \,{\rm dP}} = \int_{ - \infty }^\infty {\max \{ x,0\} \,{\rm d}F(x)} = \int_0^\infty {x\,{\rm d}F(x)} < \infty
$$
or
$$
\int_\Omega {X^ - \,{\rm dP}} = \int_{ - \infty }^\infty {-\min \{ x,0\} \,{\rm d}F(x)} = \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} < \infty.
$$
In any of these cases, the expectation of $X$ is given by
$$
{\rm E}(X) = \int_0^\infty {x\,{\rm d}F(x)} - \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} \in [-\infty,\infty].
$$
Finally, $X$ does not admit an expectation if and only if both $\int_\Omega {X^ + \,{\rm dP}} = \int_0^\infty {x\,{\rm d}F(x)}$ and $\int_\Omega {X^ - \,{\rm dP}} = \int_{ - \infty }^0 {|x|\,{\rm d}F(x)} $ are infinite. Thus, for example, a Cauchy random variable with density function $f(x) = \frac{1}{{\pi (1 + x^2 )}}$, $x \in \mathbb{R}$, though symmetric, does not admit an expectation, since both $\int_0^\infty {xf(x)\,{\rm d}x}$ and $\int_{ - \infty }^0 {|x|f(x)\,{\rm d}x}$ are infinite.
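The divergence of both tail integrals for the Cauchy density can be checked directly, since the truncated integral has a closed form (standard calculus, not from the text above):

```python
import math

# Sketch: for the Cauchy density f(x) = 1/(pi (1 + x^2)), the truncated
# tail integral int_0^A x f(x) dx equals log(1 + A^2) / (2 pi), which grows
# without bound as A -> inf; by symmetry the negative tail diverges too,
# so the Cauchy law admits no expectation.
def truncated_tail(A):
    return math.log(1 + A**2) / (2 * math.pi)

for A in (10.0, 10.0**3, 10.0**6):
    print(A, truncated_tail(A))       # grows roughly like log(A) / pi
```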
The example you give is actually a $\sigma$-finite Borel measure. Equip $[0,1]$ with the cofinite topology (in which a set is open iff it is either empty or its complement is finite). Then your $\Sigma$ is the Borel $\sigma$-algebra of the cofinite topology (it is a nice exercise to verify this).
However, there is the following result:
Proposition. Let $(X,d)$ be a separable metric space, $\Sigma$ its Borel $\sigma$-algebra, and $\mu$ a $\sigma$-finite measure on $\Sigma$. Then each atom of $\mu$ is the union of a point mass and a null set.
Proof. Let $A$ be an atom of $\mu$ and let $C$ be a countable dense subset of $X$. For each integer $k\in\mathbb{N}$, we have $\bigcup_{x \in C} B(x, 1/k) = X$, hence $\bigcup_{x \in C} (A \cap B(x,1/k)) = A$. Since $\mu(A)>0$, by countable subadditivity there exists $x_k \in C$ such that $\mu(A \cap B(x_k, 1/k)) > 0$. Since $A$ is an atom, $\mu(A \setminus B(x_k, 1/k)) = 0$. Let $S = \bigcap_k B(x_k, 1/k)$.
Since $S$ is contained in a ball of radius $1/k$ for every $k$, it contains at most one point.
On the other hand, by De Morgan's law and countable subadditivity,
$$\mu(A \setminus S) = \mu\left(\bigcup_k A \setminus B(x_k, 1/k)\right) = 0.$$
Since $\mu(A\cap S)=\mu(A) > 0$, $A\cap S$ is not empty, so $A\cap S$ is a singleton.
Hence $A\cap S$ is a point mass and $A \setminus S$ a null set. $\Box$
So in this case, effectively the only atoms are point masses.
Note that we did not need to assume $X$ was complete.
For non-separable metric spaces, things are harder. For uncountable discrete spaces (which are certainly metric), the question of whether there can be nontrivial atoms is related to whether the cardinality of $X$ is a measurable cardinal, and such questions tend to be independent of the axioms of ZFC. I asked a new question about it: Consistency strength of 0-1 valued Borel measures.
Let $(\Omega,\Sigma)$ be a measurable space. An atom of $\Sigma$ is a set $B\in\Sigma$ such that for all $A\in\Sigma$ with $A\subseteq B$, either $A=\emptyset$ or $A=B$. A measurable space is atomic if every element of $\Omega$ lies in some atom. The $\sigma$-algebra $\Sigma$ is countably generated if there is a countable family of measurable sets such that $\Sigma$ is the smallest $\sigma$-algebra containing all of them. For example, $(\mathbb{R},\mathcal{B})$ is countably generated, since $\mathcal{B}$ is generated by the open intervals with rational endpoints. The atoms of $\mathcal{B}$ are the singletons.
Proposition: If $\Sigma$ is countably generated, then $(\Omega,\Sigma)$ is atomic.
Proof: If a countable family generates $\Sigma$, then so does a countable family closed under complementation. If $\mathcal{C}$ is such a family, then for each point of $\Omega$ the intersection of all elements of $\mathcal{C}$ containing it is an atom of $\Sigma$, and every atom of $\Sigma$ arises this way.
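The proof idea can be made concrete in a finite toy version (the ground set, the generating family, and all names below are assumptions chosen purely for illustration, not from the text):

```python
# Sketch: on a finite set, close a generating family under complements;
# the atom containing a point is the intersection of all family members
# that contain it, and these atoms partition the ground set.
omega = frozenset(range(6))
generators = [frozenset({0, 1, 2}), frozenset({2, 3})]
family = generators + [omega - g for g in generators]  # close under complementation

def atom(x):
    a = set(omega)
    for c in family:
        if x in c:
            a &= c
    return frozenset(a)

atoms = {atom(x) for x in omega}
print(sorted(sorted(a) for a in atoms))   # -> [[0, 1], [2], [3], [4, 5]]
```

Here the atoms $\{0,1\}$, $\{2\}$, $\{3\}$, $\{4,5\}$ are exactly the nonempty cells of the partition generated by the two sets, matching the proposition in this finite case.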
Now if $(\Omega,\Sigma,\mu)$ is a probability space, we call $B\in\Sigma$ a $\mu$-atom if $\mu(B)>0$ and for all $A\in\Sigma$ such that $A\subseteq B$, either $\mu(A)=0$ or $\mu(A)=\mu(B)$. The probability space is atomless if it contains no $\mu$-atom.
Lemma: If $(\Omega,\Sigma,\mu)$ is a probability space such that $\Sigma$ is countably generated and $\mu$ takes on only the values $0$ and $1$, then there exists an atom $A\in\Sigma$ such that $\mu(A)=1$.
Proof: Let $\mathcal{C}$ be a countable family, closed under complementation, that generates $\Sigma$. For each element of $\mathcal{C}$, either the element itself or its complement has probability $1$. The intersection of all elements of $\mathcal{C}$ with probability $1$ is a countable intersection, hence measurable, has probability $1$, and is an atom of $\Sigma$.
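The lemma's proof also has a finite toy version (again, the ground set, generators, and the particular $0$-$1$ valued measure, a point mass at $2$, are illustrative assumptions):

```python
# Sketch: with a 0-1 valued measure, intersecting every member of a
# complement-closed generating family that has measure 1 yields an atom
# of measure 1. Here mu is a point mass at 2 (chosen for illustration).
omega = frozenset(range(6))
generators = [frozenset({0, 1, 2}), frozenset({2, 3})]
family = generators + [omega - g for g in generators]

def mu(s):                       # a 0-1 valued measure: point mass at 2
    return 1 if 2 in s else 0

atom_of_measure_one = omega
for c in family:
    if mu(c) == 1:
        atom_of_measure_one &= c
print(sorted(atom_of_measure_one))   # -> [2]
```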
Proposition: If $(\Omega,\Sigma,\mu)$ is a probability space with $\Sigma$ countably generated, then it is atomless if and only if every atom in $\Sigma$ has probability $0$.
Proof: Clearly, in an atomless probability space, every atom must have probability $0$. Suppose now that $A$ is a $\mu$-atom. Let $A\cap\Sigma=\{A\cap S:S\in\Sigma\}$ be the trace $\sigma$-algebra; it is countably generated too. Then $(A,A\cap\Sigma,(1/\mu(A))\,\mu)$ is a probability space whose measure takes on only the values $0$ and $1$. So by the lemma, there is an atom $B$ such that $(1/\mu(A))\,\mu(B)=1$. But $B$ is also an atom of $\Sigma$, and $\mu(B)=\mu(A)>0$.
So it follows that a probability measure on $(\mathbb{R},\mathcal{B})$ is atomless if and only if it assigns probability $0$ to every singleton, which justifies the definition in Kai Lai Chung's book.
Finally, here is an example of a probability space in which each atom has probability $0$ but which is not atomless. Let $\Omega$ be any uncountable set, and let $\Sigma$ consist of those subsets of $\Omega$ that are either countable or have a countable complement. Let $\mu(A)=0$ if $A$ is countable and $\mu(A)=1$ if its complement is countable. Every set with countable complement is a $\mu$-atom, but the atoms of $\Sigma$ are the singletons, which all have probability $0$. Note that $\Sigma$ is not countably generated.