This may be better appreciated by expressing the result of CLT in terms of sums of iid random variables. We have
$$\sqrt{n} \frac{ \bar{X} -\mu}{\sigma} \sim N(0, 1) \quad \text{asymptotically}$$
Multiply the quotient by $\frac{\sigma}{\sqrt{n}}$ and use the fact that $Var(cX) = c^2 Var(X)$ to get
$$\bar{X}-\mu \sim N\left(0, \frac{\sigma^2}{n} \right)$$
Now add $\mu$ to both sides and use the fact that $\mathbb{E} \left[a X+\mu\right] = a \mathbb{E}[X] + \mu$ to obtain
$$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i \sim N\left(\mu, \frac{\sigma^2}{n} \right)$$
Lastly, multiply by $n$ and use the above two results to see that
$$\sum_{i=1}^n X_i \sim N \left(n \mu, n\sigma^2 \right) $$
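The chain of results above is easy to check by simulation. A minimal sketch (my own choice of distribution, not from Wooldridge): with $X_i \sim \text{Exponential}(1)$ we have $\mu = 1$ and $\sigma^2 = 1$, so sums of $n$ draws should have mean and variance close to $n$.

```python
import numpy as np

# Sanity check of sum_{i=1}^n X_i ~ N(n*mu, n*sigma^2), approximately.
# Exponential(1) is a deliberately skewed choice: mu = 1, sigma^2 = 1.
rng = np.random.default_rng(0)
n, reps = 500, 20_000

# Each row is one realization of the sum of n iid draws.
sums = rng.exponential(scale=1.0, size=(reps, n)).sum(axis=1)

print(sums.mean())  # close to n*mu     = 500
print(sums.var())   # close to n*sigma^2 = 500
```

A histogram of `sums` would also look bell-shaped despite the strong skew of each individual $X_i$, which is the whole point of the CLT.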
And what does this have to do with Wooldridge's statement? Well, if the error is the sum of many iid random variables then it will be approximately normally distributed, as just seen. But there is an issue here, namely that the unobserved factors will not necessarily be identically distributed and they might not even be independent!
Nevertheless, the CLT has been successfully extended to independent, non-identically distributed random variables and even to cases of mild dependence, under some additional regularity conditions. These are essentially conditions guaranteeing that no single term in the sum exerts a disproportionate influence on the asymptotic distribution; see also the Wikipedia page on the CLT. You do not need to know these results, of course; Wooldridge's aim is merely to provide intuition.
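To illustrate the independent-but-not-identically-distributed case, here is a hedged sketch (the choice of uniform terms with varying spreads is mine): each $X_i \sim \text{Uniform}(-c_i, c_i)$ has a different variance, yet because no term dominates, the standardized sum is still approximately standard normal.

```python
import math
import numpy as np

# Independent but NOT identically distributed terms:
# X_i ~ Uniform(-c_i, c_i) with varying, comparable widths c_i.
rng = np.random.default_rng(1)
n, reps = 400, 20_000
c = 1.0 + 0.5 * np.sin(np.arange(n))       # varying spreads, none dominant
var_i = c**2 / 3.0                         # Var of Uniform(-c, c) is c^2/3

x = rng.uniform(-c, c, size=(reps, n))     # column i uses its own c_i
z = x.sum(axis=1) / math.sqrt(var_i.sum()) # standardized sum

# Standard normal CDF via the error function.
phi = lambda a: 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))
for a in (-1.0, 0.0, 1.0):
    print(a, (z <= a).mean(), phi(a))      # empirical vs limiting CDF
```

The empirical CDF of `z` tracks $\Phi$ closely even though the terms are not identically distributed, consistent with the extended (Lindeberg-type) versions of the theorem mentioned above.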
Hope this helps.
Best Answer
In its standard simplest form, the Central Limit Theorem (CLT) is a statement about the cumulative distribution function of the random variable $$Z_n = \frac{X_1 + X_2 + \cdots + X_n -n\mu}{\sigma \sqrt{n}}$$ where the $X_i$ are independent identically distributed random variables with mean $\mu$ and standard deviation $\sigma$. The CLT asserts that for each $a$, $-\infty < a < \infty$, $$F_{Z_n}(a) = P\left\{\frac{X_1 + X_2 + \cdots + X_n -n\mu}{\sigma \sqrt{n}} \leq a \right\} \to \Phi(a) = \int_{-\infty}^a \frac{e^{-x^2/2}}{\sqrt{2\pi}}\mathrm dx$$ as $n \to \infty$.
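This convergence of distribution functions can be seen numerically. A small sketch (my own setup, with $X_i \sim \text{Exponential}(1)$ so that $\mu = \sigma = 1$): estimate $F_{Z_n}(a)$ by simulation and watch it approach $\Phi(a)$ as $n$ grows.

```python
import math
import numpy as np

# Estimate F_{Z_n}(a) = P(Z_n <= a) by Monte Carlo and compare with Phi(a).
# X_i ~ Exponential(1), so mu = 1 and sigma = 1.
rng = np.random.default_rng(2)
mu, sigma = 1.0, 1.0
reps = 20_000

def F_Zn(a, n):
    s = rng.exponential(1.0, size=(reps, n)).sum(axis=1)
    zn = (s - n * mu) / (sigma * math.sqrt(n))
    return (zn <= a).mean()

# Standard normal CDF via the error function.
phi = lambda a: 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))
for n in (5, 50, 400):
    print(n, F_Zn(1.0, n), phi(1.0))  # the gap shrinks as n grows
```

Note that the convergence is for each fixed $a$, which is exactly what the pointwise statement $F_{Z_n}(a) \to \Phi(a)$ asserts.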
If by "error distribution" you mean the distribution function of $$Y_n = \left(\frac{1}{n}\sum_{i=1}^n X_i\right) -\mu = \frac{\sigma}{\sqrt{n}}Z_n,$$ that is, the difference between the sample mean $\bar{X} = n^{-1}\sum_iX_i$ and the population mean $\mu$, then the CLT certainly does not imply that $F_{Y_n}(\cdot)$ "approaches normality" as the sample size $n$ grows large, at least not in the usual sense of normality. (Nitpickers may want to claim that the distribution is approaching a normal distribution with mean $0$ and standard deviation $0$, often called a constant by statistically illiterate people.)
On the other hand, the mean of the sample error is not a random variable but a constant (in fact, $0$, since the sample mean is an unbiased estimator of the population mean) and does not need to approach $0$; it is already there! I think what you meant to say is that the distribution $F_{Y_n}(a)$ of the sample error approaches the unit step function: $$F_{Y_n}(a) \to u(a) = \begin{cases}1, & \text{if}~a > 0,\\ 0, &\text{if}~a < 0,\end{cases}$$ which is certainly correct and follows from the CLT, but it also follows from results such as the weak law of large numbers, which makes no assertions about the distribution of $Z_n$, only about $Y_n$.
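The degenerate limit is also easy to see numerically. A brief sketch (again using $X_i \sim \text{Exponential}(1)$, my own choice): $Y_n = \bar{X} - \mu$ collapses onto $0$, so for any fixed $a > 0$ the value $F_{Y_n}(a)$ climbs toward $u(a) = 1$.

```python
import numpy as np

# Y_n = sample mean - mu concentrates at 0, so F_{Y_n}(a) -> u(a).
# X_i ~ Exponential(1), hence mu = 1.  Fix a small positive a.
rng = np.random.default_rng(3)
reps = 10_000
a = 0.05

for n in (10, 100, 1000):
    y = rng.exponential(1.0, size=(reps, n)).mean(axis=1) - 1.0
    print(n, (y <= a).mean())  # F_{Y_n}(0.05), climbing toward u(0.05) = 1
```

The same run with $a < 0$ would show $F_{Y_n}(a)$ sinking toward $0$, which is exactly the unit-step limit the weak law of large numbers delivers.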