The OP says
The central limit theorem states that the mean of i.i.d. variables, as N goes to infinity, becomes normally distributed.
I will take this to mean that it is the OP's belief that for i.i.d. random
variables $X_i$ with mean $\mu$ and standard deviation $\sigma$, the
cumulative distribution function $F_{Z_n}(a)$ of
$$Z_n = \frac{1}{n} \sum_{i=1}^n X_i$$
converges to the cumulative distribution function of $\mathcal N(\mu,\sigma)$,
a normal random
variable with mean $\mu$ and standard deviation $\sigma$. Or perhaps the
OP believes a minor re-arrangement of this statement, e.g. that the distribution
of $Z_n - \mu$ converges to the distribution of $\mathcal N(0,\sigma)$,
or that the distribution of $(Z_n - \mu)/\sigma$ converges to the distribution of $\mathcal N(0,1)$, the standard normal random variable. Note as an example
that these statements imply that
$$P\{|Z_n - \mu| > \sigma\} = 1 - F_{Z_n}(\mu + \sigma) + F_{Z_n}((\mu - \sigma)^-) \to 1-\Phi(1)+\Phi(-1) \approx 0.32$$
as $n \to \infty$.
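The limiting constant $1-\Phi(1)+\Phi(-1)$ is easy to check numerically; here is a quick sketch using only the standard library, computing $\Phi$ from `math.erf`:

```python
import math

def Phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# The limiting probability under the OP's reading of the CLT
p = 1.0 - Phi(1.0) + Phi(-1.0)
print(round(p, 4))  # -> 0.3173
```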
The OP goes on to say
This raises two questions:
- Can we deduce from this the law of large numbers? If the law of large numbers says that the mean of a sample of a random variable's values equals the true mean μ as N goes to infinity, then it seems even stronger to say (as the central limit theorem does) that the value becomes N(μ,σ), where σ is the standard deviation.
The weak law of large numbers says that for i.i.d. random variables $X_i$
with finite mean $\mu$, given any $\epsilon > 0$,
$$P\{|Z_n - \mu| > \epsilon\} \to 0 ~~ \text{as}~ n \to \infty.$$
Note that it is not necessary to assume that the standard deviation is
finite.
So, to answer the OP's question,
The central limit theorem as stated by the OP does not imply
the weak law of large numbers. As $n \to \infty$, the OP's
version of the central limit theorem says that
$P\{|Z_n-\mu| > \sigma\} \to 0.317\cdots$, while
the weak law says that $P\{|Z_n-\mu| > \sigma\} \to 0$.
From a correct statement of the central limit theorem, one can
at best deduce only a restricted form of the weak law of large numbers
applying to random variables with finite mean and standard
deviation. But the weak law of large numbers also holds for random
variables, such as certain Pareto random variables, with finite mean but
infinite standard deviation.
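This last point can be illustrated by simulation. The sketch below uses a Pareto distribution with shape $\alpha = 1.5$ (chosen here for illustration), which has finite mean $\alpha/(\alpha-1) = 3$ but infinite variance; the fraction of sample means far from $\mu$ still shrinks as $n$ grows, exactly as the weak law promises:

```python
import random

random.seed(0)
alpha = 1.5                    # Pareto shape: finite mean, infinite variance
mu = alpha / (alpha - 1)       # true mean = 3

def sample_mean(n):
    """Mean of n i.i.d. Pareto(alpha) draws."""
    return sum(random.paretovariate(alpha) for _ in range(n)) / n

# Fraction of sample means farther than eps from mu, for growing n
eps = 0.5
miss_rate = {}
for n in (100, 10_000):
    miss_rate[n] = sum(abs(sample_mean(n) - mu) > eps for _ in range(200)) / 200
print(miss_rate)
```

Despite the infinite variance, the miss rate at $n = 10{,}000$ is far smaller than at $n = 100$.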
I do not understand why saying that the sample mean converges
to a normal random variable with nonzero standard deviation is
a stronger statement than saying that the sample mean converges
to the population mean, which is a constant (or a random variable
with zero standard deviation if you like).
This may be better appreciated by expressing the result of the CLT in terms of sums of i.i.d. random variables. We have
$$\sqrt{n} \frac{ \bar{X} -\mu}{\sigma} \sim N(0, 1) \quad \text{asymptotically}$$
Multiply the quotient by $\frac{\sigma}{\sqrt{n}}$ and use the fact that $Var(cX) = c^2 Var(X)$ to get
$$\bar{X}-\mu \sim N\left(0, \frac{\sigma^2}{n} \right)$$
Now add $\mu$ to both sides and use the fact that $\mathbb{E} \left[a X+\mu\right] = a \mathbb{E}[X] + \mu$ to obtain
$$\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i \sim N\left(\mu, \frac{\sigma^2}{n} \right)$$
Lastly, multiply by $n$ and use the above two results to see that
$$\sum_{i=1}^n X_i \sim N \left(n \mu, n\sigma^2 \right) $$
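The final approximation $\sum_{i=1}^n X_i \sim N(n\mu, n\sigma^2)$ can be checked by Monte Carlo. This is a sketch with $X_i \sim \mathrm{Exp}(1)$ (chosen for illustration, so $\mu = \sigma^2 = 1$): the empirical mean and variance of the sums land near $n\mu$ and $n\sigma^2$, and about 68% of the sums fall within one standard deviation of the mean, as a normal would:

```python
import random, statistics

random.seed(1)
n, reps = 1_000, 2_000
mu, sigma2 = 1.0, 1.0            # Exp(1) has mean 1 and variance 1

# Monte Carlo: distribution of S_n = sum of n i.i.d. Exp(1) draws
sums = [sum(random.expovariate(1.0) for _ in range(n)) for _ in range(reps)]

mean_s = statistics.mean(sums)       # close to n * mu = 1000
var_s = statistics.variance(sums)    # close to n * sigma2 = 1000
# For a normal, about 68% of draws fall within one sd of the mean
within = sum(abs(s - n * mu) <= (n * sigma2) ** 0.5 for s in sums) / reps
print(mean_s, var_s, within)
```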
And what does this have to do with Wooldridge's statement? Well, if the error is the sum of many i.i.d. random variables, then it will be approximately normally distributed, as just seen. But there is an issue here, namely that the unobserved factors will not necessarily be identically distributed, and they might not even be independent!
Nevertheless, the CLT has been successfully extended to independent, non-identically distributed random variables, and even to cases of mild dependence, under some additional regularity conditions. These are essentially conditions guaranteeing that no single term in the sum exerts a disproportionate influence on the asymptotic distribution; see also the Wikipedia page on the CLT. You do not need to know these results, of course; Wooldridge's aim is merely to provide intuition.
Hope this helps.
Best Answer
You can also use the CLT directly; one form of the CLT states:
$$\frac{\sum_{i=1}^n X_i - n\mu}{\sigma\sqrt{n}} \sim N(0,1) \implies \sum_{i=1}^n X_i \sim N(n\mu, n\sigma^2)$$
The equations above involve two results: the first is one form of the CLT;
the second is a property of the multivariate normal distribution, which also applies to a 1-dimensional random vector.
For your case:
$$\sum_{i=1}^k Y_i \sim N\left(\frac{k}{\pi}, \, k\,\frac{1-\pi}{\pi^2}\right)$$
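The mean $1/\pi$ and variance $(1-\pi)/\pi^2$ match a geometric distribution counting trials up to the first success, so the approximation can be sanity-checked by simulation. A sketch, with the illustrative values $\pi = 0.3$ and $k = 500$ (both hypothetical, since the question does not fix them), drawing geometric variates by inverse transform:

```python
import math, random

random.seed(2)
pi_ = 0.3        # hypothetical success probability for the geometric Y_i
k, reps = 500, 2_000

def geometric(p):
    """Trials up to and including the first success (inverse-transform draw)."""
    return math.ceil(math.log(random.random()) / math.log(1.0 - p))

sums = [sum(geometric(pi_) for _ in range(k)) for _ in range(reps)]

mean_s = sum(sums) / reps                                   # near k / pi_
var_s = sum((s - mean_s) ** 2 for s in sums) / (reps - 1)   # near k (1 - pi_) / pi_**2
print(mean_s, var_s)
```

The empirical mean and variance of the simulated sums should land near $k/\pi \approx 1666.7$ and $k(1-\pi)/\pi^2 \approx 3888.9$.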