Note that if $X\sim U(0,1)$ then $-\ln(X)\sim \text{Exp}(1)$, so $$E(-\ln(X))=V(-\ln(X))=1.$$ Hence $\ln(Y_n)=\frac{1}{n}\sum_{i=1}^n\big[-\ln(X_i)\big]$ is a sample mean of iid $\text{Exp}(1)$ variables, and by the central limit theorem $$\sqrt{n}\Big(\ln(Y_n)-1\Big)\xrightarrow{d} N(0,1).$$ (Equivalently, $\ln(Y_n)$ is approximately $N(1,1/n)$ for large $n$; the limiting distribution itself cannot depend on $n$.) Now apply the delta method with $g(x)=e^x$ to get $$\sqrt{n}\Big(g\big[\ln(Y_n)\big]-g(1)\Big)\xrightarrow{d} N\Big(0,\big(1\cdot g'(1)\big)^2\Big)=N(0,e^2),$$ which is what you're looking for.
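A quick simulation makes the delta-method conclusion concrete. This is just a sketch (the sample sizes and seed are arbitrary choices of mine): since $g(\ln Y_n)=Y_n$ and $g(1)=g'(1)=e$, the quantity $\sqrt{n}(Y_n-e)$ should look approximately $N(0,e^2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1000, 10000

# X_i ~ U(0,1); ln(Y_n) = (1/n) * sum of -ln(X_i), an average of Exp(1) draws
X = rng.uniform(size=(reps, n))
ln_Yn = (-np.log(X)).mean(axis=1)
Yn = np.exp(ln_Yn)

# Delta-method prediction: sqrt(n) * (Y_n - e) is approximately N(0, e^2)
Z = np.sqrt(n) * (Yn - np.e)
print(Z.mean(), Z.std())  # mean near 0, sd near e ~= 2.718
```

The empirical mean sits near $0$ and the empirical standard deviation near $e$, as the delta method predicts.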
You are correct that the classical CLT assumes iid random variables with finite mean and variance. The typical SLLN also requires iid variables, or at least that each variable has the same mean and a finite variance.
There are generalized versions of each that allow you to deal with certain types of "nonstationary" populations.
For both, we will use a sequence of independent random variables $X_1, X_2, \ldots$ with
$$E[X_i]=\mu_i,\;\; V[X_i]=\sigma_i^2,\;\; i\in \mathbb{N}$$
And define the following quantities:
$$S_n := \sum_{i=1}^n X_i,\;\; m_n := \sum_{i=1}^n \mu_i,\;\; s^2_n := \sum_{i=1}^n \sigma_i^2,\;\; s_n := \sqrt{s_n^2} $$
Lindeberg-Feller CLT
There is a CLT in which the variables need only be independent, not identically distributed, provided the following condition is met:
Lindeberg Condition
Let
$$L_{\epsilon}(n) := \frac{1}{s_n^2}\sum_{i=1}^n E\big[(X_i-\mu_i)^2\cdot 1_{|X_i-\mu_i|>\epsilon s_n}\big]$$
Then the condition is
$$\forall \epsilon > 0\;\; \lim_{n\to \infty} L_{\epsilon}(n) = 0$$
We can say that:
$$Z_n := \frac{S_n - m_n}{s_n} \xrightarrow{d} N(0,1)$$
What the Lindeberg condition is basically saying is that the contribution of any single $X_i$ to the overall variability of $Z_n$ becomes negligible as you add more terms.
General SLLN
A similar style theorem holds for almost sure convergence of the mean:
If
$$\sum_{i=1}^{\infty} \frac{\sigma_i^2}{i^2} < \infty$$
Then we have (more or less -- see link)
$$P\left(\lim_{n\to\infty}\frac{|S_n-m_n|}{n} = 0\right) = 1$$
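A quick illustration of this condition, again only a sketch: the growth rate $\sigma_i^2=\sqrt{i}$ below is an arbitrary choice of mine that satisfies it, since $\sum_i \sqrt{i}/i^2=\sum_i i^{-3/2}<\infty$, so $|S_n-m_n|/n$ should shrink toward $0$ along a sample path even though the variances are unbounded.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000

# sigma_i^2 = sqrt(i): then sum sigma_i^2 / i^2 = sum i^(-3/2) < infinity,
# so the condition above holds and (S_n - m_n)/n -> 0 almost surely.
i = np.arange(1, N + 1)
sigma = i ** 0.25                      # sd of X_i
X = rng.normal(loc=1.0, scale=sigma)   # take mu_i = 1 for simplicity

dev = np.abs(np.cumsum(X - 1.0)) / i   # |S_n - m_n| / n along one path
print(dev[999], dev[-1])               # shrinks toward 0 as n grows
```

Here the typical size of $|S_n-m_n|/n$ is of order $n^{-1/4}$, so the decay is visible but slow.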
Best Answer
For example, suppose the $X_n$ are iid standard normal random variables, and $U$ is uniform on $[0,1]$, independent of the $X_n$. Let $T_n = n$ if $U$ lies in an interval $[a_n, a_n+1/n] \subset [0,1]$, and $T_n = 0$ otherwise, where the $a_n$ are arranged so that every point of $[0,1]$ is in infinitely many of the intervals. Let $Y_n = X_n + T_n - T_{n-1}$ (with $Y_1 = X_1 + T_1$). Then the sum telescopes, so the partial sums $S_n = \sum_{i=1}^n Y_i = T_n + \sum_{i=1}^n X_i$ satisfy the conclusion of the Central Limit Theorem: since $P(T_n \ne 0) = 1/n \to 0$, $S_n/\sqrt{n}$ converges in distribution to the standard normal distribution. But they do not satisfy the conclusion of the Strong Law of Large Numbers: $S_n/n$ does not converge a.s. to $0$. Indeed, almost surely $T_n = n$ for infinitely many $n$, and along those $n$ we have $S_n/n = 1 + \frac{1}{n}\sum_{i=1}^n X_i \to 1$, so almost surely $\limsup_{n \to \infty} S_n/n \ge 1$.
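The construction can be made concrete in code. This is a sketch under one assumption of mine that the answer leaves open: the $a_n$ are laid end to end mod 1 ($a_{n+1} = a_n + 1/n \bmod 1$), which covers every point of $[0,1]$ infinitely often because $\sum 1/n$ diverges.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 5000
U = rng.uniform()
X = rng.normal(size=N)

# One concrete arrangement of the intervals [a_n, a_n + 1/n] (an assumption,
# not from the answer): lay them end to end mod 1.  Since sum 1/n diverges,
# every point of [0,1] is covered infinitely often.
a = np.zeros(N)
for n in range(1, N):
    a[n] = (a[n - 1] + 1.0 / n) % 1.0

ns = np.arange(1, N + 1)
# T_n = n if U lies in the n-th interval (wrapping mod 1), else 0
in_interval = ((U - a) % 1.0) <= 1.0 / ns
T = np.where(in_interval, ns, 0)

# Y_n = X_n + T_n - T_{n-1} telescopes, so S_n = T_n + sum_{i<=n} X_i
Y = X + T - np.concatenate(([0], T[:-1]))
S = np.cumsum(Y)
assert np.allclose(S, T + np.cumsum(X))

hits = np.flatnonzero(T > 0) + 1
print(hits[:10])   # indices n with T_n = n; at those n, S_n/n is near 1
```

Running this shows a handful of hit times up to $N=5000$ (roughly $\ln N$ of them), and at each hit $S_n/n$ jumps back up near $1$, which is exactly why $S_n/n$ fails to converge to $0$.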