Wikipedia says –
In probability theory, the central limit theorem (CLT) establishes that, in most situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution (informally a "bell curve") even if the original variables themselves are not normally distributed…
When it says "in most situations", in which situations does the central limit theorem not work?
Best Answer
To understand this, you need to first state a version of the Central Limit Theorem. Here's the "typical" statement of the central limit theorem: let $X_1, X_2, \dots$ be independent, identically distributed random variables with mean $\mu$ and finite variance $\sigma^2 > 0$. Then

$$\frac{\sum_{i=1}^n X_i - n\mu}{\sigma\sqrt{n}} \;\longrightarrow\; N(0,1)$$

in distribution as $n \to \infty$.
So, how does this differ from the informal description, and what are the gaps? There are several differences between the informal description and this statement, some of which have been discussed in other answers, but not completely. So, we can turn this into three specific questions:

1. What happens if the variables are not identically distributed?
2. What happens if the variables have infinite variance?
3. What happens if the variables are not independent?
Taking these one at a time,
**Not identically distributed.** The best general results are the Lindeberg and Lyapunov versions of the central limit theorem. Basically, as long as the individual variances don't grow so wildly that a handful of terms dominate the sum, you can still get a decent central limit theorem out of it.
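As an illustrative sketch (the setup and parameters here are my own, not from the answer): simulate independent but *not* identically distributed variables, say uniforms with slowly growing scales so that no single term dominates, and check that the standardized sum still behaves like a standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)

n_vars = 500       # number of independent, non-identical summands
n_trials = 20000   # number of simulated sums

# X_i ~ Uniform(-a_i, a_i) with slowly growing scales a_i = i**0.25:
# variances differ, but no single term dominates the total variance.
scales = np.arange(1, n_vars + 1) ** 0.25
x = rng.uniform(-1, 1, size=(n_trials, n_vars)) * scales

# Var(Uniform(-a, a)) = a**2 / 3; standardize the sum by its exact sd.
total_var = np.sum(scales**2 / 3)
z = x.sum(axis=1) / np.sqrt(total_var)

print(z.mean(), z.std())          # ≈ 0 and ≈ 1
print(np.mean(np.abs(z) < 1.96))  # ≈ 0.95, as for a standard normal
```

The empirical mean, standard deviation, and tail fraction all match the standard normal, even though the summands have different distributions.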
**Infinite variance.** Theorems similar to the central limit theorem exist for variables with infinite variance, but the conditions are significantly narrower than for the usual central limit theorem. Essentially, the tail of the probability distribution must be asymptotic to $|x|^{-\alpha-1}$ for $0 < \alpha < 2$. In this case, appropriately scaled sums converge to a Lévy alpha-stable distribution.
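A minimal sketch of the infinite-variance case (my own illustration, not part of the answer): the standard Cauchy distribution corresponds to $\alpha = 1$, and the sample mean of $n$ Cauchy draws has exactly the same Cauchy distribution as a single draw, so averaging never produces a bell curve or shrinks the spread.

```python
import numpy as np

rng = np.random.default_rng(0)

n, trials = 1000, 20000

# Sample means of n standard Cauchy draws (infinite variance, alpha = 1).
means = rng.standard_cauchy(size=(trials, n)).mean(axis=1)

# A standard Cauchy has interquartile range exactly 2 (quartiles at +/-1).
# The means reproduce it, instead of shrinking like 1/sqrt(n) under the CLT.
q25, q75 = np.quantile(means, [0.25, 0.75])
print(q75 - q25)   # ≈ 2, the same spread as a single draw
```

Compare this with a finite-variance distribution, where the interquartile range of the mean would shrink by a factor of roughly $\sqrt{1000} \approx 32$.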
**Importance of independence.** There are many different central limit theorems for non-independent sequences of $X_i$, and they are all highly contextual. As Batman points out, there's one for martingales. This is an ongoing area of research, with many, many different variations depending upon the specific context of interest. This question on Math Stack Exchange is another post related to this topic.
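To see why independence matters at all, here is a toy sketch (mine, not from the answer) of an extreme dependent case: every $X_i$ equals the same draw $X_1$, so the usual $\sqrt{n}$ normalization fails and the normalized sum's spread grows like $\sqrt{n}$ instead of stabilizing at 1.

```python
import numpy as np

rng = np.random.default_rng(0)

n, trials = 100, 20000

# Extreme dependence: X_1 = X_2 = ... = X_n, each trial draws one
# value of +/-1 with equal probability and repeats it n times.
x1 = rng.choice([-1.0, 1.0], size=trials)
s_n = n * x1                 # sum of n perfectly correlated copies

z = s_n / np.sqrt(n)         # the usual CLT normalization
print(z.std())               # ≈ sqrt(n) = 10, not 1
# z only ever takes the two values +/- sqrt(n): nothing like a bell curve.
```

Martingale CLTs and mixing conditions sit between this degenerate extreme and full independence, quantifying how quickly the dependence must decay for a normal limit to survive.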