[Math] Convergence in Distribution of Sums of Random Variables

Tags: central limit theorem, probability, probability distributions, probability theory, statistics

Suppose I have $X_1,X_2,…,X_n$ random variables that are independent and identically distributed, from ANY distribution. Suppose that $E(X_i)=\mu$ and $V(X_i)=\sigma^2$.

Suppose I define the following random variable:

$$Y=\sum_{i=1}^nX_i$$

What is the limiting distribution of $Y$? That is, as $n$ goes to infinity, what distribution can $Y$ be approximated by?

My intuition tells me that $Y\rightarrow N(n\mu,n\sigma^2)$. In other words, say $200$ is a sufficiently large value for $n$. Then I could approximate $Y$ by a normal distribution with mean $200\mu$ and variance $200\sigma^2$. Is this true, and if so, how can you prove it? If not, what is the limiting distribution of $Y$?

Best Answer

Any statement that says $\lim_{n\to\infty}(\cdots\cdots) = (\text{something depending on $n$})$ is wrong if taken literally, and usually wrong if taken any other way.

The distribution $N(n\mu,n\sigma^2)$ depends on $n$ and does not approach a limit as $n$ grows.

However, the distribution of $$ \frac{Y-n\mu}{\sigma\sqrt n} \tag 1 $$ does approach a limit as $n$ grows (unless $\sigma^2=\infty,$ as happens for some distributions). That limit is $N(0,1).$
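A quick Monte Carlo sketch of this claim: draw $n = 200$ i.i.d. Exponential(1) variables (so $\mu = 1,$ $\sigma = 1$), form the standardized sum $(1),$ and check that its sample mean and variance are near those of $N(0,1).$ The function name and parameter choices here are illustrative, not from the original post.

```python
import random
import math
import statistics

def standardized_sum(n, trials, seed=0):
    # Simulate (Y - n*mu) / (sigma * sqrt(n)) for Y a sum of n
    # Exponential(1) draws, which have mean mu = 1 and variance sigma^2 = 1.
    rng = random.Random(seed)
    mu, sigma = 1.0, 1.0
    out = []
    for _ in range(trials):
        y = sum(rng.expovariate(1.0) for _ in range(n))
        out.append((y - n * mu) / (sigma * math.sqrt(n)))
    return out

samples = standardized_sum(n=200, trials=20000)
print(statistics.mean(samples))      # close to 0
print(statistics.variance(samples))  # close to 1
```

Note that it is the *standardized* quantity $(1)$ whose distribution stabilizes; the raw sum $Y$ itself just spreads out with mean $n\mu$ and variance $n\sigma^2.$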

This may be understood as meaning that the c.d.f. of $(1)$ converges pointwise to the c.d.f. of $N(0,1).$ If the limit were a distribution that concentrates positive probability at some points, it would be understood as meaning that the c.d.f. converges pointwise except at points where the limiting distribution assigns positive probability.
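The pointwise c.d.f. convergence can also be checked empirically: estimate the c.d.f. of $(1)$ at a few points from simulated Uniform$(0,1)$ summands (where $\mu = 1/2,$ $\sigma^2 = 1/12$) and compare with the standard normal c.d.f. $\Phi.$ The helper names below are illustrative assumptions, not part of the answer.

```python
import random
import math

def phi(x):
    # Standard normal c.d.f. via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def empirical_cdf_gap(n, trials, points, seed=1):
    # Largest deviation, over the given points t, between the empirical
    # c.d.f. of the standardized sum and Phi(t).
    rng = random.Random(seed)
    mu, sigma = 0.5, math.sqrt(1.0 / 12.0)
    z = []
    for _ in range(trials):
        y = sum(rng.random() for _ in range(n))
        z.append((y - n * mu) / (sigma * math.sqrt(n)))
    return max(abs(sum(v <= t for v in z) / trials - phi(t)) for t in points)

gap = empirical_cdf_gap(n=200, trials=20000, points=[-2, -1, 0, 1, 2])
print(gap)  # small, and shrinks as n and trials grow
```

Since $N(0,1)$ is continuous, it assigns no positive probability to any point, so the convergence here holds at every point $t.$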