By the Central Limit Theorem, the probability density function of the sum of a large number of independent random variables tends to a Normal distribution. Can we therefore say that the sum of a large number of independent Cauchy random variables is also Normal?
Solved – Is the sum of a large number of independent Cauchy random variables Normal
cauchy-distribution, central-limit-theorem, random-variable
Related Solutions
You need to know what a stochastic process is. In this context, it's just a collection of random variables $(X_0, X_1, X_2, \ldots)$.
Seeing a simple worked example may help. Let's set it up. Suppose you have a collection of independent variables $\mathbf Y = (Y_0, Y_1, \ldots)$, all with the same distribution. For instance, each $Y_i$ could represent the flip of a fair coin using (say) $1$ for heads and $0$ for tails. That's a stochastic process (which we could call a "Bernoulli process").
You can construct new processes out of old. One way is to convert $\mathbf Y$ into its cumulative sum
$$\mathbf X = (Y_0, Y_0+Y_1, Y_0+Y_1+Y_2, \ldots)$$
This is a random walk.
As an example, let's consider a finite random walk of length $3$ based on fair coins. That Bernoulli process $\mathbf Y$ has eight possible outcomes, aka "walks" or "paths," each with equal probabilities of $1/8$:
$$(0,0,0),\ (0,0,1),\ (0,1,0),\ (0,1,1),\ (1,0,0),\ (1,0,1),\ (1,1,0),\ (1,1,1).$$
The associated paths of $\mathbf X$, computed by taking cumulative sums, are therefore
$$(0,0,0),\ (0,0,1),\ (0,1,1),\ (0,1,2),\ (1,1,1),\ (1,1,2),\ (1,2,2),\ (1,2,3).$$
If you like, you can now identify the component variables $X_i$. For instance, $X_0$ takes on the value $0$ four times, for a total probability of $4\times 1/8=1/2$, and the value $1$ four times, for a total probability of $1/2$. $X_1$ takes on the values $0, 1,$ and $2$ with probabilities $1/4, 1/2, 1/4$, respectively. And $X_2$ takes on the values $0,1,2,3$ with probabilities $1/8, 3/8, 3/8, 1/8$, respectively. Notice that these three variables do not have identical distributions. The distributions have different means and variances, too: their means are $1/2, 1, 3/2$ (in order) and their variances are $1/4, 1/2, 3/4$ (in order).
The component variables in a random walk also are dependent. For instance, given that $X_1=0$ (which occurs only in the paths $(0,0,0)$ and $(0,0,1)$), the chance that $X_2=0$ is $1/2$. But given that $X_1=1$, the chance that $X_2=0$ is now zero: it's just not possible. Because these conditional probabilities vary with the value of $X_1$, $X_1$ and $X_2$ are not independent. In fact, no pair of these component variables is independent.
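If you want to verify these numbers mechanically, here is a minimal Python sketch (illustrative only, using just the standard library) that enumerates the eight paths, tallies the marginal distributions, means, and variances of each $X_i$, and checks the conditional probabilities that show the dependence:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Enumerate all 2^3 equally likely Bernoulli paths (Y_0, Y_1, Y_2).
paths = list(product([0, 1], repeat=3))
prob = Fraction(1, len(paths))  # each path has probability 1/8

def cumsum(y):
    # Cumulative sums turn a Bernoulli path into a random-walk path.
    out, total = [], 0
    for v in y:
        total += v
        out.append(total)
    return tuple(out)

walks = [cumsum(y) for y in paths]

# Marginal distribution, mean, and variance of each X_i.
for i in range(3):
    counts = Counter(w[i] for w in walks)
    pmf = {value: count * prob for value, count in sorted(counts.items())}
    mean = sum(v * p for v, p in pmf.items())
    var = sum((v - mean) ** 2 * p for v, p in pmf.items())
    print(f"X_{i}: pmf={pmf}, mean={mean}, var={var}")

# Dependence: P(X_2 = 0 | X_1 = 0) = 1/2, but P(X_2 = 0 | X_1 = 1) = 0.
for x1 in (0, 1):
    given = [w for w in walks if w[1] == x1]
    p = Fraction(sum(w[2] == 0 for w in given), len(given))
    print(f"P(X_2=0 | X_1={x1}) = {p}")
```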
The Central Limit Theorem makes a statement about the distribution of $X_n$ when $n$ gets very large. Besides assuming the $Y_i$ (out of which the $X_n$ are constructed) are independent and identically distributed, it has to assume that this common distribution has a finite variance. The concept of a stochastic process is separate from any idea of limits (which wouldn't even make sense for a finite one, as in the example). The CLT holds only for very special processes.
To understand this, you first need to state a version of the Central Limit Theorem. Here's the "typical" statement:
Lindeberg–Lévy CLT. Suppose $X_1, X_2, \dots$ is a sequence of i.i.d. random variables with $E[X_i] = \mu$ and $\operatorname{Var}[X_i] = \sigma^2 < \infty$. Let $S_{n} := \frac{X_{1}+\cdots+X_{n}}{n}$. Then as $n$ approaches infinity, the random variables $\sqrt{n}(S_n - \mu)$ converge in distribution to a normal $N(0,\sigma^2)$, i.e.
$${\displaystyle {\sqrt {n}}\left(\left({\frac {1}{n}}\sum _{i=1}^{n}X_{i}\right)-\mu \right)\ {\xrightarrow {d}}\ N\left(0,\sigma ^{2}\right).}$$
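As an illustrative check (a sketch, not part of the theorem), here is a small numpy simulation using Uniform(0, 1) summands, an arbitrary choice with $\mu = 1/2$ and $\sigma^2 = 1/12$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 1_000, 5_000

# i.i.d. Uniform(0, 1) summands: mu = 1/2, sigma^2 = 1/12.
mu, sigma2 = 0.5, 1 / 12
samples = rng.uniform(0, 1, size=(reps, n))

# sqrt(n) * (sample mean - mu) should be approximately N(0, sigma^2).
z = np.sqrt(n) * (samples.mean(axis=1) - mu)
print("sample mean of z:", z.mean())                    # close to 0
print("sample var of z:", z.var(), "target:", sigma2)   # close to 1/12
```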
So how does this differ from the informal description in the question, and what are the gaps? There are several differences, some of which have been discussed in other answers, but not completely. We can turn them into three specific questions:
- What happens if the variables are not identically distributed?
- What if the variables have infinite variance, or infinite mean?
- How important is independence?
Taking these one at a time:

**Not identically distributed.** The best general results are the Lindeberg and Lyapunov versions of the central limit theorem. Basically, as long as the standard deviations don't grow too wildly, you can get a decent central limit theorem out of it.
Lyapunov CLT. Suppose $X_1, X_2, \dots$ is a sequence of independent random variables, each with finite expected value $\mu_i$ and variance $\sigma_i^2$. Define $s_{n}^{2}=\sum _{i=1}^{n}\sigma _{i}^{2}$.

If for some $\delta > 0$ Lyapunov's condition $${\lim _{n\to \infty }{\frac {1}{s_{n}^{2+\delta }}}\sum_{i=1}^{n}\operatorname {E} \left[|X_{i}-\mu _{i}|^{2+\delta }\right]=0}$$ is satisfied, then $\frac{1}{s_n}\sum_{i=1}^{n}(X_i - \mu_i)$ converges in distribution to a standard normal random variable as $n$ goes to infinity:

$${{\frac {1}{s_{n}}}\sum _{i=1}^{n}\left(X_{i}-\mu_{i}\right)\ {\xrightarrow {d}}\ N(0,1).}$$
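Here is a minimal simulation sketch of this with an arbitrarily chosen mix of non-identical summands (alternating Uniform(0, 1) and Exponential(1) variables, both of which satisfy Lyapunov's condition):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 2_000, 5_000

# Independent but NOT identically distributed summands:
# even indices i ~ Uniform(0, 1)   (mu_i = 1/2, sigma_i^2 = 1/12)
# odd  indices i ~ Exponential(1)  (mu_i = 1,   sigma_i^2 = 1)
idx = np.arange(n)
means = np.where(idx % 2 == 0, 0.5, 1.0)
varis = np.where(idx % 2 == 0, 1 / 12, 1.0)
s_n = np.sqrt(varis.sum())

x = np.empty((reps, n))
x[:, 0::2] = rng.uniform(0, 1, size=(reps, (n + 1) // 2))
x[:, 1::2] = rng.exponential(1.0, size=(reps, n // 2))

# (1/s_n) * sum_i (X_i - mu_i) should be approximately N(0, 1).
z = (x - means).sum(axis=1) / s_n
print("mean:", z.mean())     # close to 0
print("variance:", z.var())  # close to 1
```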
**Infinite variance.** Theorems similar to the central limit theorem exist for variables with infinite variance, but the conditions are significantly narrower than for the usual central limit theorem. Essentially the tail of the probability distribution must be asymptotic to $|x|^{-\alpha-1}$ for $0 < \alpha < 2$. In this case, appropriately scaled summands converge to a Lévy alpha-stable distribution.
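A rough numerical sketch of this (the symmetric Pareto tail with $\alpha = 1.5$ is an arbitrary choice): scaling the sum by $n^{1/\alpha}$ rather than $\sqrt{n}$ stabilizes its spread, while the tails remain far heavier than any Gaussian.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = 1.5  # tail exponent: P(|X| > x) ~ x**(-alpha), so the variance is infinite
reps = 1_000

def normalized_sum(n):
    # Symmetric Pareto-tailed summands (an example distribution in the
    # domain of attraction of a symmetric 1.5-stable law).
    signs = rng.choice([-1.0, 1.0], size=(reps, n))
    x = signs * rng.pareto(alpha, size=(reps, n))
    # Stable-law scaling is n**(1/alpha), not the CLT's sqrt(n).
    return x.sum(axis=1) / n ** (1 / alpha)

for n in (100, 10_000):
    z = normalized_sum(n)
    iqr = np.subtract(*np.percentile(z, [75, 25]))
    tail = np.mean(np.abs(z) > 10)
    print(f"n={n:>6}: IQR={iqr:.2f}, P(|Z| > 10)={tail:.3f}")

# The IQR stabilizes under the n**(1/alpha) scaling, while P(|Z| > 10) stays
# near 10**(-1.5) ~ 0.03 -- orders of magnitude above any Gaussian tail.
```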
**Importance of independence.** There are many different central limit theorems for non-independent sequences of $X_i$. They are all highly contextual. As Batman points out, there's one for martingales. This is an ongoing area of research, with many, many variations depending on the specific context of interest. This question on Mathematics Stack Exchange is another related post.
Best Answer
No.
You're missing one of the central assumptions of the central limit theorem:
The Cauchy distribution does not have a finite variance.
In fact, the mean of $n$ independent standard Cauchy variables is itself standard Cauchy, for every $n$: averaging does not reduce the spread at all, no matter how many terms you add.
So the situation in your question is quite clear-cut: you just keep getting back the same Cauchy distribution.
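A quick numerical check of this (a sketch with numpy): the sample mean of $n$ standard Cauchy draws has the same interquartile range, exactly $2$ for a standard Cauchy, no matter how large $n$ gets.

```python
import numpy as np

rng = np.random.default_rng(3)
reps = 10_000

# The sample mean of n standard Cauchy draws is again standard Cauchy,
# whose quartiles are exactly -1 and +1 (IQR = 2). So the spread of the
# mean should NOT shrink as n grows.
for n in (1, 10, 1_000):
    means = rng.standard_cauchy(size=(reps, n)).mean(axis=1)
    q25, q75 = np.percentile(means, [25, 75])
    print(f"n={n:>5}: IQR of sample mean = {q75 - q25:.2f}")  # all close to 2
```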
Yes. A (strictly) stable distribution (or random variable) is one for which any linear combination $a X_1 + b X_2$ of two i.i.d. copies is distributed proportionally to the original distribution. The Cauchy distribution is indeed strictly stable.
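For the standard Cauchy this can be checked directly: if $X_1, X_2$ are i.i.d. standard Cauchy and $a, b > 0$, then $aX_1 + bX_2$ has the same distribution as $(a+b)X$ for a single standard Cauchy $X$. A short simulation sketch (the constants $a = 2$, $b = 3$ are arbitrary) comparing central quantiles:

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, reps = 2.0, 3.0, 200_000

x1 = rng.standard_cauchy(reps)
x2 = rng.standard_cauchy(reps)
combo = a * x1 + b * x2                       # linear combination of i.i.d. copies
scaled = (a + b) * rng.standard_cauchy(reps)  # single copy, scaled by a + b

# Compare central quantiles (extreme quantiles are too noisy for Cauchy).
qs = [10, 25, 50, 75, 90]
print(np.percentile(combo, qs).round(2))
print(np.percentile(scaled, qs).round(2))     # should match closely
```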
(*) Quotations from Wikipedia.