How to Prove Normal Distribution with Selberg’s Central Limit Theorem

Tags: analytic-number-theory, gaussian, nt.number-theory, pr.probability, probabilistic-number-theory

Given a random variable $Y:\Omega \to \mathbb{R}$ with finite mean $\mu$ and finite, positive variance $\sigma^2$, let $X = \frac{Y-\mu}{\sigma}$ be its renormalization, with mean $0$ and variance $1$. What are some general techniques for showing that $X$ has a standard normal distribution? That is,
$$P(X\leqslant a) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^a e^{-t^2/2}\,dt.$$

The standard technique I know is to compute the moments or cumulants and then use the fact that the normal distribution is characterized by its moments/cumulants. Are there any other general techniques, and what are their advantages and disadvantages?
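To make the moment-matching criterion concrete, here is a minimal NumPy sketch (the choice of a normalized sum of uniforms is purely illustrative): it compares empirical moments of a renormalized random variable against the standard Gaussian moments, $\mathbb{E}[X^k] = (k-1)!!$ for even $k$ and $0$ for odd $k$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative example: X = normalized sum of many iid uniforms,
# which the classical CLT says is approximately standard normal.
n_terms, n_samples = 200, 50_000
U = rng.uniform(-1.0, 1.0, size=(n_samples, n_terms))
X = U.sum(axis=1) / np.sqrt(n_terms / 3.0)  # Var(Uniform(-1,1)) = 1/3

# Standard Gaussian moments: E[X^k] = 0 for odd k, (k-1)!! for even k.
def gaussian_moment(k):
    return 0.0 if k % 2 else float(np.prod(np.arange(k - 1, 0, -2)))

for k in range(1, 7):
    print(k, round(float(np.mean(X**k)), 3), gaussian_moment(k))
```

The empirical moments should track $0, 1, 0, 3, 0, 15$ up to sampling error, which is the kind of convergence statement the method of moments turns into a distributional one.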


The motivation for this question is number-theoretic (as with this related question), hence the number theory tags. Specifically, the motivating theorem is Selberg's central limit theorem (first published in Tsang's thesis; see also this article of Radziwill-Soundararajan), which states that for large $T$, the real-valued random variable on $[T,2T]$ given by $t \mapsto \log|\zeta(\tfrac12 + it)|$ is approximately normally distributed with mean $0$ and variance $\frac{1}{2}\log\log T$.

Both of the proofs I know (Selberg's original proof, and that of Radziwill-Soundararajan) use the method of moments. Morally speaking, the analytic-number-theoretic input goes into showing that the contribution from the zeros of zeta can be controlled, so that, at least for the distributional question with $t \in [T,2T]$,
$$\log|\zeta(\tfrac12 + it)| \simeq \Re\sum_{p\leqslant T^{o(1)}} \frac{1}{p^{1/2 + it}}.$$
One can then compute the moments of the right-hand side and show that, as $T \to \infty$, the appropriately normalized moments converge to the moments of a standard Gaussian.
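As a numerical sanity check on this heuristic, the following NumPy sketch samples $t$ uniformly from $[T,2T]$, evaluates $\Re\sum_{p\leqslant P} p^{-1/2-it} = \sum_{p\leqslant P} \cos(t\log p)/\sqrt{p}$, and compares the empirical variance with the prediction $\frac12\sum_{p\leqslant P} 1/p \approx \frac12\log\log P$. The cutoffs $T = 10^6$ and $P = 1000$ are illustrative choices, not the $T^{o(1)}$ of the theorem.

```python
import numpy as np

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = np.ones(n + 1, dtype=bool)
    sieve[:2] = False
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = False
    return np.flatnonzero(sieve)

rng = np.random.default_rng(1)
T, P = 1e6, 1000  # illustrative cutoffs, not the theorem's T^{o(1)}
p = primes_up_to(P).astype(float)
t = rng.uniform(T, 2 * T, size=20_000)

# Re sum_{p <= P} p^{-1/2 - it} = sum_p cos(t log p) / sqrt(p)
S = (np.cos(np.outer(t, np.log(p))) / np.sqrt(p)).sum(axis=1)

# Heuristic: mean ~ 0, variance ~ (1/2) sum_{p <= P} 1/p,
# since the cos(t log p) behave like independent mean-0, variance-1/2 terms.
pred = 0.5 * (1.0 / p).sum()
print("mean:", S.mean(), " var:", S.var(), " predicted var:", pred)
```

The empirical variance lands close to $\frac12\sum_{p\leqslant P} 1/p$, reflecting the approximate independence of the $\cos(t\log p)$ for distinct primes when $t$ ranges over a long interval.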

The hope is to see if there's a way to prove Selberg's CLT in a situation where the moments are harder to compute, and so the method of moments may not be tractable.

Best Answer

There are multiple books about ways to characterize the normal distribution. For instance, Bryc’s book starts with Herschel-Maxwell’s theorem:

If $X$ and $Y$ are independent variables whose joint distribution is rotationally invariant, then $X$ and $Y$ are both normal.

He immediately notes that one can strengthen this to Polya’s theorem:

If $X$ and $Y$ are independent variables, and rotations by $\pi/4$ and $\pi/2$ of the pair $(X,Y)$ leave the distribution of $X$ invariant, then $X$ and $Y$ are both normal.
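The easy converse direction of these rotational characterizations is simple to check numerically. This sketch (sample sizes are arbitrary) rotates an independent standard Gaussian pair by $\pi/4$ and compares quantiles of the first coordinate before and after.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)

# First coordinate after rotating (X, Y) by pi/4: (X + Y)/sqrt(2).
# For independent standard Gaussians this has the same distribution as X,
# which is the invariance that Polya's theorem shows characterizes normality.
Xr = (X + Y) / np.sqrt(2)

for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(q, round(float(np.quantile(X, q)), 3),
             round(float(np.quantile(Xr, q)), 3))
```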

Perhaps somewhere in such books you’ll find a characterization that avoids moments but is number-theoretically tractable.
