Does a bounded section of the normal distribution converge to the uniform distribution?

convergence-divergence, normal-distribution, probability-distributions

It was once asked on CrossValidated whether the normal distribution converges to a uniform distribution when the standard deviation grows to infinity. (The answer was no.) I am curious about a related yet slightly different question. Suppose I have an arbitrarily chosen fixed interval $[A, B)$. One can consider the uniform distribution on that interval, with density $1/(B-A)$ for $A \le x < B$. One can also consider the centered normal distribution $\mathcal{N}(0, \sigma^2)$, with pdf $f(x)$. We are instead interested in the distribution obtained by truncating this normal distribution to $[A, B)$, with pdf $g(x) \propto f(x)$ for $A \le x < B$ and $0$ elsewhere. As $\sigma \rightarrow \infty$, does $g(x)$ converge to the uniform distribution with density $1/(B-A)$?

Best Answer

Yes, even uniformly. Nothing fancy is needed; we just need to formalize the intuitive idea that as $\sigma \to \infty$ the Gaussian density on any fixed interval becomes closer and closer to constant.

On the interval $[A, B]$ the (unnormalized) Gaussian density $f(x) = \exp \left( - \frac{x^2}{2\sigma^2} \right)$ is bounded from above by $1$ and bounded from below by $\exp \left( - \frac{\max(A^2, B^2)}{2\sigma^2} \right)$ (this slightly awkward expression is needed to handle the case that $A$ is negative and $B$ is positive), and as $\sigma \to \infty$ the lower bound converges to $1$. This gives that the normalized density is bounded from above and below by

$$ \frac{\exp \left( - \frac{\max(A^2, B^2)}{2\sigma^2} \right)}{B - A} \le g(x) = \frac{\exp \left( - \frac{x^2}{2\sigma^2} \right)}{\int_A^B \exp \left( - \frac{t^2}{2\sigma^2} \right) \, dt} \le \frac{1}{(B - A) \exp \left( - \frac{\max(A^2, B^2)}{2\sigma^2} \right)}$$

so as $\sigma \to \infty$ both bounds converge to $\frac{1}{B - A}$, and hence $g(x)$ converges uniformly to $\frac{1}{B - A}$, as desired.
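As a quick numerical sanity check of the uniform convergence, here is a short sketch that evaluates the truncated density on a grid and reports its largest deviation from $1/(B-A)$ as $\sigma$ grows. The interval $[A, B) = [-1, 2)$ is an arbitrary choice for illustration.

```python
import numpy as np

def truncated_normal_pdf(x, sigma, A, B):
    """Density of N(0, sigma^2) truncated to [A, B], normalized numerically."""
    grid = np.linspace(A, B, 100001)
    f = np.exp(-grid**2 / (2 * sigma**2))
    dx = grid[1] - grid[0]
    # Trapezoidal estimate of the normalizing integral over [A, B]
    Z = np.sum((f[:-1] + f[1:]) / 2) * dx
    return np.exp(-x**2 / (2 * sigma**2)) / Z

# Arbitrarily chosen fixed interval (an assumption for illustration)
A, B = -1.0, 2.0
x = np.linspace(A, B, 1001)
for sigma in [1, 10, 100]:
    sup_dev = np.max(np.abs(truncated_normal_pdf(x, sigma, A, B) - 1 / (B - A)))
    print(f"sigma = {sigma:>3}: sup |g(x) - 1/(B-A)| = {sup_dev:.3e}")
```

The printed supremum deviation shrinks roughly like $\max(A^2, B^2)/(2\sigma^2)$, matching the sandwich bounds above.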

This argument shows that $A$ and $B$ don't even need to be fixed and can grow slowly (sublinearly) with $\sigma$, e.g. we could have $A = -B$ with $B = O(\sqrt{\sigma})$, since then $\max(A^2, B^2)/\sigma^2 \to 0$ and the sandwich bounds still converge.
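To illustrate the growing-interval remark numerically, the sketch below uses the hypothetical choice $A = -B$, $B = \sqrt{\sigma}$. Since the uniform density $1/(2B)$ itself shrinks as the interval grows, it compares $g$ to the uniform density in relative terms.

```python
import numpy as np

def max_relative_deviation(sigma):
    # Interval endpoints grow with sigma: [-sqrt(sigma), sqrt(sigma)]
    # (a hypothetical choice with B = O(sqrt(sigma)))
    B = np.sqrt(sigma)
    grid = np.linspace(-B, B, 200001)
    f = np.exp(-grid**2 / (2 * sigma**2))
    dx = grid[1] - grid[0]
    Z = np.sum((f[:-1] + f[1:]) / 2) * dx  # trapezoidal normalizing constant
    # Relative deviation of g from the uniform density 1/(2B)
    return np.max(np.abs((f / Z) * (2 * B) - 1))

for sigma in [1, 10, 100, 1000]:
    print(f"sigma = {sigma:>4}: max relative deviation = {max_relative_deviation(sigma):.3e}")
```

The relative deviation decays on the order of $B^2/(2\sigma^2) = 1/(2\sigma)$, consistent with the bound in the argument above.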
