If $X$ is Gaussian, prove that $X-\lfloor X \rfloor \sim U(0,1)$ as its variance becomes large

central limit theorem, probability, uniform distribution

I have a normally distributed random variable $X$ with mean $\mu$ and standard deviation $\sigma$. I don't believe it matters, but this distribution was obtained by summing a large number of independent, identically distributed random variables with finite variance (hence invoking the central limit theorem).

It seems intuitive that $X - \lfloor X \rfloor$ should get closer and closer to a uniform random variable on $(0,1)$ as the variance of $X$ increases, and that in the limit it should become exactly uniform. Is there a proof of this claim, or a refutation of it?
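As a quick numerical sanity check of this intuition (a sketch, not a proof; the mean, standard deviation, sample size, and seed below are arbitrary illustrative choices), one can sample a large-variance normal and test the fractional parts against $U(0,1)$:

```python
# Monte Carlo sanity check (illustrative parameters, not a proof):
# the fractional part of a large-variance normal should look uniform.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)          # arbitrary seed
mu, sigma = 0.3, 5.0                    # arbitrary mean, large-ish sigma
x = rng.normal(mu, sigma, size=100_000)
frac = x - np.floor(x)                  # X - floor(X), lies in [0, 1)

# Kolmogorov-Smirnov test against U(0,1); a large p-value means the
# sample is statistically indistinguishable from uniform.
print(stats.kstest(frac, "uniform"))
```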


Context: this is going to help "complete" the accepted answer here: As the variance of a random variable grows, the conditional distribution of it residing in an interval of length $1$ becomes uniform. Larger picture, I'm trying to prove Blackwell's theorem from renewal theory. See here for details: Going "well into the lifetime" of a renewal process means the time until the next event will be uniform conditional on inter-arrival?

Best Answer

Here I confirm the claimed weak convergence to $U[0,1]$.
Moreover, I present an upper bound on the convergence rate, which shows the convergence is extremely rapid.


For simplicity, assume WLOG that $\mu=0$.
Let $(X_{\sigma}, \sigma \in \mathbb{R}_+)$ be the corresponding family of random variables, write $\{x\} = x - \lfloor x \rfloor$ for the fractional part, and let $f_{\sigma}$ be the density function of $\{ X_{\sigma} \}$.
For any positive number $\sigma$ and integer $n$, since $e^{-2i\pi n \lfloor X_{\sigma} \rfloor}=1$, we have: $$\int_{0}^1 f_{\sigma}(t)e^{-2i\pi n t}\,dt = \mathbb{E}\left(e^{-2i\pi n \{X_{\sigma}\}}\right) = \mathbb{E}\left(e^{-2i\pi n X_{\sigma}}\right) = e^{-2\pi^2 n^2 \sigma^2},$$ the last equality being the Gaussian characteristic function $\mathbb{E}(e^{itX_{\sigma}})=e^{-\sigma^2 t^2/2}$ evaluated at $t=-2\pi n$. Because $\sum_{n \in \mathbb{Z}} \left| e^{-2\pi^2 n^2 \sigma^2}\right|^2<\infty$, the Riesz–Fischer theorem gives $$f_{\sigma} \in L^2([0,1]).$$ Parseval's identity then yields: $$\int_{0}^1 |f_{\sigma}(t)-1|^2\,dt=\sum_{n \in \mathbb{Z}} \left| e^{-2\pi^2 n^2 \sigma^2}-\mathbb{1}_{\{n=0\}}\right|^2=\sum_{n \ge 1}2e^{-4\pi^2n^2\sigma^2}\xrightarrow[]{\sigma \rightarrow \infty} 0.$$ Hence, $$\{ X_{\sigma} \} \xrightarrow[\sigma \rightarrow \infty]{\text{(d)}} \mathcal{U}([0,1]).$$
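To make the key identity concrete, here is a numerical sketch (the quadrature settings, the value of $\sigma$, and the truncation $K$ of the wrapping sum are ad hoc choices): the density of $\{X_{\sigma}\}$ is the normal density wrapped onto $[0,1)$, and its Fourier coefficients should match $e^{-2\pi^2 n^2 \sigma^2}$.

```python
# Numerical check of: int_0^1 f_sigma(t) e^{-2 i pi n t} dt = exp(-2 pi^2 n^2 sigma^2).
# The density of {X_sigma} is the normal density wrapped onto [0, 1).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

sigma = 0.4   # small enough that the coefficients exceed quadrature noise

def f_sigma(t, K=50):
    # wrapped normal density: sum over k of phi_sigma(t + k), truncated at |k| <= K
    k = np.arange(-K, K + 1)
    return norm.pdf(t + k, scale=sigma).sum()

for n in (1, 2):
    # by symmetry of the mean-zero wrapped density the imaginary (sine) part
    # vanishes, so only the cosine (real) part is computed
    re, _ = quad(lambda t: f_sigma(t) * np.cos(2 * np.pi * n * t), 0, 1)
    print(n, re, np.exp(-2 * np.pi**2 * n**2 * sigma**2))
```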

In particular, for any bounded measurable function $g$, the computation above yields the rate of convergence

$$\left| \mathbb{E}( g( \{X_{\sigma}\}))-\int_0^1 g(x)\,dx \right| \le \|g\|_{\infty}\,e^{-2\pi^2 \sigma^2}\sqrt{\frac{2}{1-e^{-4\pi^2 \sigma^2}}},$$

which follows from the Cauchy–Schwarz inequality $\|f_{\sigma}-1\|_{L^1} \le \|f_{\sigma}-1\|_{L^2}$ together with the geometric-series estimate $\sum_{n \ge 1} e^{-4\pi^2 n^2 \sigma^2} \le \sum_{k \ge 1} \left(e^{-4\pi^2 \sigma^2}\right)^k = \frac{e^{-4\pi^2 \sigma^2}}{1-e^{-4\pi^2 \sigma^2}}$.

The rate of convergence is of order $e^{-2\pi^2 \sigma^2}$, exponentially small in $\sigma^2$, hence extremely rapid. $\square$
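The bound can also be checked numerically. The sketch below (the $\sigma$ grid and the truncation in the wrapped-density helper `f_sigma` are arbitrary choices) computes the exact $L^1$ distance $\int_0^1 |f_{\sigma}(t)-1|\,dt$, which controls the left-hand side when $\|g\|_{\infty} \le 1$, and compares it with the stated bound:

```python
# Compare the exact L1 distance between f_sigma and 1 (by quadrature on the
# wrapped normal density) with the bound exp(-2 pi^2 sigma^2) * sqrt(2 / (1 - e^{-4 pi^2 sigma^2})).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def f_sigma(t, sigma, K=50):
    # wrapped normal density on [0, 1), truncated wrapping sum
    k = np.arange(-K, K + 1)
    return norm.pdf(t + k, scale=sigma).sum()

for sigma in (0.3, 0.5, 0.7):   # arbitrary illustrative values
    l1, _ = quad(lambda t: abs(f_sigma(t, sigma) - 1.0), 0, 1, limit=200)
    bound = np.exp(-2 * np.pi**2 * sigma**2) * np.sqrt(
        2.0 / (1.0 - np.exp(-4 * np.pi**2 * sigma**2)))
    print(f"sigma={sigma}: L1 distance = {l1:.3e}, bound = {bound:.3e}")
```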


Side note: Using a density argument (first prove the following statement for any random variable whose density function is in $\mathcal{C}^{2}_{c}$, then approximate a general density), we can deduce an even more general result.

Lemma. Let $X$ be any random variable with a density, and let $f_{\sigma}$ be the density function of the random variable $\{\sigma X\}$. Then $$\lim_{\sigma \rightarrow \infty} \int_{0}^1 |f_{\sigma}(t)-1|\,dt = 0.$$ In particular, $$ \{ \sigma X\} \xrightarrow[\sigma \rightarrow \infty]{\text{(d)}} \mathcal{U}([0,1]).$$
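To illustrate the Lemma beyond the Gaussian case, here is a sketch (the choice $X \sim \mathrm{Exp}(1)$, the sample size, the seed, and the $\sigma$ grid are all arbitrary) showing that $\{\sigma X\}$ approaches $\mathcal{U}([0,1])$ as $\sigma$ grows:

```python
# Illustration of the Lemma with a non-Gaussian X (here X ~ Exp(1)):
# the fractional part {sigma * X} approaches U(0,1) as sigma grows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)                 # arbitrary seed
x = rng.exponential(scale=1.0, size=100_000)

for sigma in (1.0, 5.0, 25.0):                 # arbitrary grid of scales
    frac = (sigma * x) % 1.0                   # fractional part {sigma X}
    ks = stats.kstest(frac, "uniform")
    print(f"sigma={sigma}: KS statistic = {ks.statistic:.4f}")
```

The KS statistic shrinks as $\sigma$ increases, consistent with the $L^1$ convergence of $f_{\sigma}$ to the uniform density.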
