Generating a ‘path’ of uniformly distributed continuous random variables

Tags: monte-carlo, probability-distributions, stochastic-processes, uniform-distribution

Let $X$ and $Y$ be independent continuous random variables, both uniformly distributed on $[0,1]$. Is there a function $F:[0,1]^{3}\rightarrow [0,1]$, written $F(t;X,Y)$, satisfying:

  1. $F$ is continuous in $t$, with $F(0)=X$ and $F(1)=Y$;
  2. For any $t_0\in [0,1]$, $F(t_0;X,Y)$ is uniformly distributed?

In a sense, this is a path in the space of uniformly distributed measurable functions. Note that each individual condition is easily achievable on its own:

Example 1. Let $F(t;X,Y)=(1-t) X+tY$. It is obvious that $F$ is continuous in $t$, with $F(0)=X$ and $F(1)=Y$. However, for $t_0\neq 0,1$, $F(t_0;X,Y)$ is not uniformly distributed, but rather trapezoidally distributed on $[0,1]$.
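A quick simulation makes the failure in Example 1 concrete. The sketch below (using NumPy; the variable names are mine) checks the variance at $t=1/2$: a Uniform$[0,1]$ variable has variance $1/12\approx 0.083$, while the average of two independent uniforms has variance $1/24\approx 0.042$, so $F(1/2;X,Y)$ cannot be uniform.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)

t = 0.5
F = (1 - t) * X + t * Y  # linear interpolation at t = 1/2

# Uniform[0,1] has variance 1/12 ~ 0.083; the sample variance here
# should instead be near 1/24 ~ 0.042, exposing the non-uniformity.
print(F.var())
```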

Example 2. Let $m(t)=\max(1-t,t)$ and $$F(t;X,Y)=\frac{\big((1-t)X+tY\big)\ \mathrm{mod}\ m(t)}{m(t)}.$$ For any $t_0\in [0,1]$, $F(t_0;X,Y)$ is uniformly distributed on $[0,1]$. Heuristically, the probability density function of $(1-t)X+tY$ is trapezoidal, and the modulus wraps the two triangular ramps of the density onto each other; since the ramps are complementary, they sum to the constant height of the flat part, yielding a uniform density. However, $F$ is typically not continuous in $t$ because of the modulus.
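The wrapping construction of Example 2 is easy to verify empirically. This illustrative sketch (my own function names, using NumPy and SciPy) applies a Kolmogorov–Smirnov test against the uniform CDF at several values of $t$; since the wrapped variable is exactly uniform, the test should not reject.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000
X = rng.uniform(size=n)
Y = rng.uniform(size=n)

def F(t, X, Y):
    # Wrap the trapezoidal sum modulo m = max(1-t, t), then rescale to [0,1].
    m = max(1 - t, t)
    return ((1 - t) * X + t * Y) % m / m

# At each fixed t the wrapped variable should be Uniform[0,1].
for t in (0.2, 0.5, 0.8):
    print(t, stats.kstest(F(t, X, Y), "uniform").pvalue)
```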

Motivation: Here $F=F(t)$ is a parameter of a Monte Carlo simulation, and $t$ is a deterministic input chosen by the user. It is known that $F$ should be uniformly distributed at each input $t$, and that its values should be independent for inputs far enough apart, but it is not clear how the values should be correlated otherwise. One would also expect $F$ to be continuous in $t$ (or at least $\epsilon$-close to a continuous function for some reasonably small $\epsilon$). The question posed to me is whether there is a sensible way to use draws for $F(0)$ and $F(1)$ to deterministically populate $F(t)$ for other choices of $t$ so that each value has the correct distribution. It seemed to me an obvious and interesting question with no easy answer, but I am not an expert in probability theory.

Best Answer

This is possible!

I start with a Gaussian process (GP). This is simpler because a GP is uniquely determined by its mean function and covariance function.

The GP is a function $G:\Omega\times [0,1]\rightarrow\mathbb{R}$, $(\omega,t)\mapsto G_t(\omega)$, such that for every choice of $t_1,\ldots,t_n$ the random variables $G_{t_1},\ldots,G_{t_n}$ have a joint Gaussian distribution. Choose the mean function of $G$ to be zero and define the covariance as $$ \operatorname{Cov}[G_t,G_s]=\max\left(0,\ 1-2\lvert t-s\rvert\right).$$

Note that $\operatorname{Cov}[G_0,G_1]=\max(0,1-2)=0$, i.e. $G_0$ and $G_1$ are uncorrelated. Since they are jointly Gaussian, they are also independent. Moreover, each sample path $t\mapsto G_t(\omega)$ is continuous; this can be read off from the behavior of the covariance function near the diagonal (see the Wikipedia article on Gaussian processes).

Finally, this Gaussian process must be converted into one with uniform marginals. This is done with the standard Gaussian cumulative distribution function $\Phi$: simply set $U_t=\Phi(G_t)$. Since each $G_t$ has variance $\operatorname{Cov}[G_t,G_t]=1$, it is standard normal, so each marginal $U_t$ is uniform on $[0,1]$ by the probability integral transform, and $t\mapsto U_t$ remains continuous because $\Phi$ is continuous. The independence of $U_0$ and $U_1$ is preserved as well, since $\Phi$ is applied pointwise.
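The whole construction can be sketched numerically. Below is a minimal illustration (my own choices of grid size and jitter, using NumPy and SciPy): sample the zero-mean GP with the triangular covariance above on a grid via a Cholesky factorization, then push the path through $\Phi$ to get uniform marginals. A small diagonal jitter is added for numerical stability of the factorization.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Triangular covariance from the answer: Cov[G_t, G_s] = max(0, 1 - 2|t - s|).
def cov(t, s):
    return np.maximum(0.0, 1.0 - 2.0 * np.abs(t[:, None] - s[None, :]))

ts = np.linspace(0.0, 1.0, 101)            # grid of inputs t
K = cov(ts, ts) + 1e-8 * np.eye(len(ts))   # jitter for numerical stability
L = np.linalg.cholesky(K)

G = L @ rng.standard_normal(len(ts))       # one zero-mean GP sample path
U = norm.cdf(G)                            # U_t = Phi(G_t): uniform marginals

# K[0, -1] = max(0, 1 - 2*1) = 0, so G_0 and G_1 (hence U_0 and U_1)
# are independent; each U_t lies in [0, 1].
print(U[0], U[-1])
```

Averaging `U` over many sampled paths at any fixed `t` gives an empirical mean near $1/2$ and variance near $1/12$, as the uniform marginals require.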