Modifying the covariance of Brownian motion, what Gaussian process do we get?

probability, probability-theory, stochastic-analysis, stochastic-calculus, stochastic-processes

If $(B_t)$ is a Brownian motion, then $\operatorname{Cov}(B_t,B_s)=\min(t,s)$.

Take a Gaussian process $(X_t)$ with mean $0$ and covariance $\operatorname{Cov}(X_t,X_s)=f(\min(t,s))$ for a given function $f$ such that this covariance is still positive definite. Is $X$ related to Brownian motion?

For example, take an easy function: $f(x)=x+1$; then clearly $X_t=B_{t+1}$. More generally, if $f$ is monotone increasing, then $X_t=B_{f(t)}=\int_0^t \sqrt{f'(s)}\,dB_s$ (the last equality holds in distribution, under suitable differentiability/integrability conditions and provided $f(0)=0$).
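
As a quick sanity check of the time-change representation, here is a minimal Monte Carlo sketch (assuming Python/NumPy and the specific $f(x)=x+1$ from above, with an arbitrary grid of observation times): it simulates $X_t=B_{f(t)}$ by cumulating independent Gaussian increments and compares the empirical covariance matrix with $f(\min(t,s))$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Increasing time change from the example above; the observation times are arbitrary.
f = lambda x: x + 1.0
ts = np.array([0.2, 0.5, 0.9])
n_paths = 200_000

# Simulate B_{f(t_1)}, ..., B_{f(t_k)} by cumulating independent Gaussian increments
# with variances f(t_1), f(t_2) - f(t_1), ...  (valid because f is increasing).
us = f(ts)
du = np.diff(np.concatenate(([0.0], us)))
X = np.cumsum(rng.standard_normal((n_paths, len(us))) * np.sqrt(du), axis=1)

emp_cov = np.cov(X, rowvar=False)        # Monte Carlo estimate of Cov(X_t, X_s)
theo_cov = f(np.minimum.outer(ts, ts))   # f(min(t, s))
print(np.round(emp_cov, 3))
print(np.round(theo_cov, 3))
```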

But what happens when $f$ is not monotone? For example, $\operatorname{Cov}(X_t,X_s)=\min(t,s)\,\bigl(1-\min(t,s)\bigr)$. This is a covariance function on $[0,1]$. Can we describe $X$ using Brownian motion on $[0,1]$?

Idea: decompose $f$ into intervals on which it is increasing or decreasing, and use the above for the increasing parts. But what happens when $f$ is decreasing? In our example, that would be $f(x)=x(1-x)$ on $[0.5,1]$.

Best Answer

The function $f(x)=x(1-x)$ applied to $\min(x,y)$ is not a covariance function, since it is not positive definite on $[0.5,1]$. To see this, just calculate the determinant of the "covariance" matrix $C$ of the points $v=0.6$ and $w=0.8$. You find $C(v,v)=0.6\cdot 0.4=0.24$, $C(w,w)=0.8\cdot 0.2=0.16$ and $C(v,w)=f(0.6)=0.24$. Now $\det C = 0.24\cdot 0.16-(0.24)^2<0$.
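
For readers who prefer a numerical check, the following small sketch (Python/NumPy, chosen here purely for illustration) reproduces this $2\times 2$ matrix and confirms that its determinant, and hence one of its eigenvalues, is negative:

```python
import numpy as np

f = lambda x: x * (1.0 - x)
v, w = 0.6, 0.8

# "Covariance" matrix C(x, y) = f(min(x, y)) evaluated at the points v and w.
C = np.array([[f(min(v, v)), f(min(v, w))],
              [f(min(w, v)), f(min(w, w))]])

print(C)                      # [[0.24 0.24], [0.24 0.16]]
print(np.linalg.det(C))       # about -0.0192 < 0
print(np.linalg.eigvalsh(C))  # one eigenvalue is negative, so C is not a covariance matrix
```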

This example already gives a good idea of what is going on, or rather what is going wrong, with non-monotone $f$. In fact, I claim:

If $f$ is such that there exist points $v,w$ with $v<w$ and $f(v)>f(w)$, then $C(x,y)=f(\min(x,y))$ is not positive definite.

A first necessary condition for positive definiteness is $$C(x,x)=f(x)>0$$ for all $x$ in the domain of $f$. Assuming this holds, and noting that $C(v,w)=f(\min(v,w))=f(v)$ since $v<w$, we calculate $$C(v,v)C(w,w) - C(v,w)^2=f(v)f(w) - f(v)^2 = f(v)\bigl( f(w) - f(v)\bigr)<0$$ by the assumption on $f$. Hence the $2\times 2$ covariance matrix of $(X_v,X_w)$ has negative determinant, so $C$ cannot be positive definite.
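
The same two-point argument can be checked mechanically for any positive, non-monotone $f$. The sketch below (with $f(x)=1+\sin x$ as an arbitrary, hypothetical example of such a function) evaluates the determinant $f(v)f(w)-f(v)^2$ for two points on a decreasing stretch:

```python
import numpy as np

def descent_submatrix(f, v, w):
    """2x2 matrix [f(min(x, y))] at two points v < w with f(v) > f(w) > 0."""
    assert v < w and f(v) > f(w) > 0
    return np.array([[f(min(v, v)), f(min(v, w))],
                     [f(min(w, v)), f(min(w, w))]])

# Arbitrary positive, non-monotone example; v and w lie on a decreasing stretch of f.
f = lambda x: 1.0 + np.sin(x)
C = descent_submatrix(f, v=1.8, w=2.8)

print(np.linalg.det(C))  # equals f(v)*f(w) - f(v)**2 < 0, as in the proof above
```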