Probability Theory – Gaussian Process with Independent Increments

brownian-motion, continuity, normal-distribution, probability-theory, stochastic-processes

Suppose that we have a continuous Gaussian process $(X_t)_{t \ge 0}$ with independent increments and $X_0=0$. If the increments are also identically distributed, meaning that $X_b-X_a \stackrel{D}{=} X_t-X_s$ whenever $b-a=t-s$, then $\mathbb{E}[X_t]=ct$ and $Cov(X_t,X_s)=\sigma (s \wedge t)$, where $c$ and $\sigma$ are constants.
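As a quick numerical sanity check (not a proof), one can simulate a process satisfying the hypotheses, e.g. Brownian motion with drift $c$ and variance rate $\sigma$, and compare the empirical mean and covariance with $ct$ and $\sigma(s \wedge t)$. The sketch below assumes numpy is available; the parameter values and time grid are arbitrary.

```python
import numpy as np

# Sanity check of E[X_t] = c*t and Cov(X_t, X_s) = sigma*min(s, t)
# using Brownian motion with drift c and variance rate sigma, which has
# independent, stationary Gaussian increments and X_0 = 0.
rng = np.random.default_rng(0)

c, sigma = 1.5, 2.0                   # drift and variance rate (arbitrary choices)
dt, n_steps, n_paths = 0.01, 200, 50_000
t = dt * np.arange(1, n_steps + 1)

# Independent Gaussian increments: mean c*dt, variance sigma*dt
incr = rng.normal(loc=c * dt, scale=np.sqrt(sigma * dt), size=(n_paths, n_steps))
X = np.cumsum(incr, axis=1)           # X_t along each path (X_0 = 0 omitted)

s_idx, t_idx = 49, 149                # corresponds to s = 0.5, t = 1.5
emp_mean = X[:, t_idx].mean()
emp_cov = np.cov(X[:, t_idx], X[:, s_idx])[0, 1]

print(f"E[X_t]:       empirical {emp_mean:.3f}  vs  c*t = {c * t[t_idx]:.3f}")
print(f"Cov(X_t,X_s): empirical {emp_cov:.3f}  vs  sigma*min(s,t) = {sigma * t[s_idx]:.3f}")
```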


Let $a(t)=\mathbb{E}[X_t]$. Since $X_0=0$ and the increments are identically distributed, $a(t+s)=\mathbb{E}[X_{t+s}-X_s]+\mathbb{E}[X_s]=a(t)+a(s)$ for all $s,t \ge 0$. Also, notice that if $s<t$, then, since $X_t-X_s$ is independent of $X_s-X_0$, $Cov(X_t,X_s)=Cov((X_t-X_s)+(X_s-X_0),X_s-X_0)=Var(X_s)$. Putting $b(t)=Var(X_t)$, independence of the increments gives $b(x+y)=b(x)+b(y)$ for all $x,y \ge 0$. So, if we prove that $a(t)$ and $b(t)$ are continuous functions, we are done, since every continuous function $f$ which satisfies the identity $f(u+v)=f(u)+f(v)$ for all $u, v \ge 0$ is linear.
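Writing out the additivity of $b$ explicitly, using that $X_{x+y}-X_y$ is independent of $X_y-X_0$ and distributed as $X_x$:

$$b(x+y)=Var\big((X_{x+y}-X_y)+(X_y-X_0)\big)=Var(X_{x+y}-X_y)+Var(X_y)=b(x)+b(y)$$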

So, how can I prove that $\mathbb{E}[X_t]$ and $Var(X_t)$ are continuous functions of $t$?

Best Answer

Let $t_0 \ge 0$. Since the trajectories are continuous, we have $\lim \limits_{t \to t_0} X_t(\omega)=X_{t_0}(\omega)$ for every $\omega$, and pointwise (indeed almost sure) convergence implies convergence in distribution. By Lévy's continuity theorem, for every sequence $(t_k)_{k \ge 1}$ such that $\lim \limits_{k \to \infty}t_k=t_0$, one has

$$\varphi_{X_{t_k}} \to \varphi_{X_{t_0}} \iff X_{t_k} \stackrel{D}{\to} X_{t_0}$$

Then, since $X_t$ is normally distributed for all $t \ge 0$, it follows that $\mathbb{E}[X_{t_k}]\to \mathbb{E}[X_{t_0}]$ and $Var(X_{t_k})\to Var(X_{t_0})$. Since this holds for every sequence $(t_k)_{k \ge 1}$ with $\lim \limits_{k \to \infty}t_k=t_0$, the functions given by $a(t)=\mathbb{E}[X_t]$ and $b(t)=Var(X_t)$ are continuous in $t$.
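To spell out the last step: $X_{t_k}\sim N(a(t_k),b(t_k))$, so the convergence of characteristic functions reads

$$\exp\Big(i\,a(t_k)u-\tfrac{1}{2}b(t_k)u^2\Big) \to \exp\Big(i\,a(t_0)u-\tfrac{1}{2}b(t_0)u^2\Big) \quad \text{for all } u \in \mathbb{R}.$$

Taking absolute values gives $b(t_k)\to b(t_0)$; then $e^{i\,a(t_k)u}\to e^{i\,a(t_0)u}$ for all $u$, which forces $a(t_k)\to a(t_0)$ (a standard argument, e.g. by integrating in $u$ over a small interval).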
