[Math] If $B(t)$ is Brownian motion then prove $W(t)$ defined as follows is also Brownian motion

Tags: brownian-motion, probability-theory, self-learning, stochastic-processes

Let $B(t)$ be a standard Brownian motion on $[0,1]$.

Define $W(t)$ as follows

$W(t) = B(t) - \int_0^t \frac{B(1)-B(s)}{1-s} \, ds$

Prove $W(t)$ is also Brownian motion

So I'm not sure how to deal with the integral here. In order to show that $W(t)$, too, is Brownian motion, I think I would need to:

  1. Make an argument that the transformation is linear and hence also Gaussian
  2. Show that $E[W(t_1) - W(t_2)] = 0$ for any $t_1 > t_2 > 0$
  3. Show that the variance of $W(t_1) - W(t_2)$ is $t_1 - t_2$, or equivalently compute the covariance function
  4. Finally, argue that increments over disjoint intervals are independent. Though I think this would follow from the fact that $B(t)$ is Brownian motion with independent increments.

But yeah, how do I deal with the integral?

Best Answer

Recall the following characterization of (one-dimensional) Brownian motion

A stochastic process $(W_t)_{t \geq 0}$ is a Brownian motion, if and only if,

  1. $(W_t)_t$ has continuous sample paths.
  2. $(W_t)_t$ is a Gaussian process with mean $0$ and covariance $\mathbb{E}(W_s W_t) = \min\{s,t\}$ for all $s,t \geq 0$.

As $(W_t)_t$ obviously has continuous sample paths (the integrand is continuous in $s$ on $[0,1)$, so the integral is continuous in its upper limit $t$), we just have to check the second property.

Since $(B_t)_{t \geq 0}$ is a Brownian motion, it is in particular a Gaussian process and so

$$B_t - \sum_{j=0}^{n-1} \frac{B_1-B_{t_j}}{1-t_j} \, (t_{j+1}-t_j)$$

is Gaussian for each $n \in \mathbb{N}$, where $t_j := \frac{jt}{n}$, since it is a linear combination of the jointly Gaussian random variables $B_t, B_1, B_{t_0}, \ldots, B_{t_{n-1}}$. If we let $n \to \infty$, then we get

$$W_t = \lim_{n \to \infty} \left( B_t - \sum_{j=0}^{n-1} \frac{B_1-B_{t_j}}{1-t_j} \, (t_{j+1}-t_j) \right)$$

is Gaussian, since an (almost sure, hence distributional) limit of Gaussian random variables is again Gaussian; this can be checked via the convergence of the characteristic functions. Since this argument applies in exactly the same way to the joint distributions $(W_{s_1},\ldots,W_{s_m})$, where $s_j \geq 0$, we get that $(W_t)_{t \geq 0}$ is a Gaussian process. It remains to check the mean and covariance.

By Fubini's theorem, we have

$$\begin{align*} \mathbb{E}(W_t) &= \underbrace{\mathbb{E}(B_t)}_{0} - \mathbb{E} \left( \int_0^t\frac{B_1-B_s}{1-s} \, ds \right) = - \int_0^t \underbrace{\mathbb{E}(B_1-B_s)}_{0} \frac{1}{1-s} \, ds = 0. \end{align*}$$

Now fix $r \leq t$.

$$\begin{align*} \mathbb{E}(W_r W_t) &= \mathbb{E}(B_t B_r)- \mathbb{E} \left( B_t \int_0^r \frac{B_1-B_s}{1-s} \, ds \right) - \mathbb{E} \left( B_r \int_0^t \frac{B_1-B_s}{1-s} \, ds \right) \\ &\quad + \mathbb{E} \left( \int_0^t \int_0^r \frac{B_1-B_u}{1-u} \frac{B_1-B_v}{1-v} \, du \, dv \right) \\ &=: \mathbb{E}(B_r B_t) +I_2+I_3+I_4 \end{align*}$$

If we can show that $$I_2+I_3+I_4 = 0$$ we are done. Using $\mathbb{E}(B_u B_v) = \min\{u,v\}$ for any $u,v \in [0,1]$ and Fubini's theorem, we find

$$ \begin{align*} I_2 &= -\int_0^r \frac{\mathbb{E}(B_1 B_t-B_tB_s)}{1-s} \, ds = -\int_0^r \frac{t-s}{1-s} \, ds \\ &= \log (1-r) t - r - \log(1-r), \end{align*}$$

using $\mathbb{E}(B_1 B_t) = t$ and $\mathbb{E}(B_t B_s) = s$ for $s \leq r \leq t$. Similarly,

$$\begin{align*} I_3 &= -\int_0^t \frac{r- \min\{r,s\}}{1-s} \, ds = -\int_0^r \frac{r-s}{1-s} \, ds - \int_r^t \underbrace{\frac{r-r}{1-s}}_{0} \, ds = -\int_0^r \frac{r-s}{1-s} \, ds \\ &= -(1-\log(1-r)) r - \log(1-r) \end{align*}$$
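Both computations boil down to the same elementary integral. Spelled out (using the decomposition $\frac{t-s}{1-s} = 1 + \frac{t-1}{1-s}$):

```latex
\int_0^r \frac{t-s}{1-s} \, ds
  = \int_0^r \left( 1 + \frac{t-1}{1-s} \right) ds
  = r + (t-1)\Bigl[-\log(1-s)\Bigr]_0^r
  = r - (t-1)\log(1-r)
  = -t\log(1-r) + r + \log(1-r).
```

Setting $t = r$ gives the integral appearing in $I_3$, namely $\int_0^r \frac{r-s}{1-s}\,ds = (1-\log(1-r))r + \log(1-r)$.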

and, finally,

$$\begin{align*} I_4 &= \int_0^t \int_0^r \frac{1-v-u+ \min\{u,v\}}{(1-u)(1-v)} \, du \, dv \\ &= \int_r^t \int_0^r \frac{1-v-u+ u}{(1-u)(1-v)} \, du \, dv + \int_0^r \int_0^r \frac{1-v-u+ \min\{u,v\}}{(1-u)(1-v)} \, du \, dv \\ &= (t-r) \int_0^r \frac{1}{1-u} \, du + 2 \int_0^r \int_v^r \frac{1}{1-v} \, du \, dv\\ &= -(t-r) \log(1-r) + 2 ((1-\log(1-r))r + \log(1-r)) \end{align*}$$

where we have used in the penultimate equation that

$$\begin{align*} \int_0^r \int_0^r \frac{1-v-u+ \min\{u,v\}}{(1-u)(1-v)} \, du \, dv &= \int_0^r \int_0^v \frac{1}{1-u} \, du \, dv + \int_0^r \int_v^r \frac{1}{1-v} \, du \, dv \\ &= \int_0^r \int_u^r \frac{1}{1-u} \, dv \, du + \int_0^r \int_v^r \frac{1}{1-v} \, du \, dv \\ &= 2 \int_0^r \int_v^r \frac{1}{1-v} \, du \, dv, \end{align*}$$

where we split the square $[0,r]^2$ into $\{u < v\}$ and $\{u \geq v\}$, applied Fubini's theorem to the first integral, and then renamed its integration variables $u \leftrightarrow v$.

Adding everything up, we get $I_2+I_3+I_4 = 0$, hence $\mathbb{E}(W_r W_t) = \mathbb{E}(B_r B_t) = r = \min\{r,t\}$, and this finishes the proof.
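Not part of the proof, but as a sanity check one can verify the covariance identity numerically: simulate $B$ on a grid, approximate the integral defining $W$ by a left-endpoint Riemann sum, and estimate $\mathrm{Cov}(W_r, W_t)$ by Monte Carlo. This is only an illustrative sketch; the helper name `simulate_cov` and the grid/path-count choices are arbitrary.

```python
import numpy as np

def simulate_cov(r, t, n_paths=20_000, n_steps=400, seed=42):
    """Monte Carlo estimate of Cov(W_r, W_t), where
    W_t = B_t - int_0^t (B_1 - B_s)/(1 - s) ds
    and the integral is approximated by a left-endpoint Riemann sum."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    # Brownian paths on the grid 0, dt, 2*dt, ..., 1 (n_steps + 1 points).
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])
    B1 = B[:, -1:]  # terminal value B_1, one per path

    def W(u):
        k = int(round(u / dt))   # grid index of time u
        s = np.arange(k) * dt    # left endpoints s_0, ..., s_{k-1} < u < 1
        integral = ((B1 - B[:, :k]) / (1.0 - s)).sum(axis=1) * dt
        return B[:, k] - integral

    Wr, Wt = W(r), W(t)
    # Empirical covariance (both means should be ~0).
    return float(np.mean(Wr * Wt) - np.mean(Wr) * np.mean(Wt))

# If W is Brownian motion, Cov(W_r, W_t) = min(r, t):
# simulate_cov(0.3, 0.6) should be close to 0.3.
```

With the default settings the Monte Carlo error is on the order of a few thousandths, so the estimate lands well within a few percent of $\min\{r,t\}$.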