Computing the finite-dimensional marginal distributions of Brownian Bridge

brownian-motion, marginal-distribution, probability, stochastic-processes

I'm working through Le Gall's Brownian Motion, Martingales, and Stochastic Calculus, and I'm struggling with an exercise. The question concerns computing the finite-dimensional marginal distributions of a Brownian bridge. In particular, let $B_{t}$ be a Brownian motion on $[0,1]$ or $\mathbb{R}^{+}$ (it doesn't matter which), and for $t\in [0,1]$ define the Brownian bridge to be $W_t = B_t - t B_1$.

I've shown that $W_t$ is a centered Gaussian process with covariance function $K(s,t) = \min\{s,t\}- st$.
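Not part of the exercise, but this covariance can be sanity-checked numerically. The following is a sketch of a Monte Carlo check assuming NumPy; the grid points $(s,u)=(0.3,0.7)$ and the sample sizes are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 20_000, 500
dt = 1.0 / n_steps

# Simulate Brownian paths on [0, 1] and build the bridge W_t = B_t - t * B_1.
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)
t = np.arange(1, n_steps + 1) * dt
W = B - t * B[:, [-1]]

# Compare the empirical covariance at (s, u) = (0.3, 0.7) with min(s, u) - s*u.
s_idx, u_idx = 149, 349               # t[149] = 0.3, t[349] = 0.7
s, u = t[s_idx], t[u_idx]
emp = np.mean(W[:, s_idx] * W[:, u_idx])   # W is centered, so this estimates K(s, u)
theory = min(s, u) - s * u                 # = 0.3 - 0.21 = 0.09
print(emp, theory)
```

With $20{,}000$ paths the empirical value lands within a couple of standard errors of $0.09$.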

I'm now asked to prove that for $0<t_1<\cdots<t_p<1$, the law of $(W_{t_1}, \dots, W_{t_p})$ has density
$$
g(w_1, \dots, w_p) = \sqrt{2\pi} \,p_{t_1}(w_1)\,p_{t_2 - t_1}(w_2 - w_1)\,\cdots\, p_{t_p - t_{p-1}}(w_p-w_{p-1})\,p_{1-t_{p}}(-w_p),
$$

where
$$
p_{t}(w)= \frac{1}{\sqrt{2\pi t}}\exp(-w^2/2t).
$$

Solution Progress:

Attempt 1:

The density factors into a product of Gaussian densities whose variances are $t_{i}-t_{i-1}$. This makes me want to relate the vector of Brownian bridge values $(W_{t_1}, \dots, W_{t_p})$ either to the vector of Brownian motion values $(B_{t_1}, \dots, B_{t_p})$ or to the independent increments $(B_{t_1}-B_{0}, B_{t_2}-B_{t_1},\dots ,B_{t_p}-B_{t_{p-1}})$. Both of these vectors have densities that are products of individual Gaussians.

We note that
$$
\begin{pmatrix}
W_{t_p}\\
\vdots\\
W_{t_1}
\end{pmatrix}=
\begin{pmatrix}
B_{t_p}-t_{p}B_{1}\\
\vdots\\
B_{t_1}-t_{1}B_{1}
\end{pmatrix}=
\begin{pmatrix}
-t_p&1&{}&{}&{}\\
-t_{p-1}&{}&1&{}&{}\\
\vdots&{}&{}&\ddots&{}\\
-t_1&{}&{}&{}&1
\end{pmatrix}
\begin{pmatrix}
B_{1}\\
B_{t_p}\\
\vdots\\
B_{t_1}
\end{pmatrix}
$$

I would love to use a change of variables, but the issue is that I'm required to bring in the additional $B_1$ term, so the linear transformation above maps $(p+1)$-dimensional space into $p$-dimensional space. Thus the determinant isn't defined. Am I missing something simple, or is this a genuine obstruction?

Attempt 2:

Another approach I've thought of: since the density $g(w_1, \dots, w_p)$ factors into densities of differences, let's first focus on the density of $(W_1 - W_{t_p}, \dots, W_{t_2} - W_{t_1}, W_{t_1})$. We have
\begin{align*}
\begin{pmatrix}
W_{1} - W_{t_p}\\
W_{t_p}-W_{t_{p-1}}\\
\vdots\\
W_{t_2}-W_{t_1}\\
W_{t_1}
\end{pmatrix}&=
\begin{pmatrix}
(B_{1}-B_{t_p})-(1-t_p)B_{1}\\
(B_{t_p}-B_{t_{p-1}})-(t_{p}-t_{p-1})B_{1}\\
\vdots\\
(B_{t_2}-B_{t_1})-(t_2-t_1)B_{1}\\
B_{t_1} - t_{1} B_{1}
\end{pmatrix}\\
&=
\bigg\{
\begin{pmatrix}
1&{}&{}\\
{}&\ddots&{}\\
{}&{}&1
\end{pmatrix}-
\begin{pmatrix}
(1-t_p)&\cdots&(1-t_p)\\
(t_p -t_{p-1})&\cdots&(t_p - t_{p-1})\\
\vdots&{}&\vdots\\
t_{1}&\cdots&t_1
\end{pmatrix}
\bigg\}
\begin{pmatrix}
B_{1}-B_{t_p}\\
B_{t_p}-B_{t_{p-1}}\\
\vdots\\
B_{t_2}-B_{t_1}\\
B_{t_1}
\end{pmatrix}.
\end{align*}

The matrix in curly braces is equal to
$$
\begin{pmatrix}
1&{}&{}\\
{}&\ddots &{}\\
{}&{}&1
\end{pmatrix}
-
\begin{pmatrix}
1-t_p\\
\vdots\\
t_1
\end{pmatrix} \begin{pmatrix}1&\cdots&1\end{pmatrix}.
$$

This is a $(p+1)\times (p+1)$ matrix of rank $p$: the entries of the column vector above sum to $1$, so the vector itself lies in the kernel. The matrix is therefore singular, and a change of variables is again not possible.

Attempt 3:

Let us first consider the density of the increments $W_{t_1}, W_{t_2}-W_{t_1}, \dots, W_{t_p}-W_{t_{p-1}}, W_{1}-W_{t_p}$, and factor it by successively conditioning
$$
g(W_{t_1}, W_{t_2}-W_{t_1},\dots, W_{1}-W_{t_p})= g(W_{1}-W_{t_p}\mid W_{t_p}-W_{t_{p-1}},\dots, W_{t_1})\cdots g(W_{t_1}).
$$

One can compute that
\begin{align*}
g(W_{t_1})&= \sqrt{2\pi} p_{t_1}(W_{t_1})p_{1-t_1}(-W_{t_1})\\
&= \frac{\sqrt{2\pi}}{\sqrt{2\pi t_1}\sqrt{2\pi (1-t_1)}}\exp\bigg( -\frac{W_{t_1}^{2}}{2t_1}\bigg) \exp\bigg(-\frac{W_{t_1}^{2}}{2(1-t_1)}\bigg)
\end{align*}

This is promising, as it has the desired form of a product of Gaussian densities. The next step would be to compute the conditional densities $g(W_{t_2}-W_{t_1}\mid W_{t_1})$ and onwards.
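The one-dimensional computation above can also be verified directly: since $\operatorname{Var}(W_{t_1}) = t_1(1-t_1)$, the product $\sqrt{2\pi}\,p_{t_1}(w)\,p_{1-t_1}(-w)$ should coincide with the $N(0,\,t_1(1-t_1))$ density. A small Python check (the choice of $t_1$ and the test points are arbitrary):

```python
import math

def p(t, w):
    # Gaussian kernel p_t(w) = exp(-w^2 / (2t)) / sqrt(2*pi*t)
    return math.exp(-w * w / (2 * t)) / math.sqrt(2 * math.pi * t)

t1 = 0.3
max_err = 0.0
for w in (-1.0, -0.2, 0.0, 0.5, 1.3):
    lhs = math.sqrt(2 * math.pi) * p(t1, w) * p(1 - t1, -w)
    var = t1 * (1 - t1)               # Var(W_{t1}) = t1 * (1 - t1)
    rhs = math.exp(-w * w / (2 * var)) / math.sqrt(2 * math.pi * var)
    max_err = max(max_err, abs(lhs - rhs))
print(max_err)
```

The two expressions agree to floating-point precision, matching the algebra $\frac{1}{t_1}+\frac{1}{1-t_1}=\frac{1}{t_1(1-t_1)}$.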

Best Answer

The simplest approach is to use the fact that the Brownian bridge $\{W_t,t\in[0,1]\}$ has the same distribution as $\{B_t,t\in[0,1]\}$ conditioned on $B_1 = 0$, which immediately gives the required density. This, however, requires a priori knowledge of that fact (which is not too hard to prove, though).

I will use another approach, which is in a sense the reverse of this one. First note that the Brownian bridge $\{W_t = B_t - t B_1\}$ is independent of $B_1$: indeed, $\operatorname{Cov}(W_t, B_1) = \operatorname{Cov}(B_t, B_1) - t\operatorname{Var}(B_1) = t - t = 0$, and in the Gaussian case zero correlation implies independence.
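This independence can also be seen empirically. A sketch in Python (assuming NumPy; sample sizes and grid points are arbitrary choices) checks that the sample correlation between $W_t$ and $B_1$ is near zero at several times $t$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 20_000, 500
dt = 1.0 / n_steps

# Simulate Brownian paths and form the bridge W_t = B_t - t * B_1.
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)
t = np.arange(1, n_steps + 1) * dt
B1 = B[:, -1]
W = B - t * B1[:, None]

# Cov(W_t, B_1) = Cov(B_t, B_1) - t * Var(B_1) = t - t = 0 for every t.
corrs = [np.corrcoef(W[:, k], B1)[0, 1] for k in (99, 249, 399)]
print(corrs)
```

All sample correlations come out within Monte Carlo noise of zero, consistent with the covariance computation.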

Therefore,
$$
f_{W_{t_1},\dots, W_{t_p}, B_1}(w_1,\dots,w_p,x) = f_{W_{t_1},\dots,W_{t_p}}(w_1,\dots,w_p)\,f_{B_1}(x).
$$
Consequently,
$$
f_{W_{t_1},\dots,W_{t_p}}(w_1,\dots,w_p) = \frac{f_{W_{t_1},\dots, W_{t_p}, B_1}(w_1,\dots,w_p,x)}{f_{B_1}(x)}.\tag{1}
$$
Denoting $w_0 = t_0 = 0$, and noting that the linear change of variables $(w_1,\dots,w_p,x)\mapsto(w_1+t_1x,\dots,w_p+t_px,x)$ is unit-triangular (so the Jacobian is $1$),
\begin{align*}
f_{W_{t_1},\dots, W_{t_p}, B_1}(w_1,\dots,w_p,x) &= f_{B_{t_1},\dots, B_{t_p}, B_1}(w_1 + t_1 x,\dots,w_p+t_p x,\,x)\\
&= \prod_{i=1}^{p} \frac{1}{\sqrt{2\pi (t_{i} - t_{i-1})}}\exp \Bigl\{-\frac{\bigl(w_{i} - w_{i-1}+(t_{i}-t_{i-1}) x\bigr)^2}{2(t_{i}-t_{i-1})}\Bigr\}\\
&\quad\times \frac{1}{\sqrt{2\pi (1-t_p)}}\exp \Bigl\{-\frac{\bigl(x(1-t_p)-w_p\bigr)^2}{2(1-t_p)}\Bigr\}\\
&= \prod_{i=1}^{p} \frac{1}{\sqrt{2\pi(t_{i}-t_{i-1})}} \exp \Bigl\{- \frac{(w_i-w_{i-1})^2}{2(t_i-t_{i-1})} - x(w_i-w_{i-1}) - \frac{x^2(t_i-t_{i-1})}{2}\Bigr\}\\
&\quad\times \frac{1}{\sqrt{2\pi(1-t_{p})}} \exp \Bigl\{-\frac{w_p^2}{2(1-t_p)} + w_p x - \frac{x^2(1-t_p)}{2}\Bigr\}\\
&= \prod_{i=1}^{p}\frac{1}{\sqrt{2\pi(t_{i}-t_{i-1})}}\exp \Bigl\{-\frac{(w_i-w_{i-1})^2}{2(t_i-t_{i-1})}\Bigr\}
\times \frac{1}{\sqrt{2\pi(1-t_{p})}} \exp \Bigl\{-\frac{w_p^2}{2(1-t_p)} - \frac{x^2}{2}\Bigr\}\\
&= \prod_{i=1}^{p} p_{t_{i}-t_{i-1}}(w_i-w_{i-1}) \cdot p_{1-t_p}(w_p)\, e^{-x^2/2}.
\end{align*}
In the last step the cross terms in $x$ telescope: $\sum_{i=1}^p x(w_i-w_{i-1}) = x w_p$ cancels against $w_p x$, and $\sum_{i=1}^p \tfrac{x^2(t_i-t_{i-1})}{2} + \tfrac{x^2(1-t_p)}{2} = \tfrac{x^2}{2}$. Plugging into (1),
$$
f_{W_{t_1},\dots,W_{t_p}}(w_1,\dots,w_p) = \sqrt{2\pi}\prod_{i=1}^{p} p_{t_{i}-t_{i-1}}(w_i-w_{i-1}) \cdot p_{1-t_p}(w_p),
$$
as required (note $p_{1-t_p}(w_p) = p_{1-t_p}(-w_p)$ by symmetry).


One could simplify the computation: since the ratio in (1) does not depend on $x$, and everything is continuous, one can simply take $x=0$. By not doing so, we have in effect shown the independence once more.
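As a final sanity check (not part of the answer), one can verify numerically that the product formula agrees with the density of a centered Gaussian vector with covariance $K(s,t)=\min\{s,t\}-st$. A sketch in Python assuming NumPy; the time points and evaluation point are arbitrary:

```python
import numpy as np

def p(t, w):
    # Gaussian kernel p_t(w) = exp(-w^2 / (2t)) / sqrt(2*pi*t)
    return np.exp(-w**2 / (2 * t)) / np.sqrt(2 * np.pi * t)

def bridge_density(ts, ws):
    # sqrt(2*pi) * p_{t1}(w1) * prod_i p_{t_i - t_{i-1}}(w_i - w_{i-1}) * p_{1-t_p}(-w_p)
    ts = np.concatenate(([0.0], ts))
    ws = np.concatenate(([0.0], ws))
    val = np.sqrt(2 * np.pi) * p(1.0 - ts[-1], -ws[-1])
    for i in range(1, len(ts)):
        val *= p(ts[i] - ts[i - 1], ws[i] - ws[i - 1])
    return val

def gaussian_density(ts, ws):
    # Centered multivariate normal density with covariance K(s, t) = min(s, t) - s*t.
    S = np.minimum.outer(ts, ts) - np.outer(ts, ts)
    quad = ws @ np.linalg.solve(S, ws)
    return np.exp(-quad / 2) / np.sqrt((2 * np.pi) ** len(ts) * np.linalg.det(S))

ts = np.array([0.2, 0.5, 0.9])
ws = np.array([0.3, -0.4, 0.1])
lhs, rhs = bridge_density(ts, ws), gaussian_density(ts, ws)
print(lhs, rhs)
```

The two evaluations agree to floating-point precision, confirming the formula at this point.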