How to interpret the covariance matrix of Brownian motion

brownian-motion, normal-distribution, probability-theory, stochastic-processes

I'm reading Bernt Oksendal's "Stochastic Differential Equations".

It says that ($n$-dimensional) Brownian motion $B_t$ is a Gaussian process, i.e. for all $0 \leq t_1 \leq \cdots \leq t_k$ the random variable
$Z = (B_{t_1}, \ldots, B_{t_k}) \in \mathbb{R}^{nk}$ has a (multi)normal distribution, with mean $M = E^x[Z]$
and covariance matrix $c_{jm} = E^x[(Z_j - M_j)(Z_m - M_m)]$.

The book shows

$$M=E^x[Z]=(x, x, \cdots, x)\in \mathbb{R}^{nk}$$

and
$$C=[c_{jm}]=\begin{pmatrix} t_1 I_n & t_1 I_n & \cdots & t_1 I_n \\ t_1 I_n & t_2 I_n & \cdots & t_2 I_n\\ \vdots & \vdots & & \vdots \\ t_1 I_n & t_2 I_n & \cdots & t_k I_n \end{pmatrix}\in \mathbb{R}^{nk\times nk}$$
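
To see the block structure concretely, here is a small numerical sketch I put together (the dimension $n=2$, the time grid, and all names are my own choices, not the book's): it builds $C$ as the Kronecker product $[\min(t_j,t_m)] \otimes I_n$ and compares it with the sample covariance of simulated Brownian paths started at $x=0$.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 2                              # dimension of the Brownian motion (my choice)
t = np.array([0.5, 1.0, 2.0])      # sample times t_1 <= t_2 <= t_3 (my choice)
k = len(t)

# Theoretical covariance: block (j, m) is min(t_j, t_m) * I_n,
# i.e. C = [min(t_j, t_m)] Kronecker I_n.
C_theory = np.kron(np.minimum.outer(t, t), np.eye(n))

# Empirical check: simulate paths started at x = 0 and stack
# Z = (B_{t_1}, ..., B_{t_k}) into a vector in R^{nk}.
num_paths = 200_000
dt = np.diff(np.concatenate(([0.0], t)))             # lengths of the k increments
incr = rng.normal(scale=np.sqrt(dt)[:, None, None],  # independent Gaussian increments
                  size=(k, num_paths, n))
B = np.cumsum(incr, axis=0)                          # B_{t_1}, ..., B_{t_k} per path
Z = B.transpose(1, 0, 2).reshape(num_paths, k * n)

C_empirical = np.cov(Z, rowvar=False)
print(np.abs(C_empirical - C_theory).max())          # ~1e-2, shrinks with more paths
```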

Then the book claims that
$$E^x[(B_t-x)(B_s-x)]=n \min(s,t) \tag{1}$$
and
$$\begin{align}
E^x[(B_{t_i}-B_{t_{i-1}})(B_{t_j}-B_{t_{j-1}})]&=E^x[B_{t_i}B_{t_j}-B_{t_{i-1}}B_{t_j}-B_{t_i}B_{t_{j-1}}+B_{t_{i-1}}B_{t_{j-1}}]\\&=n(t_i-t_{i-1}-t_i+t_{i-1})=0 \quad \text{for } i<j \tag{2}
\end{align}$$

I think $(1)$ and $(2)$ should follow from the form of the covariance matrix, but I don't know how to interpret $C$ or how to get these results exactly. For example, why does $n$ appear in the results?

Best Answer

First of all, by the definition of the scalar product,

$$(B_t-x) \cdot (B_s-x) = \sum_{j=1}^n(B_t^j-x^j) (B_s^j-x^j)$$

where $x=(x^1,\ldots,x^n)$ and $B_t = (B_t^1,\ldots,B_t^n)$. Without loss of generality, $s \leq t$. Choosing $t_1 = s$, $t_2 = t$, the covariance matrix of $Z:=(B_s,B_t)$ equals

$$C= \begin{pmatrix} s I_n & s I_n \\ s I_n & t I_n \end{pmatrix}. \tag{3}$$

On the other hand, by the definition of the covariance matrix $C$,

$$c_{j,j+n} = \mathbb{E}^x((Z_j-M_j)(Z_{j+n}-M_{j+n})) = \mathbb{E}^x((B_s^j-x^j) (B_t^j-x^j))$$

for all $j=1,\ldots,n$; thus,

$$\mathbb{E}^x((B_t-x) \cdot (B_s-x)) = \sum_{j=1}^n c_{j,j+n}.$$

$(3)$ shows that $c_{j,j+n}=s$ for all $j=1,\ldots,n$; hence,

$$\mathbb{E}^x((B_t-x) \cdot (B_s-x)) = \sum_{j=1}^n s = n s = n \min\{s,t\}.$$
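
If you want to sanity-check $(1)$ numerically, here is a minimal Monte Carlo sketch (the values of $n$, $s$, $t$, $x$ and the path count are arbitrary choices of mine, not from the book):

```python
import numpy as np

rng = np.random.default_rng(1)

n, s, t = 3, 0.7, 1.5                # dimension and two times s <= t (my choices)
x = np.ones(n)                       # starting point
num_paths = 500_000

# B_s = x + sqrt(s) * N(0, I_n), and B_t = B_s + an independent increment.
Bs = x + np.sqrt(s) * rng.standard_normal((num_paths, n))
Bt = Bs + np.sqrt(t - s) * rng.standard_normal((num_paths, n))

# Average of the scalar product (B_t - x) . (B_s - x) over all paths.
estimate = np.mean(np.sum((Bt - x) * (Bs - x), axis=1))
print(estimate, n * min(s, t))       # both should be close to n*min(s,t) = 2.1
```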


Concerning $(2)$: Note that we can rewrite $(2)$ in the following way:

$$\mathbb{E}^x \bigg[ (B_{t_i}-B_{t_{i-1}})(B_{t_j}-B_{t_{j-1}})\bigg] = \mathbb{E}^x \bigg[((B_{t_i}-x)-(B_{t_{i-1}}-x))((B_{t_j}-x)-(B_{t_{j-1}}-x)) \bigg] $$

Expand the terms and use $(1)$.
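
Concretely, for $i < j$ (so that $t_{i-1} \leq t_i \leq t_{j-1} \leq t_j$), applying $(1)$ to each of the four expanded terms gives

$$n\bigl(\min(t_i,t_j)-\min(t_{i-1},t_j)-\min(t_i,t_{j-1})+\min(t_{i-1},t_{j-1})\bigr)=n(t_i-t_{i-1}-t_i+t_{i-1})=0.$$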