Stochastic Processes – Showing Independence of Two Gaussian Processes

normal-distribution · random-variables · stochastic-processes

Say that $Z_t = (X_t, Y_t)$ is a 2-dimensional Gaussian process (by definition, this means that the random vector $(X_{t_1},Y_{t_1},\ldots,X_{t_n},Y_{t_n})$ is a Gaussian random vector for all $t_1,\ldots,t_n$).

I want to prove that if $E(X_s Y_t)=0$ for all $s,t$, then the two processes $X_t$ and $Y_t$ are independent.

For this, it seems that the following should hold: if a random variable $X$ is independent of each member of some (possibly uncountable) family of random variables $Y_t$, then $X$ is also independent of the $\sigma$-algebra generated by all the $Y_t$.

Is this necessarily true? If not, how do we prove the above statement?

Best Answer

By definition, two stochastic processes $(X_t)_{t \geq 0}$ and $(Y_t)_{t \geq 0}$ are independent if, and only if, the $\sigma$-algebras they generate,

$$\mathcal{F}^X := \sigma(X_s; s \geq 0) \quad \text{and} \quad \mathcal{F}^Y := \sigma(Y_s; s \geq 0),$$

are independent. From

$$\mathcal{F}^X = \sigma \bigg( \underbrace{\bigcup_{n \in \mathbb{N}} \bigcup_{0<t_1<\ldots<t_n} \sigma(X_{t_1},\ldots,X_{t_n})}_{=:\mathcal{G}^X} \bigg)$$

and the fact that $\mathcal{G}^X$ is a generator of $\mathcal{F}^X$ which is stable under finite intersections (and similarly for $\mathcal{G}^Y$), we conclude, by the standard $\pi$-$\lambda$ argument, that it suffices to show that the vectors $X:=(X_{t_1},\ldots,X_{t_n})$ and $Y:=(Y_{t_1},\ldots,Y_{t_n})$ are independent for any choice of times $t_1<\ldots<t_n$. (Taking the union of the two finite time sets shows that using the same times for both vectors loses no generality.) By assumption, $(X,Y)$ is a Gaussian random vector, and consequently $X$ and $Y$ are Gaussian as well. Recall that two jointly Gaussian random vectors are independent if and only if they are uncorrelated, that is, if and only if all cross-covariances between their components vanish.
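One way to see this last fact is via characteristic functions: if $(X,Y)$ is jointly Gaussian with mean $(\mu_X,\mu_Y)$ and block covariance matrix $\Sigma = \begin{pmatrix} \Sigma_X & C \\ C^\top & \Sigma_Y \end{pmatrix}$, and the cross-covariance block $C$ vanishes, then for all vectors $u,v$,

$$\mathbb{E}\, e^{i(u^\top X + v^\top Y)} = e^{i(u^\top \mu_X + v^\top \mu_Y) - \frac{1}{2}(u^\top \Sigma_X u + v^\top \Sigma_Y v)} = \mathbb{E}\, e^{i u^\top X} \cdot \mathbb{E}\, e^{i v^\top Y},$$

and factorization of the joint characteristic function is equivalent to independence.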

Thus,

$$\text{cov}(X_s,Y_t) = \mathbb{E}((X_s-\mathbb{E}X_s) \cdot (Y_t-\mathbb{E}Y_t))=0$$

implies the independence of the processes. If the processes are centered, i.e. $\mathbb{E}X_s = \mathbb{E}Y_t=0$ for all $s,t$, this covariance condition is equivalent to the assumption

$$\mathbb{E}(X_s \cdot Y_t) = 0.$$
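As a quick numerical sanity check (not a substitute for the proof), one can sample a jointly Gaussian vector whose $X$-block and $Y$-block are uncorrelated and verify empirically that events depending only on the $X$-block and only on the $Y$-block factor as independence predicts. The covariance matrix, events, and sample size below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Block-diagonal covariance: the X-block and Y-block are uncorrelated,
# mimicking cov(X_s, Y_t) = 0 for the finite-dimensional vectors above.
cov = np.array([
    [1.0, 0.5, 0.0, 0.0],   # X_{t1}
    [0.5, 1.0, 0.0, 0.0],   # X_{t2}
    [0.0, 0.0, 1.0, -0.3],  # Y_{t1}
    [0.0, 0.0, -0.3, 1.0],  # Y_{t2}
])
samples = rng.multivariate_normal(mean=np.zeros(4), cov=cov, size=200_000)
X, Y = samples[:, :2], samples[:, 2:]

# For independent vectors, P(X in A, Y in B) = P(X in A) * P(Y in B).
in_A = (X > 0.5).all(axis=1)   # event depending only on the X-block
in_B = (Y < 0.0).all(axis=1)   # event depending only on the Y-block
print("P(A and B):", (in_A & in_B).mean())
print("P(A)*P(B): ", in_A.mean() * in_B.mean())
# The two numbers agree up to Monte Carlo error, as independence predicts.
```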
