$\int_0^tB_s^2\ dB_s$ – Gaussian process and independent increments

brownian motion, normal distribution, stochastic-integrals, stochastic-processes

For $(B_t)_{t\ge0}$ a standard Brownian motion (Wiener process) define the stochastic process $X_t:=\int_0^tB_s^2\ dB_s$. I am currently trying to assess if $(X_t)_{t\ge0}$ is a Gaussian process and if $(X_t)_{t\ge0}$ has independent increments.

I have already shown that $X_t$ is a martingale with $\mathbb E[X_t]=0$ and $\operatorname{var}(X_t)=\mathbb E[X_t^2]=t^3$ (for the variance I used Itô's isometry). My guess is that $(X_t)_{t\ge0}$ is not a Gaussian process, so I am trying to show that the moments of $X_t$ do not match those of a Gaussian distribution. For example, if I could show that $\mathbb E[X_t^4]\ne 3\cdot \operatorname{var}(X_t)^2$, this would prove that $X_t$ is not Gaussian, but I don't know how to compute that moment. As for the increments, I don't really have a starting point either.
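To get a feel for the numbers, here is a rough Monte Carlo sanity check (a sketch only: left-endpoint Riemann sums approximate the Itô integral, and the path and step counts are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 200_000, 1_000, 1.0
dt = T / n_steps

# Left-endpoint (Ito) Riemann sums approximating X_T = int_0^T B_s^2 dB_s.
B = np.zeros(n_paths)
X = np.zeros(n_paths)
for _ in range(n_steps):
    dB = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    X += B**2 * dB          # integrand evaluated at the left endpoint
    B += dB

print("E[X_T]    ~", X.mean())                         # close to 0
print("Var(X_T)  ~", X.var(), "  (T^3 =", T**3, ")")   # close to T^3
print("kurtosis  ~", (X**4).mean() / X.var() ** 2)     # comes out well above 3
```

The mean and variance estimates match the values above, and the normalized fourth moment should land far above the Gaussian value $3$, which is exactly what I would like to prove.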

Best Answer

The process $X$ is not Gaussian and its increments are not independent.

Note first that $X$ is a Brownian martingale, hence, by the Dambis–Dubins–Schwarz theorem, a time-changed Brownian motion: $X_t=\beta_{\langle X\rangle_t}$, where $\beta$ is some Brownian motion and $\langle X\rangle_t=\int_0^tB_s^4\,\mathrm ds$. A word of caution: the Brownian motion $\beta$ produced by this theorem is in general not independent of the time change $\langle X\rangle$ (and here it is not), so one cannot compute the moments of $X_1$ by conditioning on $\alpha:=\langle X\rangle_1$ as if $X_1$ were $\sqrt{\alpha}\cdot\gamma$ with $\gamma$ standard normal independent of $\alpha$. What does hold without any independence assumption is that $E[X_1]=0$ and, by Itô's isometry, $E[X_1^2]=E[\alpha]=\int_0^1E[B_t^4]\,\mathrm dt=\int_0^13t^2\,\mathrm dt=1$.
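To see numerically that the naive conditioning fails, one can estimate $E[X_1^4]$ and compare it with the value $3E[\alpha^2]$ that the picture "$X_1=\sqrt\alpha\cdot\gamma$ with $\gamma$ independent of $\alpha$" would force, while also checking the Itô-isometry identity $E[X_1^2]=E[\alpha]$ (a rough sketch; discretization and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 400_000, 500
dt = 1.0 / n_steps

B = np.zeros(n_paths)
X = np.zeros(n_paths)      # X_1 = int_0^1 B_t^2 dB_t, left-endpoint sums
alpha = np.zeros(n_paths)  # alpha = <X>_1 = int_0^1 B_t^4 dt
for _ in range(n_steps):
    dB = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    X += B**2 * dB
    alpha += B**4 * dt
    B += dB

print("E[X_1^2] ~", (X**2).mean(), "   E[alpha]     ~", alpha.mean())
print("E[X_1^4] ~", (X**4).mean(), "   3 E[alpha^2] ~", 3 * (alpha**2).mean())
```

The first pair of estimates agrees (both are close to $1$), while the second pair differs markedly, so $X_1$ cannot be $\sqrt\alpha$ times a standard normal variable independent of $\alpha$.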

Since $E[Z^4]=3E[Z^2]^2$ for every centered normal random variable $Z$, it suffices to show that $E[X_1^4]\ne3E[X_1^2]^2=3$. Itô's formula ($\mathrm d(B_t^3/3)=B_t^2\,\mathrm dB_t+B_t\,\mathrm dt$) gives $X_1=\tfrac13B_1^3-I$ with $I:=\int_0^1B_t\,\mathrm dt$, hence
$$ E[X_1^4]=\tfrac1{81}E[B_1^{12}]-\tfrac4{27}E[B_1^9I]+\tfrac23E[B_1^6I^2]-\tfrac43E[B_1^3I^3]+E[I^4]. $$
Every term on the right is a moment of the Gaussian family $(B_t)_{t\le1}$, and Isserlis' theorem together with Fubini gives $E[B_1^{12}]=10395$, $E[B_1^9I]=\tfrac{945}2$, $E[B_1^6I^2]=\tfrac{55}2$, $E[B_1^3I^3]=\tfrac94$ and $E[I^4]=\tfrac13$, so that
$$ E[X_1^4]=\tfrac{385}3-70+\tfrac{55}3-3+\tfrac13=74\ne3. $$
Hence $X_1$ is not normal and, a fortiori, $X$ is not a Gaussian process. Alternatively: if the process $X$ were Gaussian then each increment $X_t-X_s$, being uncorrelated with every $X_u$, $u\leqslant s$ (martingale property), would be independent of $\mathcal F^X_s$; consequently $X_t^2-t^3$ and $X_t^2-\langle X\rangle_t$ would both be $\mathcal F^X$-martingales (for the latter, use the tower property and the fact, noted below, that $\langle X\rangle$ is $\mathcal F^X$-adapted), and uniqueness of the Doob–Meyer decomposition would force $\langle X\rangle_t=t^3$ almost surely; this fails because $\alpha=\int_0^1B_t^4\,\mathrm dt$ is not almost surely constant.
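The five-term sum above is easy to mis-copy, so here is the same arithmetic carried out with exact rationals; the only inputs are the moment values quoted in the text:

```python
from fractions import Fraction as F

# Gaussian moments entering E[X_1^4] = E[(B_1^3/3 - I)^4], where I = int_0^1 B_t dt,
# as evaluated above via Isserlis' theorem.
E_B12   = F(10395)       # E[B_1^12] = 11!!
E_B9_I  = F(945, 2)      # E[B_1^9 I]
E_B6_I2 = F(55, 2)       # E[B_1^6 I^2]
E_B3_I3 = F(9, 4)        # E[B_1^3 I^3]
E_I4    = F(1, 3)        # E[I^4], since I ~ N(0, 1/3)

fourth_moment = (F(1, 81) * E_B12 - F(4, 27) * E_B9_I + F(2, 3) * E_B6_I2
                 - F(4, 3) * E_B3_I3 + E_I4)
print(fourth_moment)        # 74
print(fourth_moment == 74)  # True; and 74 is not 3 = 3 * E[X_1^2]^2
```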


To study the independence of the increments of $X$, fix some $s\geqslant0$, consider the sigma-algebras $\mathcal F^X_s=\sigma(X_u;u\leqslant s)$ and $\mathcal F^B_s=\sigma(B_u;u\leqslant s)$, and the Brownian motion $C$ defined by $C_u=B_{s+u}-B_s$ for every $u\geqslant0$. Then $C$ is independent of $\mathcal F^B_s$. Furthermore, for every $t\geqslant0$,
$$ X_{t+s}=X_s+\int_0^t(B_s+C_u)^2\,\mathrm dC_u=X_s+B_s^2C_t+2B_s\int_0^tC_u\,\mathrm dC_u+\int_0^tC_u^2\,\mathrm dC_u. $$
Rewrite this as
$$ X_{t+s}-X_s=B_s^2C_t+B_sD_t+G_t, $$
where $D_t=2\int_0^tC_u\,\mathrm dC_u$ and $G_t=\int_0^tC_u^2\,\mathrm dC_u$ are functionals of $C$, hence independent of $\mathcal F^B_s$. Thus,
$$ E[(X_{t+s}-X_s)^2\mid\mathcal F^B_s]=B_s^4E[C_t^2]+B_s^2E[D_t^2]+E[G_t^2]+2B_s^3E[C_tD_t]+2B_s^2E[C_tG_t]+2B_sE[D_tG_t]. $$
One can check that $E[C_tD_t]=E[D_tG_t]=0$, $E[C_t^2]=t$, $E[D_t^2]=2t^2$, $E[G_t^2]=t^3$ and $E[C_tG_t]=\frac12t^2$, hence
$$ E[(X_{t+s}-X_s)^2\mid\mathcal F^B_s]=tB_s^4+3t^2B_s^2+t^3. $$
Note that $\mathrm d\langle X\rangle_s=B_s^4\,\mathrm ds$ and that $\langle X\rangle$ is $\mathcal F^X$-adapted, hence $B_s^4$, and every function of $B_s^4$ such as $B_s^2$, is measurable with respect to $\mathcal F^X_s$. Since $\mathcal F^X_s\subseteq\mathcal F^B_s$, the tower property yields
$$ E[(X_{t+s}-X_s)^2\mid\mathcal F^X_s]=tB_s^4+3t^2B_s^2+t^3. $$
The RHS is not almost surely constant, hence $(X_{t+s}-X_s)^2$ is not independent of $\mathcal F^X_s$; in particular, the increments of $X$ are not independent.
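For readers who prefer to see this non-constancy in a simulation rather than only in the algebra, here is a rough sketch: simulate many paths, bin them according to the value of $B_s$, and compare the empirical second moment of the increment in each bin with $tB_s^4+3t^2B_s^2+t^3$ (the discretization and the bins are arbitrary choices, and the comparison is only approximate within each bin):

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, steps_per_unit = 200_000, 500
s, t = 1.0, 1.0
dt = 1.0 / steps_per_unit

def advance(span, B, X):
    """Advance (B, X) in place over a time span with left-endpoint Ito sums."""
    for _ in range(int(round(span * steps_per_unit))):
        dB = rng.normal(0.0, np.sqrt(dt), size=B.shape)
        X += B**2 * dB
        B += dB
    return B, X

B, X = advance(s, np.zeros(n_paths), np.zeros(n_paths))   # state at time s
Bs, Xs = B.copy(), X.copy()
B, X = advance(t, B, X)                                   # state at time s + t
incr2 = (X - Xs)**2

# Conditional second moment of the increment, bin by bin, against the formula.
for lo, hi in [(-0.25, 0.25), (0.75, 1.25), (1.75, 2.25)]:
    mask = (Bs > lo) & (Bs < hi)
    b = Bs[mask].mean()
    print(f"B_s ~ {b:+.2f}:  empirical {incr2[mask].mean():6.2f}   "
          f"t*b^4 + 3*t^2*b^2 + t^3 = {t*b**4 + 3*t**2*b**2 + t**3:6.2f}")
```

The printed values increase steeply with $|B_s|$, as the formula predicts, which makes the lack of independence of the increments visible.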

Edit: One may feel that the computation of the conditional expectation of $(X_{t+s}-X_s)^2$ above is rather cumbersome (it is) and try to replace it with the (definitely simpler) computation of the conditional expectation of $X_{t+s}-X_s$ itself. Unfortunately, since $X$ is a martingale and $\mathcal F^X_s\subseteq\mathcal F^B_s$, $$ E[X_{t+s}-X_s\mid\mathcal F^X_s]=0, $$ so this computation cannot decide whether the conditional distribution of $X_{t+s}-X_s$ given $\mathcal F^X_s$ is almost surely constant or not (which is the reformulation of the independence of a random variable and a sigma-algebra that this solution relies on). Another way of looking at the situation is that, fortunately, already the conditional second moments fail to be constant.