Applying Girsanov Theorem to Ornstein-Uhlenbeck Process

brownian-motion, probability-theory, stochastic-processes

Let $W_t$ be a standard Brownian motion, and $X_t$ a measurable adapted process. Girsanov's theorem says that, under certain conditions, the process $Y_t = W_t - \int_0^t X_s\,ds$ is a Brownian motion under a certain equivalent probability measure.

I want to apply Girsanov's theorem with $X_t$ an Ornstein-Uhlenbeck process defined by $dX_t = dW_t - X_t\,dt$, $X_0 = 0$. In this case we would have $Y_t = X_t$, so I would learn that an Ornstein-Uhlenbeck process can be a Brownian motion under an equivalent measure.
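For intuition, the SDE above can be simulated with a simple Euler-Maruyama scheme. This sketch (function name and parameters are my own illustrative choices, not from the question) produces one discretised path of $dX_t = dW_t - X_t\,dt$ with $X_0 = 0$:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ou(T=1.0, n=1000, x0=0.0):
    """Euler-Maruyama discretisation of dX_t = dW_t - X_t dt, X_0 = x0."""
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    dw = rng.normal(0.0, np.sqrt(dt), size=n)  # Brownian increments
    for i in range(n):
        # Euler step: X_{t+dt} = X_t - X_t dt + dW
        x[i + 1] = x[i] - x[i] * dt + dw[i]
    return x

path = simulate_ou()
print(path.shape)  # → (1001,)
```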

The condition needed for Girsanov's theorem to hold is that
$$Z_t = \exp\left(\int_0^t X_s\,dW_s - \frac{1}{2} \int_0^t X_s^2\,ds\right)$$
be a martingale.

Is this condition satisfied?
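Not a proof, but the martingale property can be sanity-checked numerically: if $Z_t$ is a martingale then $E[Z_T] = 1$. A Monte Carlo sketch (step count, path count, and horizon are arbitrary illustrative choices), simulating $X$ and $\log Z$ along the same Brownian increments:

```python
import numpy as np

rng = np.random.default_rng(1)

# If Z is a martingale, E[Z_T] = 1; estimate E[Z_T] by Monte Carlo.
T, n, n_paths = 1.0, 500, 20_000
dt = T / n
x = np.zeros(n_paths)       # OU paths, X_0 = 0
log_z = np.zeros(n_paths)   # log Z along each path
for _ in range(n):
    dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    log_z += x * dw - 0.5 * x**2 * dt   # d(log Z) = X dW - (1/2) X^2 dt
    x += -x * dt + dw                   # dX = dW - X dt
estimate = np.exp(log_z).mean()
print(estimate)  # should be close to 1
```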

A sufficient condition, due to Novikov, is that
$$E \exp\left(\frac{1}{2} \int_0^T X_s^2\,ds\right) < \infty.$$
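Purely as an illustration (a finite sample average cannot establish that an expectation is finite), here is a Monte Carlo estimate of the Novikov quantity for a small horizon $T$; all discretisation parameters are my own choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo estimate of E exp((1/2) \int_0^T X_s^2 ds) for the OU
# process dX = dW - X dt, X_0 = 0 (illustration only, not a proof of
# finiteness).
T, n, n_paths = 0.5, 500, 20_000
dt = T / n
x = np.zeros(n_paths)
half_integral = np.zeros(n_paths)    # (1/2) \int_0^t X_s^2 ds so far
for _ in range(n):
    half_integral += 0.5 * x**2 * dt
    x += -x * dt + rng.normal(0.0, np.sqrt(dt), size=n_paths)
novikov_estimate = np.exp(half_integral).mean()
print(novikov_estimate)  # modest value slightly above 1 for small T
```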
I can't see how to verify either of these conditions, though the Ornstein-Uhlenbeck process has so many nice properties that one would think something simple would work.

This question came up while studying the solution of the quantum harmonic oscillator via the Feynman-Kac formula. I am trying to understand the "ground state transformation" in terms of Girsanov's formula.

Thanks!

Best Answer

@Nate: I think your argument is fine. I didn't know Fernique's theorem; my argument for that part was the following.

Fix $T>0$ (or $T+1$, as you did) and split $[0,T]$ into subintervals $[S,S+\epsilon]$ of length $\epsilon$; by a standard corollary of Novikov's criterion, it suffices to verify Novikov's condition on each subinterval. Applying Jensen's inequality to the uniform average over $[S,S+\epsilon]$ we get: $$\exp\left(\frac{1}{2}\int_S^{S+\epsilon} X_t^2\,dt\right)=\exp\left(\frac{1}{\epsilon}\int_S^{S+\epsilon} \frac{\epsilon}{2}X_t^2\,dt\right)\leq \frac{1}{\epsilon}\int_S^{S+\epsilon} \exp\left(\frac{\epsilon}{2}X_t^2\right)\,dt \qquad \text{a.s.}$$

Now taking the expectation and using Tonelli's theorem we have to study: $$\frac{1}{\epsilon}\int_S^{S+\epsilon} E\left[\exp\left(\frac{\epsilon}{2}X_t^2\right)\right]\,dt.$$

$X_t \sim N(\mu_t=X_0e^{-t}\,,\,\,\sigma_t^2=\frac{1-e^{-2t}}{2})$ so $$E\left[\exp\left(\frac{\epsilon}{2}X_t^2\right)\right]=E\left[\exp\left(\frac{\epsilon}{2}(\mu_t+\sigma_tZ)^2\right)\right]=$$ $$=e^{\frac{\epsilon}{2}\mu_t^2}\int_{\mathbb{R}} \frac{1}{\sqrt{2 \pi}}\, \exp\left(-\frac{x^2}{2}(1-\sigma_t^2 \epsilon)+\epsilon \mu_t \sigma_t x\right)dx. $$ Setting $\lambda_t=1-\epsilon \sigma_t^2=\frac{1}{2}[2-\epsilon(1-e^{-2t})]$, if $\lambda_t>0$ (for example with $\epsilon < 1$) the last integral is convergent and its value is:$$\exp\left(\frac {\epsilon}{2}\mu_t^2\right)\,\exp\left(\frac{\epsilon^2}{2 \lambda_t}\mu_t^2 \sigma_t^2\right)\frac{1}{\sqrt{\lambda_t}}.$$
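The closed form above can be checked numerically against a direct Monte Carlo average over $Z \sim N(0,1)$; the values of $\epsilon$, $X_0$, and $t$ below are arbitrary illustrative choices with $\epsilon < 1$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Check the closed form for E[exp((eps/2) X_t^2)] with
# X_t ~ N(mu_t, sigma_t^2); eps, x0, t are illustrative choices.
eps, x0, t = 0.5, 1.0, 0.7
mu = x0 * np.exp(-t)                        # mu_t = X_0 e^{-t}
sigma2 = (1.0 - np.exp(-2.0 * t)) / 2.0     # sigma_t^2 = (1 - e^{-2t})/2
lam = 1.0 - eps * sigma2                    # lambda_t > 0 since eps < 1

closed_form = (np.exp(0.5 * eps * mu**2)
               * np.exp(eps**2 * mu**2 * sigma2 / (2.0 * lam))
               / np.sqrt(lam))

z = rng.normal(size=1_000_000)
monte_carlo = np.exp(0.5 * eps * (mu + np.sqrt(sigma2) * z) ** 2).mean()
print(closed_form, monte_carlo)  # the two values should nearly agree
```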

Finally, all the functions involved are continuous and, since $\epsilon < 1$, $\lambda_t$ is bounded away from $0$; hence the moment generating function is integrable over $[S,S+\epsilon]$.

My first idea was to use $T$ instead of $\epsilon$, but the moment generating function of a chi-squared random variable is not defined everywhere, only in a neighbourhood of $0$.