Brownian Motion Conditional Characteristic Function

brownian-motion, probability-theory, stochastic-analysis, stochastic-calculus, stochastic-processes

I am trying to prove that a one-dimensional continuous process $B=(B_t)_{t \geq 0}$ with $B_0 =0$ is a standard Brownian motion if and only if for all $\xi \in \mathbb{R}$ and $t > s$:

$$
\mathbb{E}[e^{i\xi (B_t-B_s)}| \mathcal{F}_s] = e^{-\frac{(t-s)|\xi|^2}{2}}
$$

where $\mathcal{F}_s$ is the natural filtration for $B$.

So far I have done the following:

First direction:

Since $B$ is a Brownian motion, $(B_t-B_s) \sim N(0,t-s)$ and $(B_t-B_s)$ is independent of $\mathcal{F}_s$ by the independence of increments. Thus

$$
\mathbb{E}[e^{i\xi (B_t-B_s)}| \mathcal{F}_s] = \mathbb{E}[e^{i\xi (B_t-B_s)}]
$$

Computing the expectation directly we get

$$
\mathbb{E}[e^{i\xi (B_t-B_s)}] = \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{i \xi x} e^{-\frac{x^2}{2(t-s)}} dx
$$

Combining the exponential terms this gives

$$
\mathbb{E}[e^{i\xi (B_t-B_s)}] = \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{i \xi x -\frac{x^2}{2(t-s)}} dx
$$

$$
= \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{-\frac{1}{2(t-s)}(x^2-2i(t-s)\xi x)} dx
$$

Completing the square gives

$$
\mathbb{E}[e^{i\xi (B_t-B_s)}] = \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{-\frac{1}{2(t-s)}((x-i(t-s)\xi )^2-(i(t-s)\xi)^2)} dx
$$

$$
= \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{-\frac{1}{2(t-s)}(x-i(t-s)\xi )^2}e^{\frac{(i(t-s)\xi)^2}{2(t-s)}}dx
$$

$$
= \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{-\frac{1}{2(t-s)}(x-i(t-s)\xi )^2}e^{-\frac{(t-s)\xi^2}{2}}dx
$$

$$
= e^{-\frac{(t-s)\xi^2}{2}} \int^{\infty}_{-\infty} \frac{1}{\sqrt{2\pi (t-s)}} e^{-\frac{1}{2(t-s)}(x-i(t-s)\xi )^2}dx
$$

Thus it remains to show that the integral equals $1$. Is the justification for this that it is the integral of the PDF of a normally distributed random variable with mean $i(t-s)\xi$ and variance $t-s$? Or is this not allowed, since the mean would be complex?
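As a quick numerical sanity check (not a proof), one can integrate the shifted density on a wide real grid for arbitrary sample values of $v = t-s$ and $\xi$ and confirm the result is close to $1$:

```python
import numpy as np

# Numerical sanity check (not a proof): integrate the complex-shifted
# Gaussian density (1/sqrt(2*pi*v)) * exp(-(x - i*v*xi)^2 / (2*v))
# over a wide real grid and confirm the result is close to 1.
v, xi = 0.7, 1.3                      # arbitrary test values for t - s and xi
x = np.linspace(-30.0, 30.0, 200001)
dx = x[1] - x[0]
integrand = np.exp(-(x - 1j * v * xi) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)
integral = (integrand * dx).sum()     # Riemann-sum approximation
print(integral)                       # close to 1 + 0j
```

The grid is wide enough that the truncated tails are negligible, so the sum is extremely close to $1$.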

Now for the reverse direction, since the RHS is deterministic, we can take the expectation on both sides to get that $\mathbb{E}[\mathbb{E}[e^{i\xi (B_t-B_s)}| \mathcal{F}_s]] = \mathbb{E}[e^{i\xi (B_t-B_s)}] = e^{-\frac{(t-s)|\xi|^2}{2}}$. Thus we have that the characteristic function of $(B_t - B_s)$ is that of a Gaussian random variable of mean $0$ and variance $t-s$. Hence, $(B_t-B_s) \sim N(0,t-s)$.

Since we are given that $B$ is a continuous stochastic process, it only remains to show that it has independent increments. I'm not really sure how to prove this. My intuition is that since $\mathbb{E}[e^{i\xi (B_t-B_s)}| \mathcal{F}_s]$ is deterministic, $e^{i\xi (B_t-B_s)}$ is independent of $\mathcal{F}_s = \sigma(B_r,\ r\leq s)$. Since $e^{i\xi (B_t-B_s)}$ is a function of $(B_t-B_s)$ alone (this is the only random variable involved), it follows that $(B_t-B_s)$ is independent of $\mathcal{F}_s$, and hence of $B_r$ for all $r \leq s$. But I'm not sure this is correct / enough.

Thanks

Best Answer

As mentioned in the comments, you can use analytic continuation, or contour integration also works (it is a common example in a complex analysis class). The result is very standard, as this is just the characteristic function of a Gaussian, so you can find a proof by searching course notes (and probably this website as well).
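For a concrete illustration of this standard result, here is a small Monte Carlo sketch (the values of $s$, $t$, $\xi$ below are arbitrary) comparing the empirical characteristic function of a simulated $N(0, t-s)$ increment against $e^{-(t-s)\xi^2/2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
t, s, xi = 2.0, 0.5, 1.1        # arbitrary test values
# Draw samples of B_t - B_s ~ N(0, t - s) and estimate E[exp(i*xi*(B_t - B_s))].
increments = rng.normal(0.0, np.sqrt(t - s), size=1_000_000)
empirical = np.exp(1j * xi * increments).mean()
theoretical = np.exp(-(t - s) * xi**2 / 2)
print(empirical, theoretical)   # agree up to Monte Carlo error ~ 1/sqrt(N)
```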

For the second part, showing independent increments requires us to show that for any choice $t_0 < t_1 < \cdots < t_m,$ where $m$ is finite, if $Y_n := (B_{t_n} - B_{t_{n-1}}),$ then $(Y_1,\cdots, Y_m)$ are mutually independent. For Brownian motion, the condition you have turns out to be sufficient to establish joint normality of these $Y$s, from which we can read off independence because the covariance turns out to be diagonal. I'll execute this below.

For any $\lambda \in \mathbb{R}^m,$ by the tower rule, $$ \mathbb{E}[ \exp(i \sum_{n = 1}^m \lambda_n Y_n)] = \mathbb{E}[\mathbb{E}[\exp(i \sum_{n = 1}^m \lambda_n Y_n)|\mathscr{F}_{t_{m-1}}]].$$ Since $Y_1,\cdots, Y_{m-1}$ are $\mathscr{F}_{t_{m-1}}$-measurable, the characteristic function condition yields $$ \mathbb{E}[\exp(i \sum_{n = 1}^m \lambda_n Y_n)|\mathscr{F}_{t_{m-1}}] = \exp(i \sum_{n = 1}^{m-1} \lambda_n Y_n) \cdot \exp(- \lambda_m^2 (t_m - t_{m-1})/2),$$ and plugging back in, $$ \mathbb{E}[ \exp(i \sum_{n = 1}^m \lambda_n Y_n)] = \mathbb{E}[ \exp(i \sum_{n = 1}^{m-1} \lambda_n Y_n)] \exp(- \lambda_m^2 (t_m - t_{m-1})/2).$$ Now iterate to conclude that the characteristic function of $(Y_1, \cdots, Y_m)$ is $\exp(- \lambda^\top \Sigma \lambda/2),$ where $\Sigma = \mathrm{diag}(t_1 - t_0, t_2 - t_1, \cdots, t_m - t_{m-1}).$ This is the characteristic function of a multivariate Gaussian with covariance $\Sigma$. Since $\Sigma$ is diagonal, the $Y_n$ are uncorrelated, and since they are jointly Gaussian, they are thus mutually independent.
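The product structure above can be illustrated numerically: simulate independent Gaussian increments with variances $t_n - t_{n-1}$ and compare the empirical joint characteristic function at one $\lambda$ against $\exp(-\lambda^\top \Sigma \lambda/2)$ (the time grid and $\lambda$ below are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(1)
times = np.array([0.0, 0.4, 1.0, 1.7])        # arbitrary grid t_0 < t_1 < t_2 < t_3
dt = np.diff(times)                            # variances t_n - t_{n-1} of the Y_n
# Independent increments Y_n ~ N(0, t_n - t_{n-1}), one triple per sample path.
Y = rng.normal(0.0, np.sqrt(dt), size=(1_000_000, 3))
lam = np.array([0.8, -1.2, 0.5])               # a sample lambda in R^3
empirical = np.exp(1j * (Y @ lam)).mean()      # estimate of E[exp(i <lambda, Y>)]
theoretical = np.exp(-0.5 * lam @ np.diag(dt) @ lam)
print(empirical, theoretical)                  # agree up to Monte Carlo error
```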