SDE – Understanding Blow Up Limits

stochastic-calculus, stochastic-differential-equations, stochastic-processes

Let $W$ be a one dimensional standard Brownian motion, and let $X$ be the solution to the SDE

$$dX_t = \sigma(X_t) \, dW_t \, , \, X_0 = 0$$

with $\sigma: \mathbb R \to \mathbb R$ Lipschitz continuous.

For each $c > 0$, define the process $Y^c$ on $[0, 1]$ by

$$Y^c_t := c^{-1/2} X_{ct}$$

Question: Is it true that as $c \to 0$, $Y^c$ converges in law to $\sigma(0) B_t$ for a Brownian motion $B$?
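For what it's worth, a quick Monte Carlo experiment suggests the answer should be yes. The sketch below is only an illustration, not a proof: it runs Euler–Maruyama for a hypothetical Lipschitz coefficient $\sigma(x) = 2 + \sin x$ (so $\sigma(0)^2 = 4$) and estimates $\operatorname{Var}(Y^c_1) = c^{-1}\operatorname{Var}(X_c)$, which should approach $\sigma(0)^2$ as $c \to 0$:

```python
import numpy as np

def simulate_X(c, n_paths=20000, n_steps=400, seed=0):
    """Euler-Maruyama for dX = sigma(X) dB on [0, c]; returns X_c for each path."""
    rng = np.random.default_rng(seed)
    dt = c / n_steps
    x = np.zeros(n_paths)
    for _ in range(n_steps):
        dB = rng.normal(0.0, np.sqrt(dt), n_paths)
        x += (2.0 + np.sin(x)) * dB  # sigma(x) = 2 + sin(x), Lipschitz with K = 1
    return x

c = 0.01
var_Y1 = simulate_X(c).var() / c  # Var(Y^c_1) = Var(X_c) / c
print(f"Var(Y^c_1) = {var_Y1:.3f}  (limit sigma(0)^2 = 4)")
```

With $c = 0.01$ the estimate lands close to $4$, consistent with $Y^c_1$ being approximately $\sigma(0)B_1$ in law.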

Best Answer

The answer to your question is yes, at least for the one-dimensional marginals. I'm confident that the convergence you want actually holds at the level of stochastic processes (i.e. $c^{-1/2}X_{ct}\xrightarrow{(d)}\sigma(0)B_t$ as functions on $C[0,T]$ for any $T>0$), but this requires some additional work. Here's the argument that proves convergence in distribution of the one-dimensional marginals.

Suppose $(X_{t})_{t\geq{0}}$ solves the stochastic differential equation $$ dX_{t}=\sigma(X_{t})dB_{t}, \hspace{5pt} X_{0}=0 $$ where $\sigma$ is Lipschitz with Lipschitz constant $K>0$; that is, $|\sigma(x)-\sigma(y)|\leq{K|x-y|}$ for all $x,y\in{\mathbb{R}}$. (I write $B$ for the driving Brownian motion $W$ from the question.) First, we will show that $\sup_{t\in{[0,T]}}\mathbb{E}X_{t}^{2}<\infty$ for all $T>0$.

To this effect, consider the stopping time $T_{n}:=\inf\{t\geq{0}:|X_{t}|=n\}$. For each $n\in{\mathbb{N}}$ we define the function $f_{n}:[0,\infty)\rightarrow{\mathbb{R}}$ by $$ f_{n}(t)=\mathbb{E}X_{t\wedge{T_{n}}}^{2} $$ Since $|X_{t\wedge{T_{n}}}|\leq{n}$ by the definition of $T_{n}$, $|f_{n}(t)|\leq{n^{2}}$ for all $t>0$. By Itô's formula applied to $X_{t}^{2}$, $$ X_{t\wedge{T_{n}}}^{2}=2\int_{0}^{t\wedge{T_{n}}}\sigma(X_{s})X_{s}dB_{s}+\int_{0}^{t\wedge{T_{n}}}\sigma^{2}(X_{s})ds $$
Taking expectations we have that: $$ \mathbb{E}X_{t\wedge{T_{n}}}^{2}=\mathbb{E}\int_{0}^{t\wedge{T_{n}}}\sigma^{2}(X_{s})ds\leq{\int_{0}^{t}\mathbb{E}\sigma^{2}(X_{s\wedge{T_{n}}})ds} $$
Using the fact that $\sigma$ is Lipschitz we have that for any $x\in{\mathbb{R}}$, $|\sigma(x)|\leq{|\sigma(0)|+K|x|}$. This in turn implies that $\sigma^{2}(x)\leq{2\sigma^{2}(0)+2K^{2}|x|^{2}}$. Applying this to the inequality above gives us: $$ \mathbb{E}X_{t\wedge{T_{n}}}^{2}\leq{\int_{0}^{t}\big(2\sigma^{2}(0)+2K^{2}\mathbb{E}X_{s\wedge{T_{n}}}^{2}\big)ds}=2\sigma^{2}(0)t+2K^{2}\int_{0}^{t}\mathbb{E}X_{s\wedge{T_{n}}}^{2}ds $$ In other words, for all $n\in{\mathbb{N}}$ and $t\in{[0,\infty)}$, $f_{n}$ satisfies the inequality $$ f_{n}(t)\leq{2\sigma^{2}(0)t+2K^{2}\int_{0}^{t}f_{n}(s)ds} $$ By a Gronwall-type argument (compare $f_{n}$ with the solution of the ODE $g'(t)=2\sigma^{2}(0)+2K^{2}g(t)$, $g(0)=0$), it follows that for all $n\in{\mathbb{N}}$ and $t\in{[0,\infty)}$ we have $$ f_{n}(t)\leq{\frac{\sigma^{2}(0)}{K^{2}}\left(e^{2K^{2}t}-1\right)} $$ By Fatou's lemma, $$ \mathbb{E}X_{t}^{2}\leq{\liminf_{n\rightarrow{\infty}}\mathbb{E}X_{t\wedge{T_{n}}}^{2}}\leq{\frac{\sigma^{2}(0)}{K^{2}}\left(e^{2K^{2}t}-1\right)} $$ Hence, $\mathbb{E}X_{t}^{2}\leq{\frac{\sigma^{2}(0)}{K^{2}}\left(e^{2K^{2}t}-1\right)}<\infty$ for all $t>0$.

From here, the proof is routine. We're interested in the process $Y^{c}_{t}=c^{-1/2}X_{ct}$ where $c>0$ is small. We have $$ X_{ct}=\int_{0}^{ct}\sigma(X_{s})dB_{s} $$ By Brownian scaling, $(c^{-1/2}B_{ct})_{t\geq{0}}\overset{d}{=}(\widetilde{B}_{t})_{t\geq{0}}$ where $(\widetilde{B}_{t})_{t\geq{0}}$ is a Brownian motion. Hence, "changing variables" in the stochastic integral above, we have $$ X_{ct}=c^{1/2}\int_{0}^{t}\sigma(X_{cu})d\widetilde{B}_{u} $$ Fix $\varepsilon>0$.
Then: \begin{align*} \mathbb{E}(c^{-1/2}X_{ct}-\sigma(0)\widetilde{B}_{t})^{2}&=\mathbb{E}\left(\int_{0}^{t}\left(\sigma(X_{cu})-\sigma(0)\right)d\widetilde{B}_{u}\right)^{2}=\int_{0}^{t}\mathbb{E}(\sigma(X_{cu})-\sigma(0))^{2}du \\ &\leq{K^{2}\int_{0}^{t}\mathbb{E}X_{cu}^{2}du} \leq{K^{2}\varepsilon^{2}t+K^{2}\int_{0}^{t}\mathbb{E}\Big(X_{cu}^{2}1_{(|X_{s}|>\varepsilon \hspace{2pt} \text{for some $s\in{[0,ct]}$})}\Big)du} \end{align*} Since the process $(X_{t})_{t\geq{0}}$ has continuous paths starting at $0$, the probability of the event $\{|X_{s}|>\varepsilon \text{ for some } s\in{[0,ct]}\}$ tends to $0$ as $c\to0$; combined with the square integrability of $X$ (by Cauchy–Schwarz and the fourth-moment bound obtained below by the same Gronwall argument), this shows that the second term on the last line goes to $0$ as $c$ tends to $0$. Since $\varepsilon>0$ is arbitrary, we conclude that for any $t>0$, $$ c^{-1/2}X_{ct}\xrightarrow{(d)}\sigma(0)B_{t} \hspace{5pt} \text{as $c\rightarrow{0}$} $$

Bonus: It turns out "upgrading" our argument to prove that $$ c^{-1/2}X_{ct}\xrightarrow{(d)}\sigma(0)B_t \hspace{5pt} \text{as functions on $C[0,T]$ for all $T>0$} $$
is not that hard. Observe that it suffices to prove two things:

  1. Convergence of the finite-dimensional marginals of the processes $(c^{-1/2}X_{ct})_{t\geq{0}}$ to those of $(\sigma(0)B_{t})_{t\geq{0}}$.

  2. Precompactness of the measures on $C[0,T]$ induced by the processes $(c^{-1/2}X_{ct})_{t\geq{0}}$.

The argument for (1) is analogous to the one-dimensional case that I've written out above, so I won't bother writing it out in detail. For precompactness, we make use of a neat theorem from Billingsley's "Convergence of Probability Measures." Namely, Theorem 8.3 of this book gives a condition for a collection of measures $(\mathbb{P}_{n})_{n\geq{1}}$ on $C[0,T]$ to be precompact:

Theorem: A sequence of measures $(\mathbb{P}_{n})_{n\geq{1}}$ on $C[0,T]$ is tight if:

  1. For any $\varepsilon>0$ there exist $M>0$ and $n_{0}\in{\mathbb{N}}$ such that for all $n\geq{n_{0}}$, $$ \mathbb{P}_{n}(\{x:|x(0)|\geq{M}\})\leq{\varepsilon} $$
  2. For all $\varepsilon>0$ there exist $\delta>0$ and $n_{0}\in{\mathbb{N}}$ such that for all $n\geq{n_{0}}$, $$ \delta^{-1}\mathbb{P}_{n}(\{x:\max\limits_{t\leq{s}\leq{t+\delta}}|x(s)-x(t)|\geq{\varepsilon}\})\leq{\varepsilon} $$ for every $t\in{[0,T]}$.

Observe that the processes $Y^{c}_{t}=c^{-1/2}X_{ct}$ are all $0$ at $0$, so the first criterion is satisfied. For the second criterion, repeating our Gronwall-type argument for the 4th moment in place of the 2nd moment tells us that $$ C(T):=\max_{t\in{[0,T]}}\mathbb{E}X_{t}^{4}<\infty $$ for any $T>0$. Using a Chebyshev-type argument we have: \begin{align*} \mathbb{P}(\max\limits_{t\leq{s}\leq{t+\delta}}|c^{-1/2}X_{cs}-c^{-1/2}X_{ct}|\geq{\varepsilon})&=\mathbb{P}\left(\max\limits_{t\leq{s}\leq{t+\delta}}\Big|\int_{t}^{s}\sigma(X_{cr})d\widetilde{B}_{r}\Big|\geq{\varepsilon}\right) \\ &\leq{\frac{\mathbb{E}\Big(\max\limits_{t\leq{s}\leq{t+\delta}}\big|\int_{t}^{s}\sigma(X_{cr})d\widetilde{B}_{r}\big|^{4}\Big)}{\varepsilon^{4}}} \\ &\leq{C\hspace{2pt}\mathbb{E}\Big(\int_{t}^{t+\delta}\sigma(X_{cr})d\widetilde{B}_{r}\Big)^{4}\varepsilon^{-4}} \\ &\leq{C'\hspace{2pt}\mathbb{E}\Big(\int_{t}^{t+\delta}\sigma^{2}(X_{cr})dr\Big)^{2}\varepsilon^{-4}} \end{align*} where $C,C'>0$ are absolute constants. The inequality on the third line follows by Doob's inequality, using the fact that the process $\Big(\int_{t}^{s}\sigma(X_{cr})d\widetilde{B}_{r}\Big)_{s\geq{t}}$ is a martingale. The inequality on the fourth line follows by applying the Burkholder-Davis-Gundy inequality (https://almostsuremath.com/2010/04/06/the-burkholder-davis-gundy-inequality/).
Using the fact that $\sigma$ is Lipschitz, $$ \int_{t}^{t+\delta}\sigma^{2}(X_{cr})dr\leq{\int_{t}^{t+\delta}\big(2\sigma^{2}(0)+2K^{2}X_{cr}^{2}\big)dr}\leq{2\delta\sigma^{2}(0)+2\delta{K^{2}}\max_{t\leq{r}\leq{t+\delta}}X_{cr}^{2}} $$ and hence $$ \Big(\int_{t}^{t+\delta}\sigma^{2}(X_{cr})dr\Big)^{2}\leq{8\delta^{2}\sigma^{4}(0)+8\delta^{2}{K^{4}}\max_{t\leq{r}\leq{t+\delta}}X_{cr}^{4}} $$ Thus: \begin{align*} \mathbb{P}(\max\limits_{t\leq{s}\leq{t+\delta}}|c^{-1/2}X_{cs}-c^{-1/2}X_{ct}|\geq{\varepsilon})\leq{8C'\varepsilon^{-4}\delta^{2}\Big(\sigma^{4}(0)+K^{4}\mathbb{E}\max_{t\leq{r}\leq{t+\delta}}X_{cr}^{4}\Big)} \end{align*} Finally, since $(X_{t})_{t\geq{0}}$ is a martingale, by Doob's inequality, $$ \mathbb{E}\big(\max_{r\in{[t,t+\delta]}}X^{4}_{cr}\big)\leq{\mathbb{E}\big(\max_{r\in{[0,T]}}X^{4}_{r}\big)}\leq{C\hspace{2pt}\mathbb{E}X_{T}^{4}}\leq{C\cdot{C(T)}} $$ Thus, we see that taking $\delta<C''\varepsilon^{5}$ for a suitable constant $C''>0$ we have $$ \mathbb{P}(\max\limits_{t\leq{s}\leq{t+\delta}}|c^{-1/2}X_{cs}-c^{-1/2}X_{ct}|\geq{\varepsilon})\leq{\delta\cdot{\varepsilon}} $$ for any $t\in{[0,T]}$. This completes our proof.
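As a closing numerical illustration of the functional limit (a rough sketch under assumptions, not part of the proof): take the hypothetical Lipschitz coefficient $\sigma(x) = 2 + \sin x$. By the reflection principle, $\mathbb{E}\max_{t\in[0,1]}\sigma(0)B_t = \sigma(0)\sqrt{2/\pi}$, so an Euler–Maruyama estimate of $\mathbb{E}\max_{t\in[0,1]}Y^c_t$ for small $c$ should land nearby (the discrete-time maximum is biased slightly downward, and there is Monte Carlo error on top):

```python
import numpy as np

def max_Yc(c, n_paths=5000, n_steps=1000, seed=1):
    """Estimate E[max_{t<=1} Y^c_t], Y^c_t = c^{-1/2} X_{ct},
    via Euler-Maruyama for dX = sigma(X) dB with sigma(x) = 2 + sin(x)."""
    rng = np.random.default_rng(seed)
    dt = c / n_steps  # simulate X on [0, c]
    x = np.zeros(n_paths)
    running_max = np.zeros(n_paths)
    for _ in range(n_steps):
        x += (2.0 + np.sin(x)) * rng.normal(0.0, np.sqrt(dt), n_paths)
        np.maximum(running_max, x, out=running_max)  # track max of X pathwise
    return running_max.mean() / np.sqrt(c)           # rescale by c^{-1/2}

est = max_Yc(c=0.01)
limit = 2.0 * np.sqrt(2.0 / np.pi)  # sigma(0) * E[max_{[0,1]} B], reflection principle
print(f"E[max Y^c] = {est:.3f}, limit = {limit:.3f}")
```

Of course, convergence in law alone doesn't imply convergence of these expectations without uniform integrability, but for this bounded-moment setting the agreement is what one expects.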