[Math] Translation invariance of Brownian motion

Tags: brownian-motion, probability

Beginner here. I'm working through Durrett's textbook and am just getting into the section on Brownian motion. He gives a two-line proof of a simple fact, but I'm a little stuck on the details. Let $B_t$ be a one-dimensional Brownian motion. The simple fact is:

$\{B_t - B_0,\ t \geq 0\}$ is independent of $B_0$ and has the same
distribution as Brownian motion with $B_0 = 0$.
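
If I'm reading this correctly, in terms of finite-dimensional distributions the claim is: for any $0 \leq t_1 < \ldots < t_n$, the vector $(B_{t_1}-B_0, \ldots, B_{t_n}-B_0)$ is independent of $B_0$ and

$$(B_{t_1}-B_0, \ldots, B_{t_n}-B_0) \overset{d}{=} (W_{t_1}, \ldots, W_{t_n}),$$

where $W$ is a Brownian motion with $W_0 = 0$; i.e. the vector is centered Gaussian with $\operatorname{Cov}(B_{t_i}-B_0, B_{t_j}-B_0) = t_i \wedge t_j$.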

Durrett's proof: Let $\mathcal{A}_1 = \sigma(B_0)$ and let $\mathcal{A}_2$ be the collection of events of the form
$$\{B(t_1)-B(t_0) \in A_1, \ldots, B(t_n)-B(t_{n-1}) \in A_n\}, \qquad 0 \leq t_0 < t_1 < \ldots < t_n.$$
The $\mathcal{A}_i$ are $\pi$-systems that are independent, so the desired result follows from the $\pi$-$\lambda$ theorem. $\blacksquare$
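
(For reference, I take the consequence of the $\pi$-$\lambda$ theorem being used here to be the standard fact that

$$\mathcal{A}_1,\ \mathcal{A}_2 \text{ independent } \pi\text{-systems} \ \Longrightarrow\ \sigma(\mathcal{A}_1) \text{ and } \sigma(\mathcal{A}_2) \text{ are independent}.$$

That step I'm fine with.)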

So I'm assuming that this hinges on $\sigma(\mathcal{A}_2) = \sigma(B_t-B_0;\ t \geq 0)$. But what's the easiest way to see this? I know it has to do with finite-dimensional distributions, but I'm getting hung up somewhere.

Best Answer

Lemma: Let $X_0,\ldots,X_n$ be arbitrary random variables. Then $$\sigma(X_0,X_1-X_0,\ldots,X_n-X_{n-1}) = \sigma(X_0,X_1,\ldots,X_n).$$

Proof: The random variable $X_j-X_{j-1}$ is clearly measurable with respect to $\sigma(X_0,\ldots,X_n)$ (as the difference of two measurable random variables) for all $j$. Consequently, $\sigma(X_0,X_1-X_0,\ldots,X_n-X_{n-1}) \subseteq \sigma(X_0,X_1,\ldots,X_n)$. On the other hand, we can write $$X_i = \sum_{j=0}^{i-1} (X_{j+1}-X_j)+X_0;$$ this shows that $X_i$ is $\sigma(X_0,X_1-X_0,\ldots,X_n-X_{n-1})$-measurable for every $i$. This proves "$\supseteq$". $\blacksquare$
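
For instance, with $n = 2$: the differences $X_1-X_0$ and $X_2-X_1$ are $\sigma(X_0,X_1,X_2)$-measurable, and conversely

$$X_1 = (X_1-X_0)+X_0, \qquad X_2 = (X_2-X_1)+(X_1-X_0)+X_0$$

are $\sigma(X_0,\,X_1-X_0,\,X_2-X_1)$-measurable; hence $\sigma(X_0,\,X_1-X_0,\,X_2-X_1) = \sigma(X_0,X_1,X_2)$.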


Applying this with $X_0 = B(t_0)-B_0,\ldots,X_n = B(t_n)-B_0$ (so that $X_0$ and the successive differences $X_j-X_{j-1} = B(t_j)-B(t_{j-1})$ are exactly the increments appearing in an event of $\mathcal{A}_2$ over the times $0, t_0, \ldots, t_n$) gives

$$\sigma(\mathcal{A}_2) = \sigma\left(\bigcup \sigma(B_{t_0}-B_0,\ldots,B_{t_n}-B_0)\right),$$

where the union is taken over all $0 \leq t_0 < \ldots < t_n$ and $n \in \mathbb{N}$. Since this union is a $\cap$-stable generator of $\sigma(B_t-B_0;\ t \geq 0)$, the claim follows.
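
To spell out the last step: every single random variable $B_t-B_0$ is measurable with respect to one of the $\sigma$-algebras in the union (take a tuple of times containing $t$), so the union generates $\sigma(B_t-B_0;\ t \geq 0)$. It is $\cap$-stable because any two finite time sets are contained in their (finite, ordered) union, so

$$\sigma(B_{s_0}-B_0,\ldots,B_{s_m}-B_0)\ \cup\ \sigma(B_{t_0}-B_0,\ldots,B_{t_n}-B_0)\ \subseteq\ \sigma\big(B_u-B_0;\ u \in \{s_0,\ldots,s_m\} \cup \{t_0,\ldots,t_n\}\big);$$

that is, the family of finite-dimensional $\sigma$-algebras is directed, and the union of a directed family of $\sigma$-algebras is closed under finite intersections.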
