I have been doing a bit of reading on random processes and probability theory recently for some research, and I have come across the claim in many places that Brownian motion cannot be used as an integrator in the Riemann-Stieltjes sense because its paths are of unbounded variation. I have been trying to find a rigorous proof of this, but I have been having a difficult time. My intuition is that since Brownian motion is a continuous random walk, it is theoretically possible for it to exceed whatever bound may be placed on it. Is this the right way of thinking about it? And can anyone produce a more rigorous proof? Thanks!
Brownian motion unbounded variation
brownian-motion, probability-theory
Related Solutions
I've taken a look at the book; this chapter is introductory and somewhat informal, so I imagine the authors are more precise in later chapters about what they mean by a white noise in space or time and by the S(P)DE in your question. Nevertheless, I have addressed aspects of your question below.
A discussion of Definitions 2, 3 and 5 is contained in an answer of mine to a similar question here. Everything in that answer is real-valued (which hopefully doesn't make too much of a difference) and indexed by a single real variable (or, more precisely, by a test function of a single real variable); this can make a significant difference depending on what you want to know.
Definition 2
The random distribution that acts on $\phi$ via $(W, \phi) = \int \phi(t) B_t dt$ is just the Brownian motion $B$ (i.e. we can identify the function $B$ with the distribution $W$).
Your definition of $W'$ is then how I define white noise (denoted $X$) in the answer linked to above: white noise $X$ is defined as the random distribution that acts on a test function $\phi$ by $(X, \phi) = -\int_0^\infty B(t) \phi'(t) dt$. In the parlance of the book you cite, this is a white noise in time (time is the only variable in that answer). However, you can generalize this definition to white noise in space and time (see the discussion of Definition 3 below).
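As a sanity check (my own sketch, not part of the original answer), one can verify numerically that $(X, \phi) = -\int_0^\infty B(t) \phi'(t) dt$ behaves like white noise tested against $\phi$: by stochastic integration by parts it equals $\int \phi\,dB$, so its variance should be $\|\phi\|_{L^2}^2$. The test function $\phi(t) = \sin(\pi t)$ and the grid sizes below are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical test function phi(t) = sin(pi t), supported on [0, 1]
# with phi(0) = phi(1) = 0, so the boundary terms vanish.
n = 400
t = np.linspace(0.0, 1.0, n + 1)
dt = 1.0 / n
phi = np.sin(np.pi * t)
dphi = np.pi * np.cos(np.pi * t)

# Monte Carlo over many Brownian paths: (X, phi) = -∫ B(t) phi'(t) dt.
trials = 10_000
dB = rng.normal(0.0, np.sqrt(dt), size=(trials, n))
B = np.concatenate([np.zeros((trials, 1)), np.cumsum(dB, axis=1)], axis=1)
X_phi = -np.sum(B * dphi, axis=1) * dt

# By stochastic integration by parts, (X, phi) = ∫ phi dB, so the
# sample variance should match ||phi||_{L^2}^2 = 1/2.
print("sample variance of (X, phi):", X_phi.var())
print("||phi||^2                  :", np.sum(phi**2) * dt)
```

The sample variance lands close to $1/2$, consistent with the Itô isometry for $\int \phi\,dB$.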
Definition 3
Here $W$ is your white noise (not $W'$ as in Definition 2).
To link this to Definition 2, set $d = 0$ (so there is no spatial component to the domain of $\phi$). With $X$ defined as above, $(X_\phi := (X, \phi) : \phi \in C^\infty([0, \infty)))$ is a centered Gaussian process with covariance $E(X_\phi X_\psi) = (\phi, \psi)_{L^2}$ (by the Itô isometry). The definition you have stated is a generalization to the case where the process is indexed by space and time (more precisely, by test functions of space and time).
Definition 5
Your definition of $W$ is the same (by stochastic integration by parts) as the definition of $X$ above. Thus, $W$ here is once again white noise ($W'$ is then the distributional derivative of white noise).
Definition 1
In this definition, while the realization of the process you get in this way depends on the choice of basis, its (probability) distribution is independent of basis. You can think of a white noise as any process with this distribution.
This definition must be understood in the sense of distributions (now referring to Schwartz distributions), as white noise is not defined pointwise (so $W_t$ is meaningless). A more precise definition is that $W$ acts on a test function $\phi$ by $W_\phi := (W, \phi) = \sum_{i=1}^\infty \xi_i (\phi, \phi_i)$. Now you can check that $W_\phi$ has mean $0$ and, using $E(\xi_i \xi_j) = \delta_{ij}$, that \begin{equation} E(W_\phi W_\psi) = E\sum_{i,j=1}^\infty \xi_i \xi_j (\phi, \phi_i) (\psi, \phi_j) = \sum_{i=1}^\infty (\phi, \phi_i) (\psi, \phi_i) = (\phi, \psi)_{L^2}. \end{equation} Thus, the only thing left to check to see that $W$ has the same distribution as the processes above is that it is Gaussian.
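The covariance identity $E(W_\phi W_\psi) = (\phi, \psi)_{L^2}$ can be illustrated numerically. Here is a finite-dimensional sketch of mine, replacing $L^2$ by $\mathbb R^d$ with the dot product and truncating the basis expansion to $d$ terms (the dimension and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

# Finite-dimensional stand-in for L^2: R^d with the dot product and an
# orthonormal basis e_1, ..., e_d (columns of a random orthogonal matrix).
d, trials = 16, 200_000
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))   # e_i = Q[:, i]

phi = rng.normal(size=d)                       # stand-ins for test functions
psi = rng.normal(size=d)

# W_phi = sum_i xi_i (phi, e_i) with iid standard normal xi_i.
xi = rng.normal(size=(trials, d))
W_phi = xi @ (Q.T @ phi)
W_psi = xi @ (Q.T @ psi)

print("E[W_phi W_psi] ~", np.mean(W_phi * W_psi))
print("(phi, psi)     =", phi @ psi)
```

The Monte Carlo average of $W_\phi W_\psi$ matches the inner product $(\phi, \psi)$, as the computation above predicts.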
Let $(\mathcal F_t)_{t\ge 0}$ be the natural filtration of a Brownian motion $(B_t)$. For a fixed $t>0$, the event $\bigcap_{n\in\Bbb N}\{B_{t+s}>B_t \text{ for some } s\in\Bbb Q\cap(0,1/n)\}$ is in $\mathcal F_{t+}$ but not in $\mathcal F_t$.
Best Answer
One way to rigorously formulate what you are talking about is as follows. As a process, the sample paths of the usual Brownian motion $(B_t)_{t\in\mathbb R_+}$ are not functions of bounded variation. In fact, almost surely, the sample paths have infinite variation on any interval of the form $[0,t]$ with $t\in\mathbb R_+$.
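This is easy to see in simulation (a sketch of my own, not part of the answer): sum $|B_{t_{i+1}} - B_{t_i}|$ over finer and finer partitions of $[0,1]$ and watch it grow without bound, while the quadratic variation $\sum (B_{t_{i+1}} - B_{t_i})^2$ stabilizes near $t = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# One Brownian path on [0, 1], sampled on a fine dyadic grid.
n = 2**18
dt = 1.0 / n
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])

# Refine the partition: the total variation grows without bound,
# while the quadratic variation stabilizes near t = 1.
for k in [2**8, 2**12, 2**16]:
    diffs = np.diff(B[::n // k])   # increments over a k-step partition
    tv = np.abs(diffs).sum()       # sum |B_{t_{i+1}} - B_{t_i}|
    qv = (diffs**2).sum()          # sum (B_{t_{i+1}} - B_{t_i})^2
    print(f"k = {k:6d}   total variation = {tv:8.1f}   quadratic variation = {qv:.3f}")
```

Each refinement roughly doubles the total variation (it scales like $\sqrt{k}$ for a $k$-step partition), while the quadratic variation stays pinned near $1$.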
This presents the following issue if we want to define what $\int_0^t X_s\,dB_s$ means, where $(X_t)_{t\in\mathbb R_+}$ is a suitable random process. In the Riemann-Stieltjes approach to integration, we can define integrals of the form $\int_0^t f(s)\,d\alpha(s)$ when $\alpha$ is, say, a function of bounded variation on $[0,t]$ and $f$ is a continuous function on $[0,t]$. When $\alpha$ does not have bounded variation, there are continuous functions $f$ that are not Riemann-Stieltjes integrable with respect to $\alpha$.
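A quick numerical illustration of why this matters for $\int_0^1 B_s\,dB_s$ (my own sketch, under the standard observation that the gap between the two sums is the quadratic variation): the left-endpoint and right-endpoint Riemann-Stieltjes sums differ by $\sum (\Delta B)^2$, which tends to $t$ rather than $0$, so the two sums converge to different limits and no single Riemann-Stieltjes integral exists:

```python
import numpy as np

rng = np.random.default_rng(1)

# One Brownian path on [0, 1].
n = 2**16
dt = 1.0 / n
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])
dB = np.diff(B)

# Riemann-Stieltjes-style sums for "∫_0^1 B_s dB_s" using different
# evaluation points in each subinterval:
left = np.sum(B[:-1] * dB)    # left endpoint (the Ito choice)
right = np.sum(B[1:] * dB)    # right endpoint

# right - left = sum (ΔB)^2, the quadratic variation, which converges
# to t = 1 instead of 0 as the mesh shrinks.
print("left sum :", left)
print("right sum:", right)
print("gap      :", right - left)
```

For a bounded-variation integrator the gap would vanish as the mesh shrinks; here it concentrates at $1$, which is exactly why stochastic integration must fix a convention (Itô uses the left endpoint).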
So we have to look for another way to interpret what $\int_0^tX_s\,dB_s$ should mean. The way it is treated in Le Gall's book Brownian Motion, Martingales, and Stochastic Calculus, for instance, is to build up the theory of martingales and stochastic integration far enough that expressions of the form $\int_0^t X_s\,dB_s$ can be defined as stochastic processes known as stochastic integrals, which are, very roughly speaking, martingale-like objects.
Le Gall's book is a great reference for this and other topics related to Brownian motion after one has a suitable background in probability theory.