On the notion of convergence in probability theory

probability, probability-theory, stochastic-processes

Any notion of convergence for random variables is defined using discrete sequences $X_n$. For example, $ \displaystyle \{X_n\}_{n\in\mathbb{N}}$ is said to converge in probability to $X$ if

\begin{equation}\tag{1}\lim_{n\to\infty}P\left[|X_n-X|>\epsilon\right]=0,\; \forall \epsilon>0
\end{equation}
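As a numerical illustration of $(1)$ (a sketch I am adding, not part of the original question), one can estimate the tail probability by Monte Carlo for the hypothetical sequence $X_n = X + Z/n$ with $Z \sim N(0,1)$, so that $|X_n - X| = |Z|/n$:

```python
import random

def prob_exceeds(n, eps, trials=100_000, seed=0):
    """Monte Carlo estimate of P[|X_n - X| > eps] where X_n = X + Z/n,
    Z ~ N(0, 1), so |X_n - X| = |Z|/n."""
    rng = random.Random(seed)
    count = sum(1 for _ in range(trials)
                if abs(rng.gauss(0.0, 1.0)) / n > eps)
    return count / trials

# The estimated probability shrinks toward 0 as n grows (here eps = 0.1),
# matching the definition of convergence in probability.
for n in (1, 10, 100):
    print(n, prob_exceeds(n, eps=0.1))
```

The function names and the choice of $X_n$ are mine, chosen only to make the limit in $(1)$ visible numerically.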

A stochastic process, on the other hand, is a family $\displaystyle \left\{X_t(\omega)\mid\omega\in\Omega,\ t\in\mathbb{R}^+\right\} $ of time-indexed random variables. We say $X_t$ is continuous in probability if
\begin{equation}\tag{2}
\lim_{\Delta t\to0}P\left[|X_{t+\Delta t}-X_t|>\epsilon\right]=0, \;\forall \epsilon >0
\end{equation}
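For a concrete instance of $(2)$ (an illustrative example I am assuming, not taken from the question): for standard Brownian motion the increment $X_{t+\Delta t}-X_t$ is $N(0,\Delta t)$, so the tail probability has the closed form $P[|X_{t+\Delta t}-X_t|>\epsilon] = \operatorname{erfc}\!\bigl(\epsilon/\sqrt{2\,\Delta t}\bigr)$ and visibly vanishes as $\Delta t \to 0$:

```python
import math

def tail_prob(dt, eps):
    """P[|N(0, dt)| > eps] = erfc(eps / sqrt(2 * dt)) for a Brownian increment."""
    return math.erfc(eps / math.sqrt(2.0 * dt))

# The probability decreases to 0 with dt (here eps = 0.5), so Brownian
# motion is continuous in probability at every t.
for dt in (1.0, 0.1, 0.01):
    print(dt, tail_prob(dt, 0.5))
```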

Is this last definition consistent with $(1)$? What plays the role of $X_n$?

What I'm trying to say is that there should be a difference between the convergence of a stochastic process and that of a random sequence, in the same way the convergence of a numerical sequence differs from the convergence of a sequence of functions. In function spaces there is pointwise convergence, uniform convergence, and so on…

But I have never read of stochastic processes converging in probability (or almost surely, or in the $r$-th mean) uniformly or pointwise. They converge with the same notion used for simple sequences of random variables.

To take another example, what does it mean for $X_t$ to be almost-surely continuous? Is it meant "pointwise", in the sense that for every fixed $\omega$ the sample path $x_\omega(t)$ is almost everywhere continuous (w.r.t. the Lebesgue measure), or "uniformly", in the sense that for each fixed point $t$, $\lim_{\Delta t\to 0}(x_\omega(t+\Delta t)-x_\omega(t))=0$ for almost all $\omega\in\Omega$?

Best Answer

You can rephrase $(2)$ in terms of $(1)$, if that is what you mean by being consistent. A stochastic process $(X_t : \Omega \to \mathbb R)_{t \in \mathbb R_+}$ is continuous in probability at $t$ if and only if the discrete sequence $(X_{t + \Delta t_n})_{n \in \mathbb N}$ converges in probability to $X_t$ for every sequence $(\Delta t_n)_{n \in \mathbb N}$ in $\mathbb R_+$ such that $\Delta t_n \to 0$.

Indeed, for each fixed $\epsilon > 0$, $p_\epsilon(\Delta t) = P[|X_{t + \Delta t} - X_t| > \epsilon]$ is just a real function $p_\epsilon : \mathbb R_+ \to \mathbb R_+$ of $\Delta t$. So the above is just a reformulation of the sequential criterion for limits.
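To make the sequential reformulation concrete (again using a Brownian increment $N(0,\Delta t)$ as an assumed example), fix $\epsilon$ and evaluate $p_\epsilon$ along a sequence $\Delta t_n = 1/n \to 0$; the result is an ordinary numerical sequence converging to $0$, which is exactly criterion $(1)$ applied to the discrete sequence $X_{t+\Delta t_n}$:

```python
import math

def p_eps(dt, eps):
    """p_eps(dt) = P[|X_{t+dt} - X_t| > eps] for a Brownian increment N(0, dt)."""
    return math.erfc(eps / math.sqrt(2.0 * dt))

# Evaluate p_eps along the sequence dt_n = 1/n (eps = 0.5): the values
# form a strictly decreasing numerical sequence tending to 0.
values = [p_eps(1.0 / n, 0.5) for n in (1, 4, 16, 64)]
print(values)
```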
