Distribution and convergence in law of the discontinuous random functions

Tags: probability-distributions, probability-theory, random-variables, stochastic-analysis, stochastic-processes

In Chapter 1 of the book

Da Prato, Zabczyk – Stochastic equations in infinite dimensions, 1992,

we can find the following general definitions:

  • If $(\Omega,\mathcal{F})$ and $(E,\mathcal{S})$ are two measurable spaces, then a mapping $X$ from $\Omega$ into $E$ such that the set $\{ \omega \in \Omega : X(\omega) \in A \}=\{ X \in A \}$ belongs to $\mathcal{F}$ for arbitrary $A \in \mathcal{S}$ is called a random variable from $(\Omega,\mathcal{F})$ to $(E,\mathcal{S})$.

  • If $X$ is a random variable from $(\Omega,\mathcal{F})$ to $(E,\mathcal{S})$ and $P$ is a probability measure on $\Omega$, then by $\mathcal{L}(X)$ we denote the image of $P$ under the mapping $X$:

$\hspace{0.8cm} \mathcal{L}(X)(A)=P\{\omega \in \Omega: X(\omega) \in A \}, \quad \forall A \in \mathcal{S}.$

The measure $\mu=\mathcal{L}(X)$ is called the distribution or the law of $X$.
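To make the pushforward construction concrete, here is a minimal numerical sketch on a finite sample space (the names `Omega`, `P`, `X` and the specific values are our own illustrative choices, not from the book):

```python
from fractions import Fraction

# A finite sample space Omega with uniform probability measure P
# (hypothetical example).
Omega = ["w1", "w2", "w3", "w4"]
P = {w: Fraction(1, 4) for w in Omega}

# A random variable X : Omega -> E, here with E = {0, 1}.
X = {"w1": 0, "w2": 1, "w3": 1, "w4": 1}

def law(A):
    """L(X)(A) = P{omega in Omega : X(omega) in A},
    the pushforward of P under X."""
    return sum(P[w] for w in Omega if X[w] in A)

print(law({1}))     # mass that the law of X assigns to {1}
print(law({0, 1}))  # the whole of E, so total mass 1
```

The function `law` is exactly the measure $\mathcal{L}(X)$: it assigns to each event $A \subset E$ the $P$-probability of its preimage under $X$.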

Let's say that the set $E$ consists of functions given by:

$$U(x,t,\omega)=
\begin{cases}
u_1, & x \in A, \\[2ex]
u_2, & x \notin A,
\end{cases}
$$

where $u_1,u_2$ are constants that may depend on $\omega$, and the set $A$ depends on the variables $x$, $t$ and $\omega$. The simplest example I have in mind is $A=\{x: x<t\}$ for all $\omega$, with the same constants $u_1,u_2$ for every $\omega$. So I am interested in discontinuous, piecewise constant functions $U$.
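A sample path of such a $U$ is easy to sketch in code; the following is one illustrative choice (random $u_1,u_2$, deterministic $A=\{x:x<t\}$), assuming nothing beyond the description above:

```python
import random

def make_U(t, rng):
    """Sample one path of U(., t, omega) with A = {x : x < t} and random
    constants u1, u2 (an illustrative choice; in general u1, u2 and A
    may all depend on omega)."""
    u1 = rng.uniform(0.0, 1.0)  # value of U on A
    u2 = rng.uniform(2.0, 3.0)  # value of U off A
    return lambda x: u1 if x < t else u2

rng = random.Random(0)
U = make_U(t=0.5, rng=rng)
# U is piecewise constant with a single jump at x = t = 0.5.
print(U(0.0), U(1.0))
```

Each call to `make_U` corresponds to picking one $\omega$; the returned function is the path $x \mapsto U(x,t,\omega)$.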

I have two questions.

  1. Can we talk about the distribution of a random variable $U$ if $U$ is not continuous? If the answer is yes – and I am sure it is – what would be the law of the random variable $U$ given above?
  2. Can we talk about convergence in law of random variables $U_n$ to a random variable $U$, if the limit function is similar to the $U$ given above (i.e. discontinuous) and the $U_n$ are continuous or smooth random variables – even if they are not defined on the same spaces? By space I mean a space of Banach space-valued random variables such as $L^p(\Omega;BV(\mathbb{R}^d))$, $L^p(\Omega;L^{\infty}(\mathbb{R}^d))$ or $L^p(\Omega;C([0,T],L^{\infty}(\mathbb{R}^d)))$, not the probability space. I assume the answer is yes here as well, but I need a few more details.

I hope I am not asking obvious things. Thank you all in advance for the help.

Best Answer

Yes, you can talk about the distribution of a non-continuous random variable. I think you are mistaken when you say "In my case set $E$ is some set of piecewise constant functions." Your random variables take on real values, so your $E$ is $\mathbb{R}$ or some subset of it. In any case, the measure $\mu$ associated with your $U$ is given by $$\mu(S)= \begin{cases} 0 & u_1,u_2 \notin S \\ P(U=u_1) & u_1 \in S,\ u_2 \notin S \\ P(U=u_2) & u_1 \notin S,\ u_2 \in S \\ 1 & u_1,u_2 \in S. \end{cases}$$
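This two-point law can be checked numerically; a minimal sketch with hypothetical values for $u_1$, $u_2$ and $P(U=u_1)$:

```python
import random

u1, u2 = -1.0, 2.0  # the two constants, taken deterministic here
p1 = 0.3            # P(U = u1); a hypothetical value

def mu(S):
    """The law of a two-valued random variable;
    matches the case analysis above."""
    return p1 * (u1 in S) + (1 - p1) * (u2 in S)

# Sanity check against empirical frequencies from sampling U directly.
rng = random.Random(42)
samples = [u1 if rng.random() < p1 else u2 for _ in range(100_000)]
freq_u1 = sum(s == u1 for s in samples) / len(samples)
print(mu({u1}), freq_u1)  # the two numbers should be close
```

The four cases of the displayed formula correspond to the four ways the Boolean expressions `(u1 in S)` and `(u2 in S)` can evaluate.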

Yes, you can talk about convergence in law of random variables that are not defined on the same probability space, since convergence in law only involves the distributions $\mathcal{L}(U_n)$ and $\mathcal{L}(U)$. Moreover, the Kolmogorov extension theorem essentially guarantees that you can build one probability space on which all of your random variables live and such that their joint distributions are all consistent.
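A toy real-valued illustration of smooth laws converging to a discrete one (our own example, not from the answer): take $X_n = B + Z/n$ with $B$ Bernoulli$(1/2)$ and $Z$ standard normal, which converges in law to the two-point variable $U$ with $P(U=0)=P(U=1)=\tfrac12$:

```python
import math

def Phi(y):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

def cdf_Xn(x, n):
    """CDF of X_n = B + Z/n: a smooth mixture of two Gaussians."""
    return 0.5 * Phi(n * x) + 0.5 * Phi(n * (x - 1.0))

def cdf_U(x):
    """CDF of the two-point limit U."""
    return 0.0 if x < 0 else (0.5 if x < 1 else 1.0)

# Convergence in law means the CDFs converge at every continuity
# point of cdf_U, i.e. every x except the jump points 0 and 1.
for n in (1, 10, 100):
    print(n, abs(cdf_Xn(0.25, n) - cdf_U(0.25)))
```

Note that nothing requires the $X_n$ and $U$ to live on a common probability space: only the two CDFs enter the definition.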
