Does the composition of a random-variable-valued function with itself induce dependence?

independence, probability theory, random variables

Say I have a probability space $(\mathbb{R}, \Sigma, \mu)$ and a function $f$ of the form $f: \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}$ such that for any distinct $x_1,x_2 \in \mathbb{R}$ the functions $f(x_1, -)$ and $f(x_2, -)$ are independent random variables over $(\mathbb{R}, \Sigma, \mu)$.

Now let's define the random variables $G$ and $H$ as:
\begin{align}
G(y) &= f(f(x_1, y), y) \\
H(y) &= f(f(x_2, y), y)
\end{align}

Are $G$ and $H$ independent?

Best Answer

Not necessarily. Here is a counterexample:

Let $\mu((-\infty, y]) = \int_{-\infty}^y \frac{1}{\sqrt{2\pi}} e^{-\frac{t^2}{2}} dt$ for all $y \in \mathbb{R}$, i.e. let $\mu$ be the standard Gaussian distribution.

Define $f:\mathbb{R}^2\rightarrow\mathbb{R}$ by $$f(x,y)= \left\{ \begin{array}{ll} y &\mbox{ if $x=0$} \\ 0 & \mbox{ if $x\neq 0$} \end{array} \right.$$ For any distinct real numbers $x_1,x_2$, at least one of them is nonzero, so at least one of the random variables $f(x_1,-)$ and $f(x_2,-)$ is identically zero. Since a constant random variable is independent of every random variable, $f(x_1,-)$ and $f(x_2,-)$ are independent.
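For concreteness, here is a minimal Python sketch of this $f$; the sample points below are arbitrary illustrative values, not part of the proof:

```python
# A sketch of the counterexample f defined above.
def f(x, y):
    # f(x, y) = y when x == 0, and 0 otherwise
    return y if x == 0 else 0.0

# For x1 = 1, x2 = 2 (both nonzero), both f(x1, -) and f(x2, -)
# are the constant 0, and a constant is independent of everything.
for y in (-2.0, 0.0, 3.5):
    assert f(1.0, y) == 0.0
    assert f(2.0, y) == 0.0
```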

However, take $x_1=1$, $x_2=2$. Then for every outcome $y \in \mathbb{R}$ we have \begin{align} G(y)&=f(f(1,y),y) = f(0,y) = y\\ H(y) &=f(f(2,y),y) = f(0,y) = y \end{align} So $G$ and $H$ are the same (Gaussian) random variable, and a non-degenerate random variable is never independent of itself: here $P(G \le 0,\, H \le 0) = P(G \le 0) = \tfrac{1}{2}$, whereas independence would require $P(G \le 0)P(H \le 0) = \tfrac{1}{4}$. Hence $G$ and $H$ are not independent.
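A quick Monte Carlo sketch of this conclusion, assuming NumPy is available; the seed and sample size are arbitrary choices:

```python
import numpy as np

def f(x, y):
    # the counterexample f: f(x, y) = y if x == 0, else 0
    return y if x == 0 else 0.0

rng = np.random.default_rng(0)     # seed chosen arbitrarily
ys = rng.standard_normal(100_000)  # outcomes drawn from mu (standard Gaussian)

G = np.array([f(f(1.0, y), y) for y in ys])  # f(1, y) = 0, so G(y) = f(0, y) = y
H = np.array([f(f(2.0, y), y) for y in ys])  # f(2, y) = 0, so H(y) = f(0, y) = y

assert np.array_equal(G, H)        # G and H agree on every sampled outcome
print(np.corrcoef(G, H)[0, 1])     # ~1.0: perfectly correlated, hence dependent
```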
