If two random variables $X_1$ and $X_2$ are dependent, must $X_1^2$ and $X_2^2$ be dependent?

independence, probability, probability theory

If two random variables $X_1$ and $X_2$ are dependent, must $X_1^2$ and $X_2^2$ be dependent?

I believe this statement to be false. $X_1$ and $X_2$ being dependent means that the sigma-algebras they generate, $\sigma(X_1)$ and $\sigma(X_2)$, are not independent. But since $\sigma(X_1^2)\subset \sigma(X_1)$ and $\sigma(X_2^2)\subset \sigma(X_2)$, passing to these smaller sigma-algebras could potentially yield independent ones.

The counter example I came up with is

let:

$X_1\sim \text{Unif}(0,1)$ and
$$
X_2|X_1 =
\begin{cases}
1 & X_1\in[0,\frac{1}{2})\\
-1 & X_1\in[\frac{1}{2},1]\\
\end{cases}$$

Note these two random variables are highly dependent, but after squaring, $X_1^2$ has density $\frac{1}{2\sqrt{x}}$ on $(0,1)$ and $X_2^2=1$ is constant, so the two squared random variables are independent. Is this counterexample sound?
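A quick way to sanity-check the claimed dependence and the constancy of $X_2^2$ is a small simulation (a Python sketch, not part of the original post):

```python
import random

random.seed(0)
n = 100_000
x1 = [random.random() for _ in range(n)]            # X_1 ~ Unif(0, 1)
x2 = [1 if u < 0.5 else -1 for u in x1]             # X_2 determined by X_1

# X_2^2 is the constant 1, hence independent of everything, including X_1^2.
assert all(v * v == 1 for v in x2)

# X_1 and X_2 themselves are strongly dependent: the conditional mean of X_1
# shifts sharply with the value of X_2 (about 0.25 vs about 0.75).
mean_given_pos = sum(u for u, v in zip(x1, x2) if v == 1) / x2.count(1)
mean_given_neg = sum(u for u, v in zip(x1, x2) if v == -1) / x2.count(-1)
print(mean_given_pos, mean_given_neg)
```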

Best Answer

Your counter-example works, though since your $X_2^2$ is constant it is not very revealing: a constant is independent of everything.

Another might be to have $A$ and $B$ independently standard normal (mean $0$, variance $1$) and

$X_1=A$ while $X_2=\text{sign}(A)\, |B|$.

Then $X_1$ and $X_2$ are positively correlated standard normal random variables, while $X_1^2=A^2$ and $X_2^2=B^2$ are independent chi-squared random variables.
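This example can also be checked numerically (a Python sketch, not from the original answer): since $X_1X_2=|A|\,|B|$, the correlation of $X_1$ and $X_2$ should be $E|A|\,E|B|=2/\pi\approx 0.64$, while the squares should be uncorrelated.

```python
import math
import random

random.seed(0)
n = 200_000
a = [random.gauss(0.0, 1.0) for _ in range(n)]
b = [random.gauss(0.0, 1.0) for _ in range(n)]
x1 = a
# copysign(v, u) gives |v| with the sign of u, i.e. X_2 = sign(A) * |B|.
x2 = [math.copysign(v, u) for u, v in zip(a, b)]

def corr(xs, ys):
    """Sample Pearson correlation coefficient."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / m)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / m)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (m * sx * sy)

print(corr(x1, x2))                                   # close to 2/pi
print(corr([u * u for u in x1], [v * v for v in x2])) # close to 0
```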
