If $A,B$ are linear combinations based on common “underlying” random variables, can they still be independent?

independence, probability distributions, probability theory

Apologies if I am just having a mental block and missing something very obvious. Here is a conjecture that I think is obviously true, and yet I cannot prove it:

Let $X_1, X_2, \ldots, X_n, Y, Z$ be mutually independent, real-valued, non-constant random variables. (They need not be identically distributed.)

Let $A = \sum_{j=1}^n a_j X_j + Y$ and $B = \sum_{j=1}^n b_j X_j + Z$, where all the coefficients $a_j, b_j$ are non-zero.

**Prove or provide a counterexample:** For $X_j, Y, Z, A, B$ defined as above, $A$ and $B$ cannot be independent.

Further thoughts: if some coefficients are zero, the subset of the $X_j$'s that actually "affects" $A$ can be disjoint from the subset that actually "affects" $B$, and then $A, B$ can of course be independent. But my statement explicitly rules this out by requiring every coefficient to be non-zero.

Also, if instead of summation we allow general functions, e.g. $A' = f(\vec{X}) + Y$ and $B' = g(\vec{X}) + Z$, then even if each of $f, g$ must be affected by all components, we can still define them such that $f(\vec{X})$ and $g(\vec{X})$ are independent, and therefore $A', B'$ are independent. However, I am not allowing arbitrary functions, only summations (linear combinations). To be clear, the summation is over the reals.

(I would be curious to see a counterexample in a finite field, but that is not my main question; and even so, you cannot have the $+$ in $A$ be in a different field than the $+$ in $B$, so to speak.)

Best Answer

The joint distribution of two i.i.d. standard normal variables $X, Y$ is radially symmetric, so for any angle $\theta$, $$X \cos \theta + Y \sin \theta, \qquad - X \sin \theta + Y \cos \theta$$ are again independent standard normal variables: the pair is obtained from $(X, Y)$ by applying a rotation matrix. It should be clear how to construct a counterexample to your conjecture from here.
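For concreteness, here is one way to turn the rotation observation into an explicit counterexample (this uses $\theta = \pi/4$, up to scaling and sign). Take $n = 2$, let $X_1, X_2, Y, Z$ be i.i.d. standard normal, and set
$$A = X_1 + X_2 + Y, \qquad B = X_1 - X_2 + Z.$$
All coefficients are non-zero, the vector $(A, B)$ is jointly Gaussian, and
$$\operatorname{Cov}(A, B) = \operatorname{Var}(X_1) - \operatorname{Var}(X_2) = 0,$$
so $A$ and $B$ are independent.

A quick numerical sanity check of this construction (a minimal simulation sketch in Python/NumPy; the sample size and the printed diagnostics are illustrative choices, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Mutually independent standard normal inputs.
X1, X2, Y, Z = rng.standard_normal((4, N))

# The counterexample: X_1 and X_2 are shared, all coefficients non-zero.
A = X1 + X2 + Y
B = X1 - X2 + Z

# Empirical correlation should be close to 0.
print("corr(A, B) =", np.corrcoef(A, B)[0, 1])

# Crude check of independence on one event:
# P(A > 0, B > 0) should be close to P(A > 0) * P(B > 0) = 1/4.
print("P(A>0, B>0) =", np.mean((A > 0) & (B > 0)))
```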
