If $A = a_1 \oplus a_2 \oplus \cdots \oplus a_m$, for $m \leq n$, where $a_i$ are row vectors of dimension $n_i$ such that $\sum_{i=1}^m n_i = n$ and $\oplus$ denotes the direct sum, then the random vector $Y$ has independent coordinates.
This is not hard to see: $Y_1$ is measurable with respect to $\sigma(X_1, \ldots, X_{n_1})$, $Y_2$ is measurable with respect to $\sigma(X_{n_1+1}, \ldots, X_{n_1+n_2})$, and so on, and these $\sigma$-algebras are independent because the $X_i$ are independent (essentially by definition).
Obviously, this result still holds if we consider column permutations of the matrix $A$ described above. Indeed, as we will see below, when each $X_i$ is non-normal (with distributions that may depend on $i$), this is essentially the only form $A$ can take for the coordinates of $Y$ to be independent.
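For concreteness, here is a minimal simulation sketch of the direct-sum case (the matrix, the exponential distribution, and the sample size are illustrative choices, not from the original): with $n = 3$, $m = 2$, $a_1 = (1, 2)$ and $a_2 = (3)$, the coordinate $Y_1$ uses only $(X_1, X_2)$ while $Y_2$ uses only $X_3$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A is the direct sum of the row vectors a_1 = (1, 2) and a_2 = (3,):
# Y_1 depends only on (X_1, X_2), and Y_2 only on X_3.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# X has independent, non-normal (exponential) coordinates.
X = rng.exponential(size=(100_000, 3))
Y = X @ A.T

# Independence is stronger than zero correlation, but here Y_1 and Y_2
# are genuinely independent by the sigma-algebra argument above, since
# they are built from disjoint sets of the X_i.
print(np.corrcoef(Y[:, 0], Y[:, 1])[0, 1])  # ~ 0
```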
In the normal case, if $A A^T = D$ for some diagonal matrix $D$, then the coordinates of $Y$ are independent; this is easily checked with the moment-generating function.
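To spell out the moment-generating-function check, assume for simplicity that $X \sim N(0, I_n)$ (for general normal $X_i$ the same computation goes through with $A A^T$ replaced by $A \Sigma A^T$). Then $$M_Y(t) = \mathbb E\, e^{t^T Y} = \mathbb E\, e^{(A^T t)^T X} = e^{\frac12 t^T A A^T t} = e^{\frac12 t^T D t} = \prod_{i=1}^m e^{\frac12 D_{ii} t_i^2},$$ and an MGF that factors into a product of one-dimensional MGFs means the coordinates of $Y$ are independent (here $Y_i \sim N(0, D_{ii})$).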
Suppose $X_1$ and $X_2$ are iid with finite variance. If $X_1 + X_2$ is independent of $X_1 - X_2$, then $X_1$ and $X_2$ are normally distributed. This result is known as Bernstein's theorem and can be generalized (see below); a proof can be found in Feller.
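Conversely, for a non-normal distribution the sum and the difference are uncorrelated but not independent. A quick simulation sketch (the uniform distribution and the squared-value statistic are illustrative choices, not from the original):

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, size=500_000)
x2 = rng.uniform(-1, 1, size=500_000)
s, d = x1 + x2, x1 - x2

# Uncorrelated: Cov(s, d) = Var(x1) - Var(x2) = 0 for iid x1, x2.
print(np.corrcoef(s, d)[0, 1])        # ~ 0

# But not independent: |s| near 2 forces |d| to be near 0, and the
# dependence shows up in the second moments.
print(np.corrcoef(s**2, d**2)[0, 1])  # ~ -0.43, clearly nonzero
```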
If $A$ cannot be written (up to a permutation of columns) as a direct sum of row vectors as above, then one can always cook up a distribution for $X$ such that $Y$ does not have independent coordinates. Indeed, we have
Theorem (Lukacs and King, 1954): Let $X_1, X_2, \cdots, X_n$ be $n$ independently (but not necessarily identically) distributed random variables with variances $\sigma_i^2$, and assume that the $n$th moment of each $X_i$ ($i = 1, 2, \cdots, n$) exists. The necessary and sufficient conditions for the two linear forms $Y_1 = \sum^n_{i=1} a_i X_i$ and $Y_2 = \sum^n_{i=1} b_i X_i$ to be statistically independent are
- Each random variable which has a nonzero coefficient in both forms is normally distributed, and
- $\sum^n_{i=1} a_i b_i \sigma^2_i = 0$.
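The first condition is the one that bites in the non-normal case. Here is a small simulation sketch (the exponential distribution and the coefficient vectors $a = (1, 1)$, $b = (1, -1)$ are illustrative choices, not from the theorem):

```python
import numpy as np

rng = np.random.default_rng(2)

# iid exponential(1) variables: sigma_i^2 = 1, so the coefficient
# vectors a = (1, 1) and b = (1, -1) satisfy the orthogonality
# condition sum_i a_i b_i sigma_i^2 = 1 - 1 = 0.
X = rng.exponential(size=(500_000, 2))
Y1 = X[:, 0] + X[:, 1]
Y2 = X[:, 0] - X[:, 1]

# Orthogonality alone gives zero correlation...
print(np.corrcoef(Y1, Y2)[0, 1])        # ~ 0

# ...but the first condition fails (X_1 is non-normal with a nonzero
# coefficient in both forms), and indeed the forms are dependent.
print(np.corrcoef(Y1**2, Y2**2)[0, 1])  # clearly nonzero
```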
Now suppose $X$ takes the two values $x_1, x_2$ and $Y$ takes the two values $y_1, y_2$, with $\mathsf{Cov}(X,Y)=0$; we claim $X$ and $Y$ are independent. First let us prove it for $X'=(X-x_2)/(x_1-x_2)$ and $Y'=(Y-y_2)/(y_1-y_2)$ (assuming $x_1 \neq x_2$ and $y_1 \neq y_2$; otherwise independence is trivial), which take the values $1$ and $0$ and still have zero covariance.
Then: $$0=\mathsf{Cov}(X',Y')=\mathbb EX'Y'-\mathbb EX'\mathbb EY'=P(X'=1\wedge Y'=1)-P(X'=1)P(Y'=1)$$
This shows that the events $\{X'=1\}$ and $\{Y'=1\}$ are independent, and since both random variables take at most two distinct values a.s., this implies that $X'$ and $Y'$ are independent. For the last step, note that $P(A\cap B)=P(A)P(B)$ implies $$P(A^{\complement}\cap B)=P(B)-P(A\cap B)=P(B)-P(A)P(B)=(1-P(A))P(B)=P(A^{\complement})P(B),$$ so independence of the events extends to their complements.
In general, if $X'$ and $Y'$ are independent, then so are $aX'+b$ and $cY'+d$. From this we conclude that $X=(x_1-x_2)X'+x_2$ and $Y=(y_1-y_2)Y'+y_2$ are also independent.
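As a quick numerical sanity check of the argument (a sketch; the joint law below is made up for illustration), one can verify that matching the single cell $P(X'=1, Y'=1)$ to the product of the marginals forces the whole joint law to factorize:

```python
import itertools
import numpy as np

# Joint law p[i, j] = P(X' = i, Y' = j) with zero covariance:
# P(X'=1) = 0.5, P(Y'=1) = 0.4, and P(1, 1) = 0.2 = 0.5 * 0.4.
p = np.array([[0.3, 0.2],    # P(0, 0), P(0, 1)
              [0.3, 0.2]])   # P(1, 0), P(1, 1)

px = p.sum(axis=1)           # marginal law of X'
py = p.sum(axis=0)           # marginal law of Y'
print(p[1, 1] - px[1] * py[1])   # Cov(X', Y') = 0

# As shown above, independence of the single pair of events
# {X'=1}, {Y'=1} propagates to every cell of the joint law.
for i, j in itertools.product([0, 1], repeat=2):
    assert np.isclose(p[i, j], px[i] * py[j])
print("joint law factorizes: X' and Y' are independent")
```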
Yes, $f_1(X_1)$ and $f_2(X_2)$ are independent.
If you are studying a rigorous probability course with sigma-algebras, then you may prove it by noticing that the sigma-algebra generated by $f_{1}(X_{1})$ is contained in the sigma-algebra generated by $X_{1}$, where $f_{1}$ is a Borel-measurable function (and likewise for $f_{2}(X_{2})$ and $X_{2}$), so the independence of $\sigma(X_{1})$ and $\sigma(X_{2})$ passes to these sub-sigma-algebras.
If you are studying an introductory course, then just remark that this theorem is consistent with our intuition: if $X_{1}$ contains no information about $X_{2}$, then $f_{1}(X_{1})$ contains no information about $f_{2}(X_{2})$.
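For a concrete discrete sanity check (a sketch; the distributions and the functions $f_1$, $f_2$ below are made-up illustrations), one can compute the exact joint law of $(f_1(X_1), f_2(X_2))$ and verify that it factorizes:

```python
import numpy as np

# X1 and X2 independent, each supported on {0, 1, 2}.
px1 = np.array([0.2, 0.5, 0.3])
px2 = np.array([0.1, 0.6, 0.3])

f1 = lambda x: x % 2     # not injective: sigma(f1(X1)) is strictly
f2 = lambda x: x ** 2    # smaller than sigma(X1)

# Exact joint law of (f1(X1), f2(X2)) under the product measure.
joint, pu, pv = {}, {}, {}
for i, pi in enumerate(px1):
    for j, pj in enumerate(px2):
        key = (f1(i), f2(j))
        joint[key] = joint.get(key, 0.0) + pi * pj
for (u, v), q in joint.items():
    pu[u] = pu.get(u, 0.0) + q
    pv[v] = pv.get(v, 0.0) + q

# The joint law factorizes exactly: f1(X1) and f2(X2) are independent.
assert all(np.isclose(q, pu[u] * pv[v]) for (u, v), q in joint.items())
print("f1(X1) and f2(X2) are independent")
```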