Let $X_{i}, Y_{i}: \Omega \to \mathbb{R} \cup \{+\infty, -\infty\}$, $i \in \mathbb{N}$, be two families of random variables which are all *independent* of each other, i.e. the whole family $\{X_i, Y_i : i \in \mathbb{N}\}$ is independent: the $X_i$ are independent of each other, the $Y_i$ are independent of each other, and for any $i, j$ the variables $X_i$ and $Y_j$ are independent as well.

Consider two new random variables, the infinite sums $S_X:=\sum_{i=1}^{\infty}X_i$ and $S_Y:=\sum_{i=1}^{\infty}Y_i$.

Question: How can we show that $S_X$ and $S_Y$ are independent as well, i.e. $P_{S_X, S_Y}= P_{S_X} \cdot P_{S_Y}$?

Firstly, the finite case: if random variables $X, Y, Z$ are independent, are $X+Y$ and $Z$ also independent?

An (almost-)idea: for the distribution of $X+Y$ we have the convolution $P_{X+Y} = P_X * P_Y$, and therefore (writing the densities with the same symbols)

$$ P_{X+Y}(a)= \int_{\mathbb{R}} P_X(x) \cdot P_Y(a-x) \, d\lambda(x). $$
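The convolution formula has a direct discrete analogue, which can serve as a quick sanity check; a minimal sketch (the two-dice example is my own illustration, not part of the question):

```python
import numpy as np

# Distributions of two independent fair dice X and Y on values 1..6.
p_x = np.full(6, 1 / 6)
p_y = np.full(6, 1 / 6)

# P_{X+Y} = P_X * P_Y: the discrete convolution gives the
# distribution of X + Y on the values 2..12.
p_sum = np.convolve(p_x, p_y)

# Sanity checks: it is again a probability distribution,
# and P(X + Y = 7) = 6/36, as expected for two dice.
assert np.isclose(p_sum.sum(), 1.0)
assert np.isclose(p_sum[5], 6 / 36)  # index 5 corresponds to the value 7
```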

How can we deal with $P_{X+Y, Z}$? Can the joint distribution somehow be pulled into the integral? If we can, then the independence of $X+Y$ and $Z$ is reduced to the independence of $X, Y, Z$ and we are done. But I do not see any reason why we could do something like this:

$$ P_{X+Y, Z}(a,b)= \int_{\mathbb{R}} P_{X,Z}(x, b) \cdot P_{Y,Z}(a-x, b) \, d\lambda(x) $$

If we can show that $X+Y$ and $Z$ are independent, we can proceed inductively and conclude that the finite sums $S^n_X:=\sum_{i=1}^{n}X_i$ and $S^m_Y:=\sum_{i=1}^{m}Y_i$ are independent. Can we then use a limit argument to pass to $S_X$ and $S_Y$?

In summary:

*Problem 1:* If the random variables $X, Y, Z$ are independent, why are $X+Y$ and $Z$ also independent?

*Problem 2:* If $S_X^n$ and $S_Y^m$ are independent for all $n, m$, why are $S_X$ and $S_Y$ also independent?

## Best Answer

Problem 1:

If $X, Y, Z$ are independent, then the vector $(X,Y)$ and $Z$ are independent, and hence $f(X, Y)$ is independent of $g(Z)$ for any Borel functions $f$ and $g$; this follows immediately from the definition. Now put $f(a,b) = a+b$ and $g(a)=a$.
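A quick Monte Carlo sanity check of this conclusion (my own illustration; the distributions of $X, Y, Z$ and the bounded test functions are chosen arbitrarily): independence of $X+Y$ and $Z$ implies $E[f(X+Y)\,g(Z)] = E[f(X+Y)]\,E[g(Z)]$ for bounded $f, g$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Independent X, Y, Z with arbitrary distributions.
x = rng.normal(size=n)
y = rng.exponential(size=n)
z = rng.uniform(size=n)

# Bounded test functions of X + Y and of Z.
f = np.sin(x + y)
g = np.cos(z)

# If X + Y and Z are independent, these agree up to Monte Carlo error.
lhs = np.mean(f * g)
rhs = np.mean(f) * np.mean(g)
assert abs(lhs - rhs) < 0.01
```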

Problem 2:

$S_X = \lim_{n \to \infty} S_X^n$, but which type of limit do you mean? Almost sure convergence? In any case, for sums of independent random variables convergence a.s. is equivalent to convergence in probability and to convergence in distribution, so I will assume you mean convergence a.s. Hence $(S_X, S_Y)= \lim_{n \to \infty} (S_X^n, S_Y^n)$ a.s., and thus the characteristic function of the vector $(S_X, S_Y)$ is the limit of the characteristic functions of $(S_X^n, S_Y^n)$: $$Ee^{i (uS_X + vS_Y)} = \lim_{n \to \infty}Ee^{i (uS_X^n + vS_Y^n)}.$$ By independence, $Ee^{i (uS_X^n + vS_Y^n)} = Ee^{i uS_X^n}\, Ee^{i vS_Y^n}$. Since $S_X = \lim_{n \to \infty} S_X^n$, we have $Ee^{i uS_X} = \lim_{n \to \infty}Ee^{i uS_X^n}$, and likewise for $S_Y$. So, $$Ee^{i (uS_X + vS_Y)} = \lim_{n \to \infty}Ee^{i (uS_X^n + vS_Y^n)} = \lim_{n \to \infty} Ee^{i uS_X^n}\, Ee^{i vS_Y^n} = $$ $$ = \lim_{n \to \infty} Ee^{i uS_X^n} \cdot \lim_{n \to \infty} Ee^{i vS_Y^n} = Ee^{i uS_X}\, Ee^{i vS_Y}.$$ Independence of $S_X$ and $S_Y$ follows from the equality $Ee^{i (uS_X + vS_Y)} = Ee^{i uS_X}\, Ee^{i vS_Y}$ for all $u, v$, since the joint characteristic function determines the joint distribution.
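The characteristic-function factorization can also be checked numerically on a finite truncation (my own illustration; the summands $X_i, Y_i \sim N(0, 4^{-i})$ and the points $u, v$ are arbitrary choices, scaled so the series converges):

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_terms = 100_000, 20

# Independent summands X_i, Y_i with sd 2^{-i}, so S_X^n, S_Y^n
# approximate the convergent series S_X, S_Y.
scale = 1.0 / 2 ** np.arange(1, n_terms + 1)
s_x = (rng.normal(size=(n_samples, n_terms)) * scale).sum(axis=1)
s_y = (rng.normal(size=(n_samples, n_terms)) * scale).sum(axis=1)

u, v = 0.7, -1.3
# Joint characteristic function vs product of the marginal ones:
joint = np.mean(np.exp(1j * (u * s_x + v * s_y)))
prod = np.mean(np.exp(1j * u * s_x)) * np.mean(np.exp(1j * v * s_y))
assert abs(joint - prod) < 0.01  # equal up to Monte Carlo error
```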