Independence of random variable to sigma-algebra generated by previous ones

Tags: conditional-expectation, martingales, measure-theory, probability

Let $(X_n)$ be a sequence of iid random variables such that $\mathbb E(X_n)=0$ and $\mathbb E(X_n^2)<\infty$ for all $n$.

In a martingale context, we write $S_n:=\sum_{i=1}^n X_i$ for the partial sums and set $F_n = \sigma(S_1,\dots,S_n)$. We are interested in computing $\mathbb E(X_{n+1}^2 \mid F_n)$ and $\mathbb E(X_{n+1} \mid F_n)$.

It is clear that we need to show that $X_{n+1}$ is independent of $F_n$. However, I struggle to show this rigorously; my attempt gets very ugly.

So I start by taking a set in $\sigma(X_{n+1})$, say $X_{n+1}^{-1}(A)$ with $A\in \mathcal B(\Bbb R)$, and another set in $\sigma(S_1,\dots,S_n)$, for which I don't know an explicit description. Is it something like $$S:=X_1^{-1}(B_1) \cup (X_1+X_2)^{-1}(B_2) \cup \dots \cup (X_1+\dots+X_n)^{-1}(B_n) \quad ?$$
I then argue that if $B_2=(-\infty,t)$, then $\{\omega : X_1(\omega) + X_2(\omega) < t\} = \bigcup_{r\in\Bbb Q} \big[X_1^{-1}((-\infty, r)) \cap X_2^{-1}((-\infty, t-r))\big]$, so that $S$ can be expressed as a union of preimages of Borel sets under $X_1,\dots,X_n$, and we obtain something of the form $$\Bbb P\Big(X_{n+1}^{-1}(A) \cap \bigcup_{i=1}^n \bigcup_{j\in J} X_i^{-1} (C_j)\Big).$$

However, I still cannot use the independence of $X_{n+1}$ from the other $X_i$ to conclude, unless I assume that $\bigcup_{i=1}^n \bigcup_{j\in J} X_i^{-1} (C_j)$ is a disjoint union.

Am I on the right track, and can this proof be finished this way, or is there another approach?

Best Answer

It is clear that if $\mathcal F$ is a $\sigma$-algebra for which $X_1,\dots,X_n$ are measurable, then $S_1,\dots,S_n$ are also $\mathcal F$-measurable. Therefore $$\sigma(S_1,\dots,S_n)\subseteq \sigma(X_1,\dots,X_n).$$
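In fact (though only the inclusion is needed here), the two $\sigma$-algebras coincide, since each increment can be recovered from consecutive partial sums. A short display, for completeness:

```latex
S_k=\sum_{i=1}^k X_i
  \;\Longrightarrow\; \sigma(S_1,\dots,S_n)\subseteq\sigma(X_1,\dots,X_n),
\qquad
X_k=S_k-S_{k-1}\ (S_0:=0)
  \;\Longrightarrow\; \sigma(X_1,\dots,X_n)\subseteq\sigma(S_1,\dots,S_n).
```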

Now, since $X_{n+1}$ is independent of $\sigma(X_1,\dots,X_n)$, it must also be true that $X_{n+1}$ is independent of $\sigma(S_1,\dots,S_n)$.
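Consequently $\mathbb E(X_{n+1} \mid F_n)=\mathbb E(X_{n+1})=0$ and $\mathbb E(X_{n+1}^2 \mid F_n)=\mathbb E(X_{n+1}^2)$. As a quick numerical sanity check (not a proof), one can simulate iid increments, here assumed standard normal, and verify that $X_{n+1}$ and $X_{n+1}^2$ are empirically uncorrelated with a function of the partial sums such as $S_n$:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 100_000, 3  # N simulated paths, n steps before X_{n+1}

# iid increments with E(X)=0, E(X^2) < infinity (standard normals here)
X = rng.standard_normal((N, n + 1))
S_n = X[:, :n].sum(axis=1)  # S_n = X_1 + ... + X_n
X_next = X[:, n]            # X_{n+1}

# If X_{n+1} is independent of F_n = sigma(S_1,...,S_n), then X_{n+1}
# and X_{n+1}^2 should be uncorrelated with S_n (up to sampling noise).
corr1 = np.corrcoef(S_n, X_next)[0, 1]
corr2 = np.corrcoef(S_n, X_next**2)[0, 1]
print(abs(corr1) < 0.05, abs(corr2) < 0.05)  # both should be True
```

Of course uncorrelatedness is weaker than independence; the simulation only illustrates the conclusion, while the measurability argument above proves it.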
