The statement of this question has witnessed so many patches and modifications that I find it difficult to pin down the exact hypotheses. Hence I will formulate a question which might or might not be the one that interests you, and I will answer it. The idea is that if this is not the question you wanted to ask, at least you will be able to set things straight starting from this answer.
Here we go. You are given two independent sequences $(X_n)$ and $(Z_n)$, which you further assume to be independent of each other. You consider some sigma-algebras $F_n$ such that, for every $n$, $(X_{n+1},Z_{n+1})$ is independent of $F_n$, and, for some given (sufficiently integrable) functions $f_n$, you define the random variables $Y_n=f_n(X_n,Z_n)-\mathrm E(f_n(X_n,Z_n)\mid X_n)$.
Note that $F_n$ might be the sigma-algebra generated by the random variables $X_k$ and $Z_k$ for every $k\leqslant n$, or the sigma-algebra generated by the random variables $X_k$ for every $k\leqslant n$, or any other sigma-algebra which fits the bill that $(X_{n+1},Z_{n+1})$ is independent of $F_n$.
The question is whether $\mathrm E(Y_{n+1}\mid F_n)=0$, almost surely.
The answer is yes and the proof is direct. Since $Y_{n+1}$ is measurable with respect to $\sigma(X_{n+1},Z_{n+1})$, which is independent of $F_n$, one knows that $\mathrm E(Y_{n+1}\mid F_n)=\mathrm E(Y_{n+1})$, almost surely. By the tower property, the expectations of the random variables $f_{n+1}(X_{n+1},Z_{n+1})$ and $\mathrm E(f_{n+1}(X_{n+1},Z_{n+1})\mid X_{n+1})$ are both $\mathrm E(f_{n+1}(X_{n+1},Z_{n+1}))$, hence $\mathrm E(Y_{n+1})=0$ and $\mathrm E(Y_{n+1}\mid F_n)=0$, almost surely.
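As a quick sanity check (not part of the original question), one can simulate a hypothetical choice of $f$ and verify that the sample mean of $Y$ is near zero. With $X,Z$ independent standard normal and $f(x,z)=xz+z^2$, one has $\mathrm E(f(X,Z)\mid X)=X\,\mathrm E(Z)+\mathrm E(Z^2)=1$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10**6
X = rng.standard_normal(N)
Z = rng.standard_normal(N)

# Hypothetical choice f(x, z) = x*z + z**2, so that
# E(f(X, Z) | X) = X*E(Z) + E(Z**2) = 1, and Y = X*Z + Z**2 - 1.
Y = X * Z + Z**2 - 1

print(abs(Y.mean()))  # close to 0, consistent with E(Y) = 0
```

The standard deviation of the sample mean here is about $\sqrt{3}/1000$, so the printed value should be well below $0.01$.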
Now, it is up to you: shoot...
Edit: All right, so you shot (see the comments)... As a result, the setting is still not entirely clear but seems to be closer to the following than to what I wrote above.
Consider independent random variables $X_0$ and $Z_n$ for $n\geqslant0$. Assume that $(Z_n)_{n\geqslant0}$ is i.i.d. For every $n\geqslant0$, define the random variables
$$
X_{n+1}=h(X_n,Z_n),\quad
U_n=f(X_n,Z_n),\quad
Y_n=U_n-\mathrm E(U_n\mid X_n).
$$
Let us fix $n\geqslant0$, let $G_n$ denote any sigma-algebra such that $(X_n,Z_n)$ is $G_n$-measurable and $Z_{n+1}$ is independent of $G_n$, and let us prove that $\mathrm E(Y_{n+1}\mid G_n)=0$, almost surely.
Consider the function $g$ defined on the common state space of the random variables $X_n$ by
$g(x)=\mathrm E(f(x,Z_1))$.
Then $X_{n+1}=h(X_n,Z_n)$ is $G_n$-measurable, $Z_{n+1}$ is independent of $G_n$ and $U_{n+1}=f(X_{n+1},Z_{n+1})$, hence $\mathrm E(U_{n+1}\mid G_n)=g(X_{n+1})$. Now, $\mathrm E(U_{n+1}\mid G_n)$ is $\sigma(X_{n+1})$-measurable, hence $\mathrm E(U_{n+1}\mid X_{n+1})=\mathrm E(\mathrm E(U_{n+1}\mid G_n)\mid X_{n+1})=g(X_{n+1})$ and $\mathrm E(Y_{n+1}\mid G_n)=0$.
This proves that $(M_n)_{n\geqslant0}$ is an $(F_n)_{n\geqslant0}$ martingale, where $M_0=0$ and $M_n=Y_1+\cdots+Y_n$ for every $n\geqslant1$, provided that $(F_n)_{n\geqslant0}$ is a filtration (hence $F_n\subseteq F_{n+1}$ for every $n\geqslant0$) such that each $F_n$ satisfies the conditions we put on $G_n$ above. This means that for every $n\geqslant0$, $F_n$ is independent of $\sigma(Z_{n+1})$ and $H_n\subseteq F_n$, where
$$
H_n=\sigma(X_0)\vee\sigma(Z_k;0\leqslant k\leqslant n).
$$
An example of such a filtration is $F_n=H_n$ for every $n\geqslant0$, but each $F_n$ may contain some extraneous information.
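This martingale property can also be checked by simulation. Below is a minimal sketch with hypothetical choices $h(x,z)=x/2+z$ and $f(x,z)=(x+z)^2$: when the $Z_n$ are standard normal, $g(x)=\mathrm E((x+Z)^2)=x^2+1$, and the empirical mean of $M_n$ over many independent paths stays near $0$, consistent with $\mathrm E(M_n)=0$:

```python
import numpy as np

rng = np.random.default_rng(1)
paths, steps = 200_000, 10

def h(x, z):  # hypothetical transition map: X_{n+1} = h(X_n, Z_n)
    return 0.5 * x + z

def f(x, z):  # hypothetical observable
    return (x + z) ** 2

def g(x):     # g(x) = E f(x, Z) for Z ~ N(0, 1): E (x + Z)^2 = x^2 + 1
    return x ** 2 + 1.0

X = rng.standard_normal(paths)      # X_0
Z = rng.standard_normal(paths)      # Z_0
X = h(X, Z)                         # X_1
M = np.zeros(paths)                 # M_0 = 0
for n in range(1, steps + 1):
    Z = rng.standard_normal(paths)  # Z_n, independent of the past
    M += f(X, Z) - g(X)             # add Y_n = f(X_n, Z_n) - g(X_n)
    X = h(X, Z)                     # X_{n+1} = h(X_n, Z_n)

print(abs(M.mean()))                # empirical E(M_10), close to 0
```

Note that the same draw $Z_n$ enters both $Y_n$ and the transition to $X_{n+1}$, exactly as in the recursion above.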
Best Answer
A filtration is a growing sequence of sigma-algebras $$\mathcal{F}_1\subseteq \mathcal{F}_2\subseteq\cdots \subseteq \mathcal{F}_n.$$
Now when talking of martingales we need to talk of conditional expectations, and in particular conditional expectations with respect to $\sigma$-algebras. So whenever we write $$ E[Y_{n+1}\mid X_1,X_2,\ldots,X_n],$$
we can alternatively write it as $$E[Y_{n+1}\mid \mathcal{F}_{n}],$$ where $\mathcal{F}_{n}$ is a sigma-algebra that makes the random variables $$X_1,\ldots,X_n$$ measurable. Finally, a filtration $\mathcal{F}_1\subseteq\cdots\subseteq \mathcal{F}_n$ is simply an increasing sequence of sigma-algebras. That is, we are conditioning on growing amounts of information.
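A standard illustration of this notation (not taken from the question itself): let $\xi_1,\xi_2,\ldots$ be i.i.d. with $\mathrm E[\xi_1]=0$, let $S_n=\xi_1+\cdots+\xi_n$, and let $\mathcal{F}_n=\sigma(\xi_1,\ldots,\xi_n)$. Then

$$
E[S_{n+1}\mid \mathcal{F}_n]
= E[S_n+\xi_{n+1}\mid \mathcal{F}_n]
= S_n + E[\xi_{n+1}]
= S_n,
$$

since $S_n$ is $\mathcal{F}_n$-measurable and $\xi_{n+1}$ is independent of $\mathcal{F}_n$, so $(S_n)$ is a martingale with respect to this filtration.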