If you're willing to assume that $\sigma(\sigma(X)\cup \mathcal{H})$ is independent of $\sigma(\mathcal{G})$ and that $X$ is integrable, then the assertion is indeed true. We need to show that
$$
E[X\mid \sigma(\mathcal{G}\cup\mathcal{H})]=E[X\mid\mathcal{H}],
$$
that is, we need to show that $E[X\mid\mathcal{H}]$ can serve as the conditional expectation of $X$ given $\sigma(\mathcal{G}\cup\mathcal{H})$, i.e. that it satisfies the three defining properties:
- $E[X\mid\mathcal{H}]$ is $\sigma(\mathcal{G}\cup\mathcal{H})$-measurable,
- $E[X\mid\mathcal{H}]$ is integrable,
- for all $A\in\sigma(\mathcal{G}\cup\mathcal{H})$: $$\int_A E[X\mid\mathcal{H}]\,\mathrm dP=\int_A X\,\mathrm dP.$$
The first two are obvious: $E[X\mid\mathcal{H}]$ is $\mathcal{H}$-measurable and $\mathcal{H}\subseteq\sigma(\mathcal{G}\cup\mathcal{H})$, and conditional expectations of integrable variables are integrable. For the third, we may assume that $X$ is non-negative (otherwise split $X=X^+-X^-$ and use linearity). Note that
$$
\sigma(\mathcal{G}\cup\mathcal{H})\ni A\mapsto \int_AE[X\mid\mathcal{H}]\,\mathrm dP
$$
and
$$
\sigma(\mathcal{G}\cup\mathcal{H})\ni A\mapsto \int_A X\,\mathrm dP
$$
are two finite measures on $\sigma(\mathcal{G}\cup\mathcal{H})$ with the same total mass $E[X]$. Hence, by the uniqueness theorem for finite measures (a consequence of Dynkin's $\pi$-$\lambda$ theorem), it is enough to show that the two measures agree on some $\cap$-stable generator of $\sigma(\mathcal{G}\cup\mathcal{H})$. Here, we use that
$$
\{A\cap B\mid A\in\mathcal{G},\,B\in\mathcal{H}\}
$$
is indeed a $\cap$-stable generator of $\sigma(\mathcal{G}\cup\mathcal{H})$ (why?) and hence it suffices to show that
$$
\int_{A\cap B} E[X\mid \mathcal{H}]\,\mathrm dP=\int_{A\cap B} X\,\mathrm dP,\quad A\in\mathcal{G},\,B\in\mathcal{H}.
$$
Try to show this using the independence assumption; a sketch of the computation is given below. In doing so, I think it will become clear that we in fact need the stronger independence assumption.
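In case it helps, here is one way the computation can go under the stronger assumption that $\sigma(\sigma(X)\cup\mathcal{H})$ is independent of $\mathcal{G}$: for $A\in\mathcal{G}$ and $B\in\mathcal{H}$, both $X\mathbb{1}_B$ and $E[X\mid\mathcal{H}]\mathbb{1}_B$ are measurable with respect to $\sigma(\sigma(X)\cup\mathcal{H})$, hence independent of $\mathbb{1}_A$, and therefore
$$
\int_{A\cap B}X\,\mathrm dP
=E[\mathbb{1}_A]\,E[X\mathbb{1}_B]
=P(A)\int_B X\,\mathrm dP
=P(A)\int_B E[X\mid\mathcal{H}]\,\mathrm dP
=E[\mathbb{1}_A]\,E\bigl[E[X\mid\mathcal{H}]\mathbb{1}_B\bigr]
=\int_{A\cap B}E[X\mid\mathcal{H}]\,\mathrm dP,
$$
where the middle equality is the defining property of $E[X\mid\mathcal{H}]$ applied to $B\in\mathcal{H}$. The weaker assumption that only $X$ is independent of $\mathcal{G}$ does not justify these factorizations, since $X\mathbb{1}_B$ need not be independent of $\mathbb{1}_A$.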
A counterexample showing that we indeed need the stronger assumption is the following: Let $U$ and $V$ be i.i.d. symmetric Bernoulli variables (i.e. $P(U=-1)=P(U=1)=\tfrac12$), $\mathcal{G}=\sigma(U)$, $\mathcal{H}=\sigma(V)$ and $X=UV$. Now, one can show that $X$ and $U$ are independent by showing that
$$
P(X=a,U=b)=P(X=a)P(U=b)
$$
for every combination of $a,b\in \{-1,1\}$ (the verification is spelled out below). But $\sigma(\sigma(X)\cup\sigma(V))$ is not independent of $\sigma(U)$ since, for example,
$$
P(X=1,V=1,U=1)\neq P(X=1,V=1)P(U=1),
$$
and hence we do not have the stronger independence assumption. In this case,
$$
E[X\mid \sigma(\mathcal{G}\cup\mathcal{H})]=UV\neq 0=E[U]\,V=E[X\mid\mathcal{H}].
$$
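To spell out the verifications used above (everything follows from $P(U=\pm1)=P(V=\pm1)=\tfrac12$ and the independence of $U$ and $V$): for the independence of $X$ and $U$,
$$
P(X=1,U=1)=P(U=1,V=1)=\tfrac14=P(X=1)\,P(U=1),
$$
and the other three combinations of $a,b\in\{-1,1\}$ are checked in the same way; on the other hand,
$$
P(X=1,V=1,U=1)=P(U=1,V=1)=\tfrac14\neq\tfrac18=P(X=1,V=1)\,P(U=1).
$$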
Yes. The equality $\mathbb{E}[e^{itX}|\mathcal{G}]=\mathbb{E}[e^{itX}]$ means that for any $A\in\mathcal G$ and for all $t$
$$
\mathbb{E}[e^{itX} \mathbb 1_A]=\mathbb{E}[e^{itX}]\cdot\mathbb P(A).
$$
Since $A^c\in\mathcal G$ as well, the same equality holds with $A$ replaced by $A^c$.
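Recall the form of Kac's theorem we will use: two real random variables $\xi$ and $\eta$ are independent if and only if their joint characteristic function factorizes,
$$
\mathbb{E}[e^{i(t\xi+s\eta)}]=\mathbb{E}[e^{it\xi}]\,\mathbb{E}[e^{is\eta}]\quad\text{for all }t,s\in\mathbb{R}.
$$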
Let us now compute the joint characteristic function of $X$ and $\mathbb 1_A$:
$$
\mathbb{E}[e^{itX} e^{is\mathbb 1_A}]=\mathbb{E}[e^{itX}\left(e^{is}\mathbb 1_A+\mathbb 1_{A^c}\right)]=e^{is}\mathbb{E}[e^{itX}\mathbb 1_A]+\mathbb{E}[e^{itX}\mathbb 1_{A^c}]
$$
$$
= e^{is} \mathbb{E}[e^{itX}]\mathbb P(A) + \mathbb{E}[e^{itX}]\mathbb P(A^c) = \mathbb{E}[e^{itX}]\left(e^{is}\mathbb P(A) +\mathbb P(A^c)\right) = \mathbb{E}[e^{itX}]\mathbb{E}[e^{is\mathbb 1_A}].
$$
So for all $t,s\in\mathbb R$, the joint characteristic function of $(X,\mathbb 1_A)$ factorizes into the product of the marginal characteristic functions, and Kac's theorem implies that $X$ and $\mathbb 1_A$ are independent. Since this holds for every $A\in\mathcal G$, $X$ is independent of $\mathcal G$.
Best Answer
This is not true. Let $U,V$ be i.i.d. $N(0,1)$, and set $X=U+V$, $Y=U$ and $Z=U-V$. It is well known (and easy to prove) that $X$ and $Z$ are independent. But $\sigma(Y,Z)=\sigma(U,V)$, so $E[X|Y,Z]=U+V$, whereas $E[X|Y]=E[U+V|U]=U$. So $\mathcal G=\sigma(Y)$, $\mathcal H=\sigma(Z)$ provides a counterexample.
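For completeness, a quick check of the two facts used here (both are standard for Gaussian vectors, so this is just spelling them out):
$$
\operatorname{Cov}(X,Z)=\operatorname{Cov}(U+V,U-V)=\operatorname{Var}(U)-\operatorname{Var}(V)=0,
$$
and since $(X,Z)$ is jointly Gaussian, zero covariance implies that $X$ and $Z$ are independent; moreover,
$$
E[X|Y]=E[U|U]+E[V|U]=U+E[V]=U,
$$
because $V$ is independent of $U$ and has mean $0$.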