Let $Y(j)$ be i.i.d. with finite mean and variance, and set
$\mu=\mathbb{E}(Y)$ and $\tau=\sqrt{\mathbb{E}(Y^2)}$.
If $(N(t))$ is an independent Poisson process with rate $\lambda$,
then the compound Poisson process is defined as
$$X(t)=\sum_{j=1}^{N(t)} Y(j).$$
The characteristic function of $X(t)$ is calculated as follows:
for real $s$ we have
\begin{eqnarray*}
\psi(s)&=&\mathbb{E}\left(e^{is X(t)}\right)\cr
&=&\sum_{j=0}^\infty \mathbb{E}\left(e^{is X(t)} \ | \ N(t)=j\right) \mathbb{P}(N(t)=j)\cr
&=&\sum_{j=0}^\infty \mathbb{E}\left(e^{is (Y(1)+\cdots +Y(j))} \ | \ N(t)=j\right) \mathbb{P}(N(t)=j)\cr
&=&\sum_{j=0}^\infty \mathbb{E}\left(e^{is (Y(1)+\cdots +Y(j))}\right) \mathbb{P}(N(t)=j)\cr
&=&\sum_{j=0}^\infty \phi_Y(s)^j {(\lambda t)^j\over j!} e^{-\lambda t}\cr
&=& \exp(\lambda t [\phi_Y(s)-1])
\end{eqnarray*}
where $\phi_Y$ is the characteristic function of $Y$.
From this we easily calculate $\mu(t):=\mathbb{E}(X(t))=\lambda t \mu$
and $\sigma(t):=\sigma(X(t))= \sqrt{\lambda t} \tau$.
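As a quick numerical sanity check on these moment formulas (not part of the original argument), here is a short NumPy simulation; the jump law $Y\sim\mathrm{Exp}(1)$ and the values of $\lambda$ and $t$ are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

lam, t = 3.0, 10.0          # illustrative rate and time horizon
n_paths = 20_000

# Jump law Y ~ Exp(1), so mu = E(Y) = 1 and tau = sqrt(E(Y^2)) = sqrt(2)
mu, tau = 1.0, np.sqrt(2.0)

N = rng.poisson(lam * t, size=n_paths)                    # N(t) for each path
X = np.array([rng.exponential(1.0, n).sum() for n in N])  # X(t) = sum of N(t) jumps

print(X.mean(), lam * t * mu)               # empirical mean vs. lambda*t*mu
print(X.std(), np.sqrt(lam * t) * tau)      # empirical SD vs. sqrt(lambda*t)*tau
```

With 20,000 paths the empirical mean and standard deviation land close to $\lambda t\mu = 30$ and $\sqrt{\lambda t}\,\tau = \sqrt{60}$.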
Take the expansion $\phi_Y(s)=1+is\mu -s^2\tau^2 /2+o(s^2)$ and substitute it into
the characteristic function of the normalized random variable ${(X(t)-\mu(t)) /\sigma(t)}$ to obtain
\begin{eqnarray*}
\psi^*(s) &=& \exp(-is\mu(t)/\sigma(t)) \exp(\lambda t [\phi_Y(s/\sigma(t))-1])\cr
&=& \exp(-s^2/2 +o(1))
\end{eqnarray*}
where $o(1)$ goes to zero as $t\to\infty$. This gives the central limit theorem
$${X(t)-\mu(t)\over\sigma(t)}\Rightarrow N(0,1).$$
We may replace $\sigma(t)$, for example, with $\tau \sqrt{N(t)}$ to get
$${X(t)-\mu(t)\over\tau \sqrt{N(t)}}= {X(t)-\mu(t)\over\sigma(t)} \sqrt{\lambda t \over N(t)} \Rightarrow N(0,1),$$
by Slutsky's theorem, since $\sqrt{\lambda t \over N(t)}\to 1$ in probability by the law of large numbers.
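The Slutsky correction factor can be checked by simulation: the sketch below (with an illustrative rate $\lambda$) shows $\sqrt{\lambda t/N(t)}$ concentrating at $1$ as $t$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0   # illustrative rate

# The factor sqrt(lambda*t / N(t)) concentrates at 1 as t grows,
# which is exactly what Slutsky's theorem requires.
for t in (10.0, 100.0, 1000.0):
    N = rng.poisson(lam * t, size=100_000)
    ratio = np.sqrt(lam * t / N[N > 0])    # discard the (rare) N(t) = 0 paths
    print(t, ratio.mean(), ratio.std())
```

The printed standard deviation shrinks roughly like $1/\sqrt{\lambda t}$, consistent with the law of large numbers.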
Added: Let $\sigma=\sqrt{\mathbb{E}(Y^2)-\mathbb{E}(Y)^2}$ be the standard deviation of $Y$,
and define the sequence of standardized random variables
$$T(n)={\sum_{j=1}^n Y(j) -n\mu\over\sigma\sqrt{n}},$$
so that
$${X(t)-\mu N(t)\over \sigma \sqrt{N(t)}}=T(N(t)).$$
Let $f$ be a bounded, continuous function on $\mathbb{R}$. By the usual
central limit theorem we have $\mathbb{E}(f(T(n)))\to \mathbb{E}(f(Z))$ where
$Z$ is a standard normal random variable.
We have, for any $N>1$,
\begin{eqnarray*}
|\mathbb{E}(f(T(N(t)))) - \mathbb{E}(f(Z))|
&\leq& \sum_{n=0}^\infty |\mathbb{E}(f(T(n))) - \mathbb{E}(f(Z))|\ \mathbb{P}(N(t)=n) \cr
&\leq& 2\|f\|_\infty \mathbb{P}(N(t)\leq N) +\sup_{n>N} |\mathbb{E}(f(T(n)))- \mathbb{E}(f(Z)) |.
\end{eqnarray*}
First choose $N$ large enough that the supremum term on the right-hand side is small, then let $t\to\infty$ so
that $\mathbb{P}(N(t)\leq N)\to 0$; together these show that
$$ \mathbb{E}(f(T(N(t)))) \to \mathbb{E}(f(Z)). $$
This shows that $T(N(t))$ converges in distribution to a standard normal as $t\to\infty$.
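This convergence can also be checked numerically. The sketch below uses $Y\sim\mathrm{Uniform}(0,1)$ jumps and the bounded continuous test function $f=\cos$, for which $\mathbb{E}(\cos Z)=e^{-1/2}$; all parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, t = 1.0, 500.0         # illustrative values; lambda*t should be large
n_paths = 50_000
mu, sigma = 0.5, 1.0 / np.sqrt(12.0)   # Y ~ Uniform(0,1): mean 1/2, sd 1/sqrt(12)

N = rng.poisson(lam * t, size=n_paths)
T = np.zeros(n_paths)
for k, n in enumerate(N):
    if n > 0:
        s = rng.uniform(0.0, 1.0, n).sum()
        T[k] = (s - n * mu) / (sigma * np.sqrt(n))   # T(N(t)) for this path

# f = cos is bounded and continuous; E[cos(Z)] = exp(-1/2) for Z ~ N(0,1)
print(np.cos(T).mean(), np.exp(-0.5))
```

The empirical value of $\mathbb{E}(\cos T(N(t)))$ agrees with $e^{-1/2}\approx 0.6065$ to a couple of decimal places.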
If you know the infinitesimal description of the Poisson process then you can proceed in the following manner.
Without thinning, we have
$$
\begin{align*}
P(N_{t+h}-N_t=0)&=1-\lambda h+o(h), \\
P(N_{t+h}-N_t=1)&=\lambda h+o(h)\quad\text{and} \\
P(N_{t+h}-N_t\geq2)&=o(h)
\end{align*}
$$
where $\lambda$ is the intensity. If each point is retained independently with probability $p$ (thinning), the intensity of the thinned process $N'$ should become $p\lambda$. To verify this, we must show that
$$
\begin{align*}
P(N'_{t+h}-N'_t=0)&=1-p\lambda h+o(h), \\
P(N'_{t+h}-N'_t=1)&=p\lambda h+o(h)\quad\text{and} \\
P(N'_{t+h}-N'_t\geq2)&=o(h).
\end{align*}
$$
Since the three claims are proved similarly, I will demonstrate the first one.
$$
\begin{align*}
P(N'_{t+h}-N'_t=0)&=P(N'_{t+h}-N'_{t}=0\mid N_{t+h}-N_t=0)(1-\lambda h+o(h)) \\
&\quad+P(N'_{t+h}-N'_t=0\mid N_{t+h}-N_t=1)(\lambda h+o(h)) \\
&\quad+P(N'_{t+h}-N'_t=0\mid N_{t+h}-N_t\geq2)\,o(h) \\
&=(1-\lambda h+o(h))+(1-p)(\lambda h+o(h))+o(h) \\
&=1-p\lambda h+o(h).
\end{align*}
$$
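A quick simulation corroborates the thinning claim: over a fixed window $[0,t]$ the thinned count is binomial given $N_t$, and its mean and variance both come out equal to $p\lambda t$, as a Poisson count of rate $p\lambda$ requires. The parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, p, t = 4.0, 0.3, 10.0   # illustrative rate, retention probability, horizon
n_paths = 200_000

N = rng.poisson(lam * t, size=n_paths)   # original counts on [0, t]
N_thin = rng.binomial(N, p)              # each point kept independently w.p. p

# A Poisson(p*lambda*t) count has mean == variance == p*lambda*t
print(N_thin.mean(), N_thin.var(), p * lam * t)
```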
Using the law of total expectation (the "tower rule"), we have $$\mathbb{E}\left[ e^{i\theta X_t}\right]= \mathbb{E}\left[\mathbb{E}\left[ e^{i\theta X_t}\mid N_t\right]\right].$$
Now, $$\mathbb{E}\left[ e^{i\theta X_t}\mid N_t=n\right]=\mathbb{E}\left[ \exp\left\{i\theta \sum_{i=1}^{n}\xi_i\right\}\Big|N_t=n\right]= \prod_{i=1}^{n}\mathbb{E}\left[ e^{i\theta \xi_i}\right]= \left(\mathbb{E}\left[ e^{i\theta \xi_1}\right]\right)^n,$$ where we used the fact that the $\xi_i$ are independent and identically distributed, and independent of $N_t$. This computes the inner expectation.
Now we compute the outer expectation. We have $$ \mathbb{E}\left[\mathbb{E}\left[ e^{i\theta X_t}\mid N_t\right]\right]=\mathbb{E}\left[\left(\mathbb{E}\left[ e^{i\theta \xi_1}\right]\right)^{N_t}\right]=\sum_{n=0}^{\infty}e^{-\lambda t}\frac{(\lambda t)^n}{n!}\left(\mathbb{E}\left[e^{i\theta \xi_1}\right]\right)^{n} = e^{-\lambda t} \sum_{n=0}^{\infty}\frac{\left(\lambda t\, \mathbb{E}\left[e^{i\theta \xi_1}\right]\right)^n}{n!}.$$ We recognize the sum as an exponential series: $$ \sum_{n=0}^{\infty}\frac{\left(\lambda t\, \mathbb{E}\left[e^{i\theta \xi_1}\right]\right)^n}{n!}= e^{\lambda t\, \mathbb{E}\left[e^{i\theta \xi_1}\right]} = \exp\left\{\lambda t\int e^{i\theta x}\,\mu(dx)\right\}.$$ Combining this with the factor $e^{-\lambda t}$ and using that $\mu$ is a probability measure, so $\int 1\,\mu(dx)=1$, we obtain $$ \mathbb{E}\left[ e^{i\theta X_t}\right] =\exp\left\{\lambda t\int \left(e^{i\theta x}-1 \right)\mu(dx)\right\} $$ as required.
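This formula can be verified numerically. The sketch below takes $\xi_i\sim N(0,1)$, so the jump law $\mu$ has characteristic function $e^{-\theta^2/2}$; the rate, horizon, and frequency are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, t, theta = 1.5, 2.0, 0.7   # illustrative rate, horizon, frequency
n_paths = 200_000

N = rng.poisson(lam * t, size=n_paths)
# For xi_i ~ N(0,1), the sum of n jumps is N(0, n), so X_t can be sampled directly
X = np.sqrt(N) * rng.standard_normal(n_paths)

empirical = np.exp(1j * theta * X).mean()
# exp(lam*t*(phi(theta) - 1)) with phi(theta) = exp(-theta^2/2) for N(0,1) jumps
exact = np.exp(lam * t * (np.exp(-theta**2 / 2) - 1.0))
print(empirical, exact)
```

Because the jump law is symmetric, the exact value is real, and the empirical characteristic function matches it to about two decimal places.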