In the case where you have two variables, the following holds:
Suppose $X_1$ has a Poisson distribution with parameter $\lambda_1$ and $X_2$ has a Poisson distribution with parameter $\lambda_2$. Then if $X_1$ and $X_2$ are independent, the variable $X_1+X_2$ has a Poisson distribution with parameter $\lambda_1+\lambda_2$.
This is intuitively clear if we regard the variables as counts from Poisson processes with a common unit of time: $X_1$ counts the events of a process averaging $\lambda_1$ events per unit time period, and $X_2$ counts the events of one averaging $\lambda_2$. The total number of events from both processes in a unit time period is then $X_1+X_2$, and by the independence assumption the combined process averages $\lambda_1+\lambda_2$ events per unit time period. So $X_1+X_2$ has a Poisson distribution with parameter $\lambda_1+\lambda_2$.
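As a quick sanity check of this intuition (a sketch, not part of the argument), we can simulate the two processes and superpose them: the sample mean and variance of $X_1+X_2$ should both be close to $\lambda_1+\lambda_2$. The sampler below uses Knuth's multiplication method; the rates $\lambda_1=2$, $\lambda_2=3$ are arbitrary.

```python
import math
import random

def poisson_sample(lam: float, rng: random.Random) -> int:
    """Draw one Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)           # fixed seed for reproducibility
lam1, lam2 = 2.0, 3.0            # arbitrary illustrative rates
n = 200_000

sums = [poisson_sample(lam1, rng) + poisson_sample(lam2, rng) for _ in range(n)]
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n

# A Poisson(lam1 + lam2) variable has mean and variance both equal to 5.0,
# so both sample statistics should land close to that value.
print(f"sample mean = {mean:.3f}, sample variance = {var:.3f}")
```

That the sample variance, not just the mean, matches $\lambda_1+\lambda_2$ is what distinguishes a genuinely Poisson sum from a mere additivity of averages.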
Rigorously, we can compute the probability mass function, $p_Y$, of $Y=X_1+X_2$ as follows:
For our variables $X_1$ and $X_2$, we
have for $i\ge0$:
$$P[X_1=i]= {\lambda_1^i\over i!} e^{-\lambda_1}\quad\text{and}\quad P[X_2=i]= {\lambda_2^i\over i!} e^{-\lambda_2}.$$
Let $k\ge0$. Then:
$$
\eqalign{
p_Y(k)
&=\sum_{i=0}^kP[X_1=i,X_2=k-i]\cr
&=\sum_{i=0}^kP[X_1=i]\cdot P[ X_2=k-i]\cr
&=\sum_{i=0}^k{\lambda_1^i\over i!}e^{-\lambda_1}\cdot{\lambda_2^{k-i}\over(k-i)!}e^{-\lambda_2}\cr
&=\sum_{i=0}^k{\lambda_1^i\lambda_2^{k-i}\over i!\,(k-i)!}e^{-(\lambda_1+\lambda_2)}\cr
&= e^{-(\lambda_1+\lambda_2)}\cdot\sum_{i=0}^k{\lambda_1^i\lambda_2^{k-i}\over i!\,(k-i)!}\cr
&={1\over k!}\cdot e^{-(\lambda_1+\lambda_2)}\cdot\sum_{i=0}^k{k!\over i!\,(k-i)!}\,\lambda_1^i\lambda_2^{k-i}\cr
&={(\lambda_1+\lambda_2)^k\over k!}\cdot e^{-(\lambda_1+\lambda_2)},\cr
}
$$
where the second equality above used the independence of $X_1$ and $X_2$ and the last equality used the Binomial Theorem.
So, $$ p_Y(k)= {(\lambda_1+\lambda_2)^k\over k!}\cdot e^{-(\lambda_1+\lambda_2)},\quad k\ge 0, $$ which we recognize as the Poisson distribution with parameter $\lambda_1+\lambda_2$.
The corresponding result for a sum of $n\ge 1$ independent Poisson variables now follows easily by induction.
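The derivation above can also be checked numerically: evaluating the convolution sum term by term and comparing it against the closed-form pmf (a sketch in Python; the rates are arbitrary).

```python
import math

def poisson_pmf(lam: float, k: int) -> float:
    """Poisson probability mass function: lam^k e^{-lam} / k!."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam1, lam2 = 1.5, 2.5  # arbitrary illustrative rates

for k in range(20):
    # Convolution sum from the derivation: sum_i P[X1 = i] * P[X2 = k - i].
    conv = sum(poisson_pmf(lam1, i) * poisson_pmf(lam2, k - i) for i in range(k + 1))
    # Closed form: Poisson(lam1 + lam2) pmf at k.
    closed = poisson_pmf(lam1 + lam2, k)
    assert math.isclose(conv, closed, rel_tol=1e-12)

print("convolution agrees with the Poisson(lam1 + lam2) pmf for k = 0..19")
```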
Best Answer
When $N \sim \operatorname{Poisson}(\lambda)$, we would employ the law of total expectation. Let $$S = \sum_{i=1}^N X_i,$$ so we have $$M_S(t) = \operatorname{E}[e^{tS}] = \operatorname{E}[\operatorname{E}[e^{tS} \mid N]].$$

You have already calculated the interior, conditional expectation: $$\operatorname{E}[e^{tS} \mid N] = \exp\left(N\theta(e^t - 1)\right).$$ Thus we need to compute $$M_S(t) = \operatorname{E}\left[\exp\left(N\theta(e^t - 1)\right)\right].$$

But you can observe that this is simply the MGF of $N$ evaluated at $\theta(e^t - 1)$, and since $N$ is Poisson with MGF $$M_N(u) = e^{\lambda(e^u - 1)},$$ we have $$M_S(t) = M_N(\theta(e^t-1)) = \exp\left(\lambda(e^{\theta(e^t-1)} - 1)\right).$$
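As a numeric check of the final formula (a sketch with arbitrary $\lambda$, $\theta$, $t$): expanding $\operatorname{E}[M_X(t)^N]$ as a series over $N$, truncated where the Poisson tail is negligible, should reproduce the closed form.

```python
import math

lam, theta, t = 2.0, 0.7, 0.3  # arbitrary illustrative parameters

# MGF of one Poisson(theta) summand at t: M_X(t) = exp(theta * (e^t - 1)).
mx = math.exp(theta * (math.exp(t) - 1.0))

# E[e^{tS}] = sum_n P[N = n] * M_X(t)^n, truncating the Poisson(lam) tail at n = 60.
series = sum(lam**n * math.exp(-lam) / math.factorial(n) * mx**n for n in range(60))

# Closed form: M_S(t) = exp(lam * (e^{theta * (e^t - 1)} - 1)) = exp(lam * (mx - 1)).
closed = math.exp(lam * (mx - 1.0))

print(f"series = {series:.12f}, closed form = {closed:.12f}")
```

The series is exactly the tower-property computation from the derivation, so agreement here is expected; the check guards only against algebra slips in the closed form.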