I don't know if this answers the question, but here are my two cents:
If we start from the "SDE" definition of the Doléans-Dade exponential for a general semimartingale $X_t$, then the Doléans-Dade exponential is the process $Z_t$ that solves the following equation:
$$
\begin{cases}
dZ_t&=Z_{t-}dX_t,
\\
Z_0 &=1.
\end{cases}
$$
In discrete time this gives an analogy which allows us to define the Doléans-Dade exponential as the unique discrete process such that:
$$
\begin{cases}
\Delta Z_n&=Z_{n-1}\Delta X_n,
\\
Z_0 &=1.
\end{cases}
$$
where $\Delta Y_n$ means $Y_n-Y_{n-1}$ for any discrete process $(Y_n)_{n\ge 0}$. This can be solved by recurrence in the form:
$$
Z_n=\prod_{i=0}^{n}(1 +\Delta X_i)
$$
with the convention $\Delta X_0=0$, so that $Z_0=1$.
Notice that when expressing the solution for a continuous-time pure-jump semimartingale you get almost the same answer (see Jacod and Shiryaev, "Limit Theorems for Stochastic Processes", end of Chapter 1).
(Note that this exponential can take negative values!)
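The discrete recursion and its closed-form solution are easy to check numerically. Here is a minimal sketch with made-up increments $\Delta X_1,\dots,\Delta X_5$ (all values hypothetical); it also illustrates the remark above: an increment below $-1$ makes $Z$ go negative.

```python
import math

def stochastic_exponential(increments):
    """Solve Z_n - Z_{n-1} = Z_{n-1} * dX_n with Z_0 = 1 by recursion."""
    Z = [1.0]
    for dx in increments:
        Z.append(Z[-1] * (1.0 + dx))
    return Z

# Hypothetical increments Delta X_1, ..., Delta X_5 (Delta X_0 = 0 by convention).
dX = [0.1, -0.2, 0.05, -1.5, 0.3]  # note the jump below -1

Z = stochastic_exponential(dX)

# Closed form: Z_n = prod_{i=1}^n (1 + Delta X_i).
closed = [math.prod(1.0 + d for d in dX[:n]) for n in range(len(dX) + 1)]

assert all(math.isclose(a, b) for a, b in zip(Z, closed))
assert Z[4] < 0  # the increment Delta X_4 = -1.5 < -1 turns Z negative
```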
I don't know if this helps.
Best regards
Yes, the stopped process of a predictable process is again a predictable process. It can be shown by a monotone class argument and the fact that the predictable sigma-field is generated by sets of the form
$$
A\times \{0\}, \;A\in\mathcal{F}_0\quad\text{and}\quad A\times (s,t],\;A\in\mathcal{F}_s,\;0\leq s<t.
$$
Also I'm pretty sure that your definition of $M_n$ should be $\int_0^n h_s\,\mathrm{d} X_s$.
To show the above result we let
$$
\mathcal{H}=\{(X_t)_{t\geq 0}\mid \text{if } (X_t)_{t\geq 0} \text{ is predictable, then }(X_t^{\tau})_{t\geq 0} \text{ is also predictable for every stopping time }\tau \},
$$
and
$$
\mathcal{K}=\{(X_t)_{t\geq 0}\mid X_t(\omega)=1_{A\times \{0\}}(\omega,t),\; A\in\mathcal{F}_0\text{ or } X_t(\omega)=1_{A\times (s,r]}(\omega,t),\;0\leq s<r,\; A\in\mathcal{F}_s\},
$$
i.e. $\mathcal{K}$ is the set of (indicators of) elementary predictable processes. Then $\mathcal{H}$ is a vector space containing all constant processes, it is closed under bounded monotone convergence (i.e. if $(f_n)\subseteq \mathcal{H}$ is a uniformly bounded increasing sequence, then $f=\sup_n f_n\in\mathcal{H}$), and $\mathcal{K}$ is stable under multiplication. Hence we are in the scope of the monotone class theorem.
Now if we can show that $\mathcal{K}\subseteq \mathcal{H}$, then every bounded predictable process is in $\mathcal{H}$. For a general predictable process $X=(X_t)_{t\geq 0}$, we let $X^n=(X\wedge n)\vee (-n)$, so that $X^n$ is a bounded predictable process for each $n\in\mathbb{N}$. In particular $X^n\in\mathcal{H}$ for each $n$, and therefore $(X^n)^{\tau}$ is predictable for each $n$, where $\tau$ is any stopping time. Now we get that $X^{\tau}$ is predictable because
$$
X^{\tau}_t(\omega)=\lim_{n\to\infty} (X^n)^{\tau}_t(\omega),\quad (\omega,t)\in \Omega\times [0,\infty).
$$
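The truncation step is elementary but worth seeing concretely: $X^n=(X\wedge n)\vee(-n)$ is bounded by $n$ and agrees with $X$ once $n$ exceeds $|X|$, which gives the pointwise convergence used above. A tiny numerical illustration with made-up values:

```python
import numpy as np

def truncate(x, n):
    """(x ∧ n) ∨ (-n): clip x to the interval [-n, n]."""
    return np.maximum(np.minimum(x, n), -n)

x = np.array([-7.5, -0.3, 2.0, 100.0])  # hypothetical values of X_t(ω)

assert np.all(np.abs(truncate(x, 3)) <= 3)  # X^n is bounded by n
assert np.allclose(truncate(x, 1000), x)    # X^n = X once n ≥ sup|X|
```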
What is left is to show that $\mathcal{K}\subseteq\mathcal{H}$.
The candidate martingale has perhaps been miscopied. As per the comment by @Sesame, if instead one defines $M_n$ by $$M_n = \frac{\exp(t\sum_{i=1}^n Y_i)}{m_Y(t)^n},$$ then the usual computations carry through to show this is a martingale.
Summarizing the steps presented in the comment, we compute $$\mathbb{E}(M_{n+1}|\mathscr{F}_n)=\frac{1}{m_Y(t)^{n+1}}\exp(t \sum_{i=1}^n Y_i)\mathbb{E}(\exp(tY_{n+1})|\mathscr{F}_n)$$ $$=\frac{M_n}{m_Y(t)}\mathbb{E}(\exp(tY_{n+1}))=M_n$$ where we have, first, used the "pulling out what is known" property since $\exp(t \sum_{i=1}^n Y_i) \in \mathscr{F}_n$ and, second, used that $\exp(t Y_{n+1})$ is independent of $\mathscr{F}_n$. Thus $\mathbb{E}(M_{n+1}|\mathscr{F}_n)=M_n$, so $M_n$ is a martingale as desired.