Hi,
Here is a theorem that might answer your question (it comes from Chesney, Jeanblanc-Piqué and Yor's book "Mathematical Methods for Financial Markets").
It is Theorem 11.2.8.1, page 621. Here it is:
(Edit note: as mentioned by G. Lowther, be careful: there is a typo in the book regarding the domain of integration in the conditions on $\psi$, defined hereafter.)
Let $X$ be an $R^d$-valued Lévy process and $F^X$ its natural filtration. Let $M$ be an $F^X$-local martingale. Then there exist an $R^d$-valued predictable process $\phi$ and a predictable function $\psi : R^+ \times \Omega \times R^d \to R$ such that:
- $\int_0^t \phi^i(s)^2\,ds <\infty$ almost surely
- $\int_0^t \int_{|x|> 1} |\psi(s,x)|\,ds\,\nu(dx) <\infty$ almost surely
- $\int_0^t \int_{|x|\le 1} \psi(s,x)^2\,ds\,\nu(dx) <\infty$ almost surely
and
$M_t=M_0+ \sum_{i=1}^d \int_0^t \phi^i(s)\,dW^i_s + \int_0^t \int_{R^d} \psi(s,x)\,\tilde{N}(ds,dx),$
where $\tilde{N}(ds,dx)$ is the compensated jump measure of the Lévy process $X$ and $\nu$ its Lévy measure.
Moreover, if $(M_t)$ is a square-integrable martingale, then we have:
$E[(\int_0^t \phi^i(s)dW^i_s)^2]=E[\int_0^t \phi^i(s)^2ds]<\infty$
and
$E[(\int_0^t \int_{R^d} \psi(s,x)\tilde{N}(ds,dx))^2]=E[ \int_0^t ds \int_{R^d} \psi(s,x)^2\nu(dx)]<\infty$
and $\phi$ and $\psi$ are essentially unique.
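As a quick numerical illustration of the isometry for the jump part (a minimal sketch of my own, not from the book): take the simplest case where $X$ is a standard Poisson process with rate $\lambda$, so $\nu = \lambda\,\delta_1$, and $\psi \equiv 1$. Then $\int_0^t \int \psi\,\tilde{N}(ds,dx) = N_t - \lambda t$, and the right-hand side of the isometry equals $\lambda t$:

```python
import numpy as np

# Monte Carlo check of the L^2 isometry
#   E[(int_0^t int psi dN~)^2] = E[int_0^t ds int psi^2 nu(dx)]
# in the simplest case: X a standard Poisson process with rate lam
# (Levy measure nu = lam * delta_1) and psi == 1, so that
#   int_0^t int psi dN~ = N_t - lam*t, and the RHS equals lam*t.
rng = np.random.default_rng(0)
lam, t, n_paths = 2.0, 3.0, 200_000

N_t = rng.poisson(lam * t, size=n_paths)  # N_t ~ Poisson(lam*t)
M_t = N_t - lam * t                       # compensated integral (psi == 1)

lhs = np.mean(M_t**2)   # Monte Carlo estimate of E[M_t^2]
rhs = lam * t           # exact value of the right-hand side
print(lhs, rhs)         # the two agree up to Monte Carlo error
```

The sample mean of $M_t$ is also close to $0$, consistent with $M$ being a martingale started at $0$.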
The theorem is not proved in the book, but there is a reference to the following papers:
1. H. Kunita and S. Watanabe. On square integrable martingales. Nagoya Math. J., 30:209–245, 1967.
2. H. Kunita. Representation of martingales with jumps and applications to mathematical finance. In H. Kunita, S. Watanabe, and Y. Takahashi, editors, Stochastic Analysis and Related Topics in Kyoto. In Honour of Kiyosi Itô, Advanced Studies in Pure Mathematics, pages 209–233. Oxford University Press, 2004.
Regards
The paper seems to be written rather carelessly. In particular, it is indeed unhelpful to denote a random object and its realizations by the same symbol, leaving it to the reader to figure out which is which at each occurrence.
Further, it is clear that any sequence (say, the zero sequence) can be a realization of a random sequence $\xi$ (if e.g. $\xi$ is the sequence of iid normal random variables).
So, the result should be true, not for all realizations of $\xi$, but for almost all of them (excluding a set of realizations of probability $0$).
Indeed, the central point in the proof seems to be the use of the ergodic theorem in the middle of page 466 of the paper, where the convergence must of course be in the almost sure sense (but this is not specified in the paper).
Best Answer
$\newcommand\R{\mathbb R}\newcommand\ip[1]{\langle #1 \rangle}$In these notes, two related definitions of truncation functions are given. In Definition 5.6, a truncation function is defined as a bounded function $h\colon\R^d\to\R^d$ such that $h(x) = x$ in a neighborhood of $0$. In Definition 5.7, a truncation function $h'$ is defined as in your post. The relation between $h$ and $h'$ is given by the formula $h(x)=xh'(x)$ for all $x\in\R^d$.
In the proof of Theorem 5.13 in these notes, a truncation function $h$ as in Definition 5.6 is used. You are of course right that $h\equiv 0$ is not a valid truncation function according to either definition.
However, since in that proof $\rho$ is a probability measure, we can rewrite formula (5.25) in the notes as $$\hat\rho_n(u)= \exp\Big(ib_n+\int_{\R^d}\big(e^{i\ip{u,x}}-1-i\ip{u,h(x)}\big)\nu_n(dx)\Big),$$ where $h$ is any truncation function according to Definition 5.6 (e.g. $h(x)\equiv x\,1(|x|\le1)$), $\nu_n:=t_n^{-1}\rho^{t_n}$, and $b_n:=\int_{\R^d}\ip{u,h(x)}\,\nu_n(dx)\in\R$ (since the measure $\nu_n$ is finite). Then $\rho_n$ is infinitely divisible and has the Lévy--Khintchine representation with the triplet $(b_n,0,\nu_n)_h$, as desired.
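To spell out the rewriting (assuming, as is standard when the Lévy measure is finite, that (5.25) expresses the exponent in compound-Poisson form), one just adds and subtracts the truncation term:

```latex
\begin{align*}
\hat\rho_n(u)
&= \exp\Big(\int_{\mathbb R^d}\big(e^{i\langle u,x\rangle}-1\big)\,\nu_n(dx)\Big)\\
&= \exp\Big(i\underbrace{\int_{\mathbb R^d}\langle u,h(x)\rangle\,\nu_n(dx)}_{=:\,b_n}
   +\int_{\mathbb R^d}\big(e^{i\langle u,x\rangle}-1-i\langle u,h(x)\rangle\big)\,\nu_n(dx)\Big),
\end{align*}
```

where both integrals converge absolutely because $h$ is bounded and $\nu_n$ is a finite measure.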
(I would suggest using, instead of these notes, one of a number of known, more authoritative sources on infinitely divisible distributions.)