We have to consider three cases separately:
- $S_{k-1}(\omega)<0$: Then $S_k(\omega) \leq 0$ (since $|X_k(\omega)|=1$) and therefore $$|S_k(\omega)|-|S_{k-1}(\omega)| = -S_k(\omega)+ S_{k-1}(\omega) = - X_k(\omega).$$
- $S_{k-1}(\omega)=0$: Then $|S_k(\omega)|=1$ and therefore $$|S_k(\omega)|-|S_{k-1}(\omega)|=1.$$
- $S_{k-1}(\omega)>0$: Then $S_k(\omega) \geq 0$ and therefore $$|S_k(\omega)| - |S_{k-1}(\omega)| = S_k(\omega)-S_{k-1}(\omega) =X_k(\omega).$$
This shows
$$|S_k|-|S_{k-1}| = -X_k 1_{\{S_{k-1} <0\}}+ 1_{\{S_{k-1}=0\}}+ X_k 1_{\{S_{k-1}>0\}}. \tag{1}$$
Using this identity, the fact that the indicators are $\mathcal{F}_{k-1}$-measurable, and the independence of $X_k$ from $\mathcal{F}_{k-1}$ together with $\mathbb{E}(X_k)=0$, we get
$$\mathbb{E}(|S_k|-|S_{k-1}| \mid \mathcal{F}_{k-1}) = 1_{\{S_{k-1}=0\}}.$$
Consequently, using $S_0=0$ and telescoping,
$$M_n = |S_n| - \sum_{k=1}^n 1_{\{S_{k-1}=0\}} = \sum_{k=1}^n (|S_k|-|S_{k-1}|-1_{\{S_{k-1}=0\}}).$$
By $(1)$, this implies
\begin{align}
M_n &= \sum_{k=1}^n (-X_k 1_{\{S_{k-1}<0\}}+X_k 1_{\{S_{k-1}>0\}})\\
&= \sum_{k=1}^n (- 1_{\{S_{k-1}<0\}}+ 1_{\{S_{k-1}>0\}})X_k\\
&= \sum_{k=1}^n (- 1_{\{S_{k-1}<0\}}+ 1_{\{S_{k-1}>0\}})(S_k-S_{k-1})\\
&= (H \bullet S)_n\end{align}
for
$$H_k := - 1_{\{S_{k-1}<0\}}+ 1_{\{S_{k-1}>0\}}.$$
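As a quick sanity check of the pathwise identity $M_n = (H \bullet S)_n$ (not part of the proof), one can simulate a simple random walk and compare the two sides directly:

```python
import numpy as np

# Simulate a simple random walk and verify M_n = (H . S)_n pathwise.
rng = np.random.default_rng(0)
n = 1000
X = rng.choice([-1, 1], size=n)           # i.i.d. steps X_1, ..., X_n
S = np.concatenate(([0], np.cumsum(X)))   # S_0 = 0, S_k = X_1 + ... + X_k

# M_n = |S_n| - sum_{k=1}^n 1{S_{k-1} = 0}
M = np.abs(S[1:]) - np.cumsum(S[:-1] == 0)

# (H . S)_n with H_k = -1{S_{k-1} < 0} + 1{S_{k-1} > 0}
H = -(S[:-1] < 0).astype(int) + (S[:-1] > 0).astype(int)
HS = np.cumsum(H * X)

assert np.array_equal(M, HS)  # the identity holds for every n up to 1000
```

Because the identity is algebraic (it holds $\omega$-by-$\omega$), the two arrays agree exactly on every path, not just on average.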
Applying the Itô formula to the function $f(t,x)=(x^{2}-t)^{2}$, so that $f(t,W_{t})=(W_{t}^{2}-t)^{2}$, we compute the partial derivatives:

$f_{t}(t,x)=-2(x^{2}-t)$

$f_{x}(t,x)=4x(x^{2}-t)$

$f_{xx}(t,x)=4(3x^{2}-t)$

$f(t,W_{t})=f(0,W_{0})+\int_{0}^{t}f_{t}(u,W_{u})du+\int_{0}^{t}f_{x}(u,W_{u})dW_{u}+\frac{1}{2}\int_{0}^{t}f_{xx}(u,W_{u})d[W]_{u}$
$f(t,W_{t})=0+\int_{0}^{t}-2(W_{u}^{2}-u)du+\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}+\frac{1}{2}\int_{0}^{t}4(3W_{u}^{2}-u)d[W]_{u}$
but $[W]_{u}=u$ so
$f(t,W_{t})=\int_{0}^{t}-2(W_{u}^{2}-u)du+\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}+\frac{1}{2}\int_{0}^{t}4(3W_{u}^{2}-u)du$
$f(t,W_{t})=\int_{0}^{t}\left(-2(W_{u}^{2}-u)+2(3W_{u}^{2}-u)\right)du+\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}$
$f(t,W_{t})=M_{t}^{2}=\int_{0}^{t}4W_{u}^{2}du+\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u},$ where $M_{t}:=W_{t}^{2}-t$.
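The calculus above can be double-checked symbolically: verify the three partial derivatives of $f(t,x)=(x^2-t)^2$ and confirm that the drift term $f_t + \tfrac12 f_{xx}$ (using $d[W]_u = du$) reduces to $4x^2$:

```python
import sympy as sp

# Symbolic check of the partial derivatives of f(t, x) = (x^2 - t)^2
# and of the combined du-integrand f_t + (1/2) f_xx = 4 x^2.
t, x = sp.symbols('t x', real=True)
f = (x**2 - t)**2

f_t = sp.diff(f, t)
f_x = sp.diff(f, x)
f_xx = sp.diff(f, x, 2)

assert sp.simplify(f_t - (-2*(x**2 - t))) == 0
assert sp.simplify(f_x - 4*x*(x**2 - t)) == 0
assert sp.simplify(f_xx - 4*(3*x**2 - t)) == 0

# Drift from Ito's formula (since d[W]_u = du): f_t + (1/2) f_xx
drift = sp.simplify(f_t + sp.Rational(1, 2)*f_xx)
assert sp.simplify(drift - 4*x**2) == 0
```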
The process $\int_{0}^{t}4W_{u}^{2}du$ is increasing; moreover, being continuous and adapted, it is predictable.
The stochastic integral $\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}$ is a martingale because, for $s\leq t$,
$\mathbb{E}\left[\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}|\mathcal{F}_{s}\right]=\mathbb{E}\left[\int_{0}^{s}4W_{u}(W_{u}^{2}-u)dW_{u}|\mathcal{F}_{s}\right]+\mathbb{E}\left[\int_{s}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}|\mathcal{F}_{s}\right]$
$\mathbb{E}\left[\int_{0}^{s}4W_{u}(W_{u}^{2}-u)dW_{u}\,\middle|\,\mathcal{F}_{s}\right]=\int_{0}^{s}4W_{u}(W_{u}^{2}-u)dW_{u}$, because $\int_{0}^{s}4W_{u}(W_{u}^{2}-u)dW_{u}$ is $\mathcal{F}_{s}$-measurable.

For the second term, argue heuristically: $dW_{u}=W_{u+du}-W_{u}\sim N(0,du)$, and the integrand $4W_{u}(W_{u}^{2}-u)$ is $\mathcal{F}_{u}$-measurable, hence independent of the increment $W_{u+du}-W_{u}$. Each infinitesimal summand therefore has conditional mean zero:

$\mathbb{E}\left[\int_{s}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}\,\middle|\,\mathcal{F}_{s}\right]=\int_{s}^{t}\mathbb{E}\left[4W_{u}(W_{u}^{2}-u)\,\mathbb{E}\left(dW_{u}\mid\mathcal{F}_{u}\right)\middle|\,\mathcal{F}_{s}\right]=0.$

So $\mathbb{E}\left[\int_{0}^{t}4W_{u}(W_{u}^{2}-u)dW_{u}\,\middle|\,\mathcal{F}_{s}\right]=\int_{0}^{s}4W_{u}(W_{u}^{2}-u)dW_{u}$, i.e. the stochastic integral is a martingale.
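Since the stochastic-integral term has mean zero, the decomposition implies $\mathbb{E}[M_t^2] = \mathbb{E}\int_0^t 4W_u^2\,du = 4\int_0^t u\,du = 2t^2$. A Monte Carlo sketch (not a proof) of this consequence, using only that $W_t \sim N(0,t)$:

```python
import numpy as np

# Monte Carlo check of E[(W_t^2 - t)^2] = 2 t^2, which follows from the
# decomposition because the stochastic integral has expectation zero.
rng = np.random.default_rng(42)
t, n_paths = 1.0, 200_000

W_t = rng.normal(0.0, np.sqrt(t), size=n_paths)   # W_t ~ N(0, t)
M_t_sq = (W_t**2 - t)**2                          # (W_t^2 - t)^2 pathwise

est = M_t_sq.mean()
assert abs(est - 2 * t**2) < 0.1                  # should be close to 2 t^2
```

With $2\cdot 10^5$ samples the standard error of the estimate is about $0.017$, so the tolerance of $0.1$ is comfortable.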
To get the specific coefficient $\lambda(2p-1)$, write down the infinitesimal matrix $A$ for the continuous-time Markov chain $X(t)$, and then apply $A$ to the identity function $f(i)=i$ (thought of as a column vector); call the resulting function $g:=Af$. (In this specific case, $g$ is the constant function with constant value $\lambda(2p-1)$.) Then $X(t) -\int_0^t g(X(s))\,ds$ is a martingale. The same device works for more general $f$ and more general continuous-time Markov chains, of which a compound Poisson process is a special case (for which $g$ is again constant).
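A minimal sketch of this device, assuming the chain jumps at rate $\lambda$, up with probability $p$ and down with probability $1-p$ (the names `apply_generator`, `lam`, `p` are illustrative, not from the source): the generator acts on a function $f$ by $(Af)(i)=\lambda\big(p\,f(i+1)+(1-p)\,f(i-1)-f(i)\big)$, and applying it to the identity $f(i)=i$ yields the constant $\lambda(2p-1)$.

```python
def apply_generator(f, i, lam, p):
    """(A f)(i) for a continuous-time nearest-neighbour walk that jumps
    at rate lam, up with probability p and down with probability 1 - p."""
    return lam * (p * f(i + 1) + (1 - p) * f(i - 1) - f(i))

lam, p = 3.0, 0.7
identity = lambda i: i

# g := A f with f the identity; it is constant, equal to lam * (2p - 1).
g = [apply_generator(identity, i, lam, p) for i in range(-5, 6)]
assert all(abs(v - lam * (2 * p - 1)) < 1e-12 for v in g)
```

By the argument quoted above, $X(t) - \lambda(2p-1)\,t$ is then a martingale for this chain.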