Strong Markov Property – Durrett

markov-chains, probability-theory

I recently had great success with my first question here so I will boldly go on to a second.
Here goes:

I'm studying Markov chains in Rick Durrett's *Probability: Theory and Examples* and I'm stuck on the definition of the strong Markov property. I know roughly what it should say, but I don't understand his way of stating it. I'm going to give you a lot of information, hopefully enough, but please ask for more if you need it.

Some definitions: we have a nice measurable space $(S,\mathcal{S})$ (nice meaning there is a 1-1 map between $S$ and $\mathbb{R}$ that is measurable with measurable inverse), and we then define
$$\Omega=\{(\omega_{1},\omega_{2},\ldots):\omega_{i}\in S\}$$
$$\mathcal{F}=\mathcal{S}\times\mathcal{S}\times\cdots$$
$$P=P_{\mu},\text{ the law of the chain with initial distribution }\mu\text{ (given by the formula below)}$$
$$X_{n}(\omega)=\omega_{n}$$

We have $$P(X_{j}\in B_{j},\,0\leq j \leq n)=\int_{B_{0}}\mu(dx_{0})\int_{B_{1}}p(x_{0},dx_{1})\cdots\int_{B_{n}}p(x_{n-1},dx_{n})$$
where the $p$'s are transition probabilities: for fixed $x$ (the first variable), $p(x,\cdot)$ is a probability measure, and for fixed $B\in\mathcal{S}$ (the second variable), $x\mapsto p(x,B)$ is a measurable function.
These finite-dimensional measures are consistent, so Kolmogorov's extension theorem gives us the measure on the infinite product space (as I understand it).
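For a finite state space the nested integrals reduce to nested sums, which may make the formula easier to parse. A minimal sketch, with a made-up two-state chain (the matrix `p` and distribution `mu` are my own illustration, not from Durrett):

```python
import numpy as np

# Hypothetical two-state chain on S = {0, 1}
mu = np.array([0.5, 0.5])          # initial distribution mu
p = np.array([[0.9, 0.1],          # p[x, y] = p(x, {y}), the transition kernel
              [0.4, 0.6]])

def prob(B0, B1, B2):
    """P(X_0 in B0, X_1 in B1, X_2 in B2) via the nested-sum
    version of the integral formula."""
    total = 0.0
    for x0 in B0:
        for x1 in B1:
            for x2 in B2:
                total += mu[x0] * p[x0, x1] * p[x1, x2]
    return total

# Sanity check: summing over the whole space gives 1
print(prob([0, 1], [0, 1], [0, 1]))  # ≈ 1.0
```

Consistency here is visible directly: summing the three-step formula over all of $B_2=S$ collapses it to the two-step formula, which is what Kolmogorov's extension theorem needs.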

His definition is then as follows:

Suppose that for each $n$, $Y_{n}:\Omega\rightarrow\mathbb{R}$ is measurable and $|Y_{n}|\leq M$ for all $n$. Then
$$E_{\mu}(Y_{N}\circ\theta_{N}\mid\mathcal{F}_{N})=E_{X_{N}}Y_{N}\quad\text{on }\{N<\infty\}$$
Here $N$ is a stopping time and $\theta_{N}$ a shift operator ("drops the first $N$ elements of the $\omega$-sequence").
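In the sequence-space setup above, the shift is easy to write down concretely. A minimal sketch, where a finite tuple stands in for $\omega$ and the function name `shift` is my own (not Durrett's notation):

```python
def shift(omega, n):
    """theta_n drops the first n coordinates:
    (w_1, w_2, w_3, ...) -> (w_{n+1}, w_{n+2}, ...)."""
    return omega[n:]

print(shift((5, 3, 8, 2), 2))  # -> (8, 2)
```

So $Y_{N}\circ\theta_{N}$ evaluated at $\omega$ means: first throw away the part of the path up to time $N(\omega)$, then apply $Y_{N(\omega)}$ to what remains.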

So I know I'm being a bit imprecise here. I reckon I know what all the elements of the theorem are, but I have trouble putting it all together. I hope someone bothers to help.

Thanks in advance,
Henrik

Update v3.b:

I almost figured it out, I will update with my findings shortly (hope someone cares).

So I still have some problems; can someone help me with these notions:
the meaning of $P_{x}=P_{\delta_{x}}$; why $P_{\mu}(A)=\int P_{x}(A)\, \mu(dx)$; and what $E_{X_{n}}$ looks like explicitly.

Best Answer

Markov chains are irrelevant here, hence let us concentrate on the strong Markov property. Start from any process $(Z_n)_{n\geqslant0}$ (this is just a collection of random variables defined on the same probability space) and call $F_n$ the sigma-algebra generated by $(Z_k)_{k\leqslant n}$.

Then, the simple Markov property is to ask that $$ \mathrm E(\varphi(Z_n,Z_{n+1},Z_{n+2},\ldots)\mid F_n)=\mathrm E(\varphi(Z_n,Z_{n+1},Z_{n+2},\ldots)\mid Z_n), $$ almost surely, for every $n\geqslant0$ and for every (for example) bounded measurable function $\varphi$.

Likewise, the strong Markov property is to ask that $$ \mathrm E(\varphi(Z_T,Z_{T+1},Z_{T+2},\ldots)\mid F_T)=\mathrm E(\varphi(Z_T,Z_{T+1},Z_{T+2},\ldots)\mid Z_T), $$ almost surely on the event $[T\lt\infty]$, for every (for example) bounded measurable function $\varphi$ and for every stopping time $T$. (At this point, I assume you know what a stopping time $T$ is and what the sigma-algebra $F_T$ generated by a stopping time $T$ is.)

Since constants are stopping times, the strong Markov property implies the simple Markov property. For processes in discrete time, the two notions coincide (but be warned that this is not the case anymore for processes in continuous time).
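The discrete-time statement can also be checked numerically: run the chain up to a stopping time $T$ and verify that the next step is distributed according to the transition kernel started from $Z_T$. A minimal Monte Carlo sketch, with a made-up two-state chain and $T$ = first visit to state 1 (the chain, the horizon cap, and all numbers are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state chain on {0, 1}
p = np.array([[0.9, 0.1],          # p[x, y] = transition probability x -> y
              [0.4, 0.6]])

def step(x):
    """One step of the chain from state x."""
    return 1 if rng.random() < p[x, 1] else 0

# Stopping time T = first visit to state 1 (a.s. finite here).
# Strong Markov property: conditionally on {T < inf}, the next step
# Z_{T+1} is distributed as one step of the chain from Z_T = 1,
# i.e. P(Z_{T+1} = 1) = p[1, 1] = 0.6.
n_runs = 20_000
hits = 0
for _ in range(n_runs):
    x = 0
    for _ in range(200):           # horizon cap; T exceeds it with prob 0.9**200
        x = step(x)
        if x == 1:
            break
    hits += step(x)                # Z_{T+1}; count visits to state 1

print(hits / n_runs)               # ≈ 0.6
```

The point of the experiment is that the empirical frequency does not depend on how long the chain wandered before $T$: the pre-$T$ history is integrated out, exactly as the conditioning on $Z_T$ alone predicts.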
