Expectation of function in Markov chain

Tags: markov-chains, markov-process, probability-theory

I came upon the following expression while doing some work with Markov chains:

Let $f$ be a measurable function on $S \times S$, where $S$ is the state space of the Markov chain. For $x \in S$, I write $P_x$ for the one-step transition probability distribution on $S$ when the chain is at point $x$. Fix a start state $s \in S$.

Can I simplify $$\mathbb{E}_{x\sim P_s, \, x' \sim P_s} (\mathbb{E}_{y \sim P_x, \, y' \sim P_{x'}} f(y,y')) $$
into a single expectation?

Best Answer

If $x, x'$ and $y, y'$ are independent, then you can essentially consider the product chain $Q_{(x,y)}((a,b)) = P_x(a)\,P_y(b)$, in which each coordinate moves independently. The quantity you are looking at is then the expectation of $f$ after two steps of the product chain started at $(s,s)$, that is, $(Q^2 f)(s,s)$.

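For a concrete sanity check, here is a minimal NumPy sketch on a finite state space; the transition matrix `P`, the function `f`, and the start state `s` below are arbitrary illustrative choices, not part of the original question:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small random transition matrix P on a 4-state space;
# P[x, y] is the probability of moving from state x to state y.
n = 4
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)

# An arbitrary function on pairs of states: f[y, yp] = f(y, y').
f = rng.random((n, n))

s = 0  # start state

# Left-hand side: E_{x,x' ~ P_s} [ E_{y ~ P_x, y' ~ P_{x'}} f(y, y') ],
# computed by brute-force summation over all intermediate states.
lhs = sum(
    P[s, x] * P[s, xp] * P[x, y] * P[xp, yp] * f[y, yp]
    for x in range(n) for xp in range(n)
    for y in range(n) for yp in range(n)
)

# Right-hand side: (Q^2 f)(s, s), where Q is the product chain.
# Two steps of Q apply the two-step matrix P^2 to each coordinate
# of f independently.
P2 = P @ P
rhs = (P2 @ f @ P2.T)[s, s]

assert np.isclose(lhs, rhs)  # the two quantities agree
```

The key point the check illustrates: because the two coordinates never interact, the nested expectation collapses to pushing $f$ through two steps of the product chain from $(s,s)$.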