Markov chain and stationary distribution

markov-chains, probability-theory, stochastic-processes

Consider an irreducible Markov chain $X=(X_k)_k$ taking values in a finite set $F$ and with a transition matrix $(P(x,y))_{x,y \in F}.$ Let $\pi$ be a stationary distribution of $X$ and $h:F \to \mathbb{R}$ a function.

Prove that there exists a function $f:F \to \mathbb{R}$ such that $$\forall x \in F,Pf(x)-f(x)=h(x)-\sum_{y \in F}h(y) \pi(y).$$

I tried to find a function verifying the above with no success. How can we define or construct a function $f$ with the above property?

Best Answer

Let $h_0(x):= \sum_{y \in F}h(y) \pi(y)-h(x)$, so the task is to solve $$(I-P)f=h_0 \quad (*) \,. $$ It is natural to attempt this using the operator analogue of the geometric series $(1-p)^{-1}=1+p+p^2+\ldots$, but the series $\sum_k P^k h_0$ may fail to converge if $P$ is periodic, so we consider the matrix $\widetilde{P}=(P+I)/2$ instead. The Markov chain with transition matrix $\widetilde{P}$ (known as the lazy version of the chain) is irreducible and aperiodic with the same stationary distribution $\pi$, so the basic convergence theorem (see [1] or Theorem 4.9 in [2]) ensures that there are $\alpha<1$ and $C <\infty$ such that the total variation distance satisfies $$\forall k \ge 1, \quad \max_x \|\widetilde{P}^k(x,\cdot)-\pi\|_{TV} \le C\alpha^k\,.$$ Moreover $\sum_{y}h_0(y)\pi(y)=0$ by construction, so $\widetilde{P}^k h_0(x)=\sum_y \bigl(\widetilde{P}^k(x,y)-\pi(y)\bigr)h_0(y)$, and in view of [2, Prop. 4.5] (or see [3]) this implies that $$\forall k \ge 1, \quad \|\widetilde{P}^k h_0\|_\infty \le 2C\alpha^k\|h_0\|_\infty\,.$$ Thus the series $$g:=\sum_{k=0}^\infty \widetilde{P}^k h_0$$ converges, and telescoping gives $ (I-\widetilde{P})g=h_0$. Since $I-P=2(I-\widetilde{P})$, the function $f:=g/2$ solves $(*)$.
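The construction can be checked numerically. Below is a minimal sketch in Python/NumPy: the chain, the function $h$, and the truncation length are arbitrary choices for illustration. The example deliberately uses a periodic chain (a deterministic 3-cycle), so the lazy matrix $\widetilde{P}$ is genuinely needed for the series to converge.

```python
import numpy as np

# An irreducible but periodic chain on F = {0, 1, 2}: a deterministic 3-cycle
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

h = np.array([1.0, -2.0, 5.0])      # an arbitrary function h : F -> R
h0 = pi @ h - h                      # h_0 = pi(h) - h; note pi @ h0 == 0

P_lazy = (P + np.eye(3)) / 2         # lazy chain: aperiodic, same pi

# g = sum_k P_lazy^k h0, truncated; terms decay geometrically
g = np.zeros(3)
term = h0.copy()
for _ in range(200):
    g += term
    term = P_lazy @ term

f = g / 2                            # since I - P = 2 (I - P_lazy)

# Verify that Pf - f = h - pi(h) holds at every state
assert np.allclose(P @ f - f, h - pi @ h)
```

Summing $\sum_k P^k h_0$ directly with this $P$ would oscillate with period 3 and never converge, which is exactly why the argument passes through the lazy chain.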

[1] James R. Norris. Markov Chains. Cambridge University Press, 1998.

[2] David A. Levin and Yuval Peres. Markov Chains and Mixing Times, second edition. AMS, 2017. https://yuvalperes.com/markov-chains-and-mixing-times-2/ https://pages.uoregon.edu/dlevin/MARKOV/mcmt2e.pdf

[3] Saloff-Coste, Laurent. "Lectures on finite Markov chains." Lectures on probability theory and statistics (1997): 301-413.
