Markov Chains: Prove that if $i$ is a transient state, then $\pi_i = 0$

Tags: markov-chains, probability-theory

Setup

Let $S$ be a countable set and $(X_n)_{n \in \mathbb{N}_0}$ a Markov chain with state space $S$ and transition matrix $P=\{p_{ij}\}_{i,j \in S}$.

Furthermore, let $T_i = \inf\{n \in \mathbb{N} \colon X_n = i\}$ denote the first return time to $i$. A state $i$ is called recurrent if $P(T_i < \infty \mid X_0 = i) = 1$ and transient if $P(T_i < \infty \mid X_0 = i) < 1$.

A stationary distribution (for $P$) is a vector $\pi = (\pi_i)_{i \in S}$ with $\pi_i \geq 0$ and $\sum_{i \in S} \pi_i = 1$ satisfying $\pi = \pi P$.
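For concreteness, here is a minimal numerical sketch (assuming NumPy; the 3-state matrix below is an illustrative example, not part of the question). A stationary distribution can be found as a left eigenvector of $P$ for eigenvalue $1$, normalized to sum to $1$. In this chain, state $0$ is transient because mass leaks into the closed class $\{1, 2\}$ and never returns.

```python
import numpy as np

# Illustrative 3-state chain (an assumption for this sketch):
# state 0 is transient, {1, 2} is a closed recurrent class.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.3, 0.7],
              [0.0, 0.6, 0.4]])

# A stationary distribution is a left eigenvector of P for eigenvalue 1,
# normalized to sum to 1: pi = pi P.
eigvals, eigvecs = np.linalg.eig(P.T)
v = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = v / v.sum()

print(pi)                       # approx [0. 0.4615 0.5385], i.e. pi_0 = 0
print(np.allclose(pi @ P, pi))  # True: pi is stationary
```

Note that the computed $\pi$ already puts mass $0$ on the transient state, which is exactly the claim below.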

Claim

Assume $\pi = (\pi_i)_{i \in S}$ is a stationary distribution. Prove that if $i$ is a transient state, then $\pi_i = 0$.

Thoughts

I know that a transient state is only visited finitely many times $P$-almost surely. Maybe one can prove the claim using this? If we write $P^n = \{p_{ij}^n\}_{i,j \in S}$, then another property of a transient state is that $\sum_{n=1}^\infty p_{ii}^n < \infty$. But I can't quite make it work using this.
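That convergence is easy to observe numerically. A rough sketch (same illustrative chain as above, assuming NumPy): sum the diagonal of $P^n$ over $n$; the entry for the transient state stays bounded, while the recurrent entries grow without bound.

```python
import numpy as np

# Same illustrative 3-state chain as above: state 0 is transient.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.3, 0.7],
              [0.0, 0.6, 0.4]])

# Partial sums of the return probabilities p_ii^n = (P^n)_ii.
Pn = np.eye(3)
total = np.zeros(3)
for n in range(1, 201):
    Pn = Pn @ P
    total += np.diag(Pn)

# Entry 0 converges (here p_00^n = 0.5^n, so the series sums to 1);
# entries 1 and 2 grow roughly linearly in the number of terms.
print(total)  # approx [1.0, 92.2, 107.6]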

Best Answer

Here is a possible proof approach.

You say that you already know a transient state $i$ is almost surely visited only finitely many times. Can you show the stronger statement that the expectation of the number of visits to $i$ is finite?
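For intuition, here is a minimal Monte Carlo sketch of that stronger statement (same illustrative chain as above; the helper `visits` is my own, not from the question). Started at the transient state $0$, each return happens with probability $0.5$, so the number of visits is geometric with mean $2$, and in particular finite.

```python
import numpy as np

rng = np.random.default_rng(0)

# Same illustrative 3-state chain as above: state 0 is transient.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.3, 0.7],
              [0.0, 0.6, 0.4]])

def visits(start, steps=200):
    """Count visits to state 0 along one simulated path of the chain."""
    x, count = start, 0
    for _ in range(steps):
        count += (x == 0)
        x = rng.choice(3, p=P[x])
    return count

# E[N(0)] starting from 0: geometric with success probability 0.5, mean 2.
print(np.mean([visits(0) for _ in range(10_000)]))  # approx 2.0, finite
```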

Then, writing $N(i) = \sum_{k=0}^\infty \mathbf{1}(X_k=i)$ for the number of visits to $i$, note that the expected number of visits to $i$ from any starting state is at most the expected number of visits starting from $i$ itself, so $\mathbb{E}[N(i)]$ is finite under any initial distribution. Now start the chain in the stationary distribution, $X_0 \sim \pi$; then $P(X_k = i) = \pi_i$ for every $k$, and $$\mathbb{E}[N(i)] = \sum_{k=0}^\infty \mathbb{E}[\mathbf{1}(X_k=i)] = \sum_{k=0}^\infty P(X_k=i) = \sum_{k=0}^\infty \pi_i,$$ which can only be finite if $\pi_i=0$.
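To see why $P(X_k = i) = \pi_i$ in the middle step: when $X_0 \sim \pi$, the distribution of $X_k$ is $\pi P^k = \pi$ for every $k$. A quick numerical check on the illustrative chain from above (its stationary distribution $(0, 6/13, 7/13)$ was computed there):

```python
import numpy as np

# Same illustrative 3-state chain as above, with its stationary distribution.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.3, 0.7],
              [0.0, 0.6, 0.4]])
pi = np.array([0.0, 6/13, 7/13])

# Stationarity propagates: the distribution of X_k is pi P^k = pi for all k,
# so P(X_k = i) = pi_i in every term of the sum for E[N(i)].
for k in range(1, 6):
    assert np.allclose(pi @ np.linalg.matrix_power(P, k), pi)
print("pi P^k = pi for all checked k; in particular P(X_k = 0) = pi_0 = 0")
```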