Markov chain with exponentially decreasing transition probabilities

markov-chains, probability, stochastic-processes

I'm currently trying to figure out the steady-state vector of a particular infinite-state Markov chain. The chain has (countably) infinitely many states $S_0, S_1, \dots, S_i, \dots$, such that for all $i \ge 0$ (note that for $i = 0$ this gives $\mathbb{P}(S_0 \to S_1) = 1$):
$$
\begin{align}
\mathbb{P}(S_i \to S_i) &= 0 \\
\mathbb{P}(S_i \to S_{i+1}) &= \frac{1}{2^i} \\
\mathbb{P}(S_i \to S_{i-1}) &= 1-\frac{1}{2^i}.
\end{align}
$$

So far, all I can tell is that if $\mathbf{p}$ is the steady-state vector for this chain, then $\lim_{i\to\infty} \mathbf{p}_i = 0.$ I know that for a finite-state (irreducible, aperiodic) Markov chain, the rows of $P^\infty$ equal the steady-state vector — I'm assuming this holds for a Markov chain with infinitely many states as well? For Markov chains with a small number of states (e.g. two or three), I can usually find $P^n$ by hand via diagonalization, but in this case I feel there should be an easier way.

How should I approach problems like this, and how would I find $\mathbf{p}$ in this case?
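As a numerical sanity check (a sketch only: I truncate to $N$ states and reflect at the top boundary, which should barely matter since the tail probabilities decay super-exponentially), one can solve $\mathbf{p}P = \mathbf{p}$ directly. Note that the chain has period $2$ (the state index changes parity at every step), so $P^n$ itself oscillates rather than converging, which is why I solve for the eigenvector instead of iterating:

```python
import numpy as np

# Truncated transition matrix: N states, reflecting at the truncation boundary.
N = 25
P = np.zeros((N, N))
for i in range(N - 1):
    P[i, i + 1] = 2.0**-i          # P(S_i -> S_{i+1}) = 1/2^i
    if i > 0:
        P[i, i - 1] = 1 - 2.0**-i  # P(S_i -> S_{i-1}) = 1 - 1/2^i
P[N - 1, N - 2] = 1.0              # reflecting boundary (truncation artifact)

# Stationary vector: left eigenvector of P for eigenvalue 1, normalized to sum 1.
w, v = np.linalg.eig(P.T)
p = np.real(v[:, np.argmin(np.abs(w - 1))])
p /= p.sum()
print(p[:3])
```

This gives $p_0 \approx 0.2097$, but I'd like a closed form.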

Best Answer

If I'm not mistaken, the steady-state vector satisfies $$p_n=\frac{2^n}{\prod_{k=1}^n(2^k-1)}p_0\tag1$$ (This is what detailed balance gives for a birth–death chain: $p_i\,\mathbb{P}(S_i\to S_{i+1})=p_{i+1}\,\mathbb{P}(S_{i+1}\to S_i)$, i.e. $p_{i+1}=\frac{2}{2^{i+1}-1}\,p_i$.) The probabilities must sum to $1$, and the series converges very rapidly. Summing the first few terms gives $$p_0\approx 0.20971122089755811$$
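A quick way to reproduce this number (a sketch, not the original computation) is to sum the series $\sum_{n\ge 0} 2^n/\prod_{k=1}^n(2^k-1)$ in exact arithmetic and take the reciprocal:

```python
from fractions import Fraction

# Sum  sum_{n>=0} 2^n / prod_{k=1}^n (2^k - 1)  exactly; by (1) and
# normalization, p_0 is the reciprocal of this sum.
total = Fraction(0)
term = Fraction(1)  # n = 0 term (empty product in the denominator)
for n in range(1, 60):
    total += term
    term *= Fraction(2, 2**n - 1)  # ratio of consecutive terms is 2/(2^n - 1)
p0 = 1 / total
print(float(p0))  # ≈ 0.20971122
```

Sixty terms is far more than needed; the terms decay super-exponentially.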

You can check this answer by substituting $(1)$ into the formula in Ben's answer.

I arrived at this solution by computing the first few terms in exact arithmetic. I recognized the numerators, of course, and plugged the denominators into OEIS.
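To reproduce the lookup (a sketch of the computation, not the original code), the exact ratios $p_n/p_0$ and their denominators are:

```python
from fractions import Fraction

# First few exact values of p_n / p_0 = 2^n / prod_{k=1}^n (2^k - 1);
# the denominators are what go into the OEIS search box.
terms = [Fraction(1)]
for n in range(1, 6):
    terms.append(terms[-1] * Fraction(2, 2**n - 1))
print([t.denominator for t in terms])  # [1, 1, 3, 21, 315, 9765]
```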

EDIT

I just realized I pasted in the wrong number for $p_0$. I've corrected it.
