Using a Markov chain to solve a simple random walk

markov-chains, markov-process, random-walk

Problem: Define a Markov chain $\{X_n\}$ with state space $S = \{0, 1, 2, \cdots\}$ and transition probabilities given by $P_{01} = 1$,
$$
P_{i, i+1} = \frac{i^2+2i+1}{2i^2+2i+1}, P_{i, i-1} = \frac{i^2}{2i^2+2i+1}
$$

for all $i \ge 1$, and $P_{ij} = 0, \forall |i-j|>1$.

(1) Prove that the Markov chain is transient;

(2) Calculate $\rho_j = P(\tau < \infty \mid X_0 = j)$, where $\tau = \min\{n\ge 1 : X_n=0\}$.
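(For readers who want to experiment before proving anything, here is a small Python sketch of my own, not part of the problem: it simulates the chain and estimates $P(\tau < \infty \mid X_0 = 1)$ by Monte Carlo. The step cap, sample size, and helper names `step`/`hits_zero` are arbitrary choices, and truncating at a finite number of steps only gives a rough estimate.)

```python
import random

def step(i):
    """One transition of the chain: from 0 go to 1 (P_01 = 1); from i >= 1 go up
    with probability (i^2 + 2i + 1)/(2i^2 + 2i + 1), otherwise go down."""
    if i == 0:
        return 1
    p_up = (i * i + 2 * i + 1) / (2 * i * i + 2 * i + 1)
    return i + 1 if random.random() < p_up else i - 1

def hits_zero(start, max_steps=2000):
    """True if the chain started at `start` visits 0 within max_steps steps."""
    x = start
    for _ in range(max_steps):
        x = step(x)
        if x == 0:
            return True
    return False

random.seed(0)
trials = 5000
estimate = sum(hits_zero(1) for _ in range(trials)) / trials
# A value visibly below 1 is consistent with (but does not prove) transience.
print(f"estimated P(tau < infinity | X_0 = 1): {estimate:.3f}")
```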

My attempt: The Markov chain is similar to a simple random walk in one dimension, and all states communicate with each other, so it suffices to prove that state $0$ is transient. Denote by $P^{(n)}_{00}$ the probability of returning to state $0$ from state $0$ after $n$ steps.
Note that $P^{(2n-1)}_{00} = 0$ for every $n$, so

$$
\begin{align}
\sum_{n=1}^\infty P^{(n)}_{00} &= \sum_{n=1}^{\infty} P^{(2n)}_{00} \\
&=\sum_{n=1}^{\infty} {2n \choose n}\prod_{k=1}^n P_{k-1, k}P_{k, k-1} \\
&=\sum_{n=1}^{\infty} {2n \choose n}\prod_{k=1}^n\frac{k^2}{2k^2-2k+1}\cdot\frac{k^2}{2k^2+2k+1} \\
&=\sum_{n=1}^{\infty} {2n \choose n}\prod_{k=1}^n\frac{k^4}{4k^4+1} \\
&=\sum_{n=1}^{\infty} (2n)!\prod_{k=1}^n\frac{k^2}{4k^4+1}
\end{align}
$$

Then I need to prove that the series above converges, and I got stuck here. Any hint is appreciated.

Best Answer

A possible strategy to answer both questions is to put $\tau_m = \min\{n \ge 0 : X_n = m\}$, to fix $m \ge 1$, to check that $P[ \tau_0 \wedge \tau_m < +\infty \mid X_0=x ] = 1$ whenever $0 \le x \le m$ (see below), and to compute $$f_m(x) := P[\tau_0 < \tau_m \mid X_0=x] \textrm{ for } 0 \le x \le m.$$ To do this, apply the Markov property at time $1$, which yields $$f_m(x) = P_{x,x+1}f_m(x+1) + P_{x,x-1}f_m(x-1) \textrm{ whenever } 1 \le x \le m-1.$$ Since $P_{x,x+1} = \frac{(x+1)^2}{x^2+(x+1)^2}$ and $P_{x,x-1} = \frac{x^2}{x^2+(x+1)^2}$, this equation can be written alternatively $$0 = (x+1)^2(f_m(x+1)-f_m(x)) + x^2(f_m(x-1)-f_m(x)),$$ $$(x+1)^2\big(f_m(x)-f_m(x+1)\big) = x^2\big(f_m(x-1)-f_m(x)\big).$$ As a result, $x^2(f_m(x-1)-f_m(x))$ does not depend on $x \in [1,m]$. Then observe that $f_m(0)=1$ and $f_m(m)=0$ to deduce the value of $f_m(x)$ for $x \in [0,m]$.
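To see the constancy claim concretely, here is a short numerical sketch of my own (not part of the answer): it solves the boundary-value recursion for a moderate $m$ as a linear system with NumPy and checks that $x^2\,(f_m(x-1)-f_m(x))$ is numerically constant. The value $m = 50$ and the helper name `solve_f` are arbitrary.

```python
import numpy as np

def solve_f(m):
    """Solve f(x) = P_{x,x+1} f(x+1) + P_{x,x-1} f(x-1) for 1 <= x <= m-1,
    with boundary values f(0) = 1 and f(m) = 0, as a linear system."""
    A = np.zeros((m + 1, m + 1))
    b = np.zeros(m + 1)
    A[0, 0] = 1.0
    b[0] = 1.0          # f(0) = 1
    A[m, m] = 1.0
    b[m] = 0.0          # f(m) = 0
    for x in range(1, m):
        denom = 2 * x * x + 2 * x + 1
        A[x, x] = 1.0
        A[x, x + 1] = -(x + 1) ** 2 / denom   # -P_{x,x+1}
        A[x, x - 1] = -(x ** 2) / denom       # -P_{x,x-1}
    return np.linalg.solve(A, b)

m = 50
f = solve_f(m)
# x^2 * (f(x-1) - f(x)) should be the same for every x in [1, m].
consts = [x * x * (f[x - 1] - f[x]) for x in range(1, m + 1)]
print("max deviation of x^2*(f(x-1)-f(x)):", max(consts) - min(consts))
print("f_m(1) for m = 50:", f[1])
```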

Once $f_m(x)$ is computed, it suffices to let $m$ go to infinity: since the chain moves by nearest-neighbour steps, $\tau_m \ge m$, so the events $[\tau_0 < \tau_m]$ increase to $[\tau_0 < +\infty]$. In particular, if $x \ge 1$, we get
$$P[\tau_0<+\infty\mid X_0=x] = \lim_{m \to +\infty}P[\tau_0<\tau_m\mid X_0=x] < 1.$$ Since $P_{01}=1$, the return probability of state $0$ equals $P[\tau_0<+\infty\mid X_0=1]<1$, so state $0$ (and every state, since all states communicate) is transient.
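As a concrete illustration of this limiting step (my own carrying-out of the deduction the answer leaves to the reader, so treat the closed form as something to verify rather than as given): the constancy of $x^2(f_m(x-1)-f_m(x))$ together with $f_m(0)=1$, $f_m(m)=0$ gives $f_m(x-1)-f_m(x) = x^{-2}\big/\sum_{y=1}^m y^{-2}$, which is easy to evaluate numerically for growing $m$.

```python
import math

def f_m(x, m):
    """P[tau_0 < tau_m | X_0 = x], computed from the constant-increment deduction:
    f_m(x-1) - f_m(x) = (1/x^2) / sum_{y=1}^m 1/y^2, with f_m(0) = 1."""
    total = sum(1.0 / y ** 2 for y in range(1, m + 1))
    partial = sum(1.0 / y ** 2 for y in range(1, x + 1))
    return 1.0 - partial / total

for m in (10, 100, 1000, 10000):
    print(m, f_m(1, m))
# The values increase towards 1 - 6/pi^2 ~ 0.392, strictly below 1,
# consistent with the transience conclusion.
print("1 - 6/pi^2 =", 1 - 6 / math.pi ** 2)
```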

Proof that $P[\tau_m<+\infty\mid X_0=x]=1$ whenever $0 \le x \le m$. For every $y \in [0,m]$, $P[\tau_m \le m\mid X_0=y]>0$ (the chain can move straight up to $m$ in at most $m$ steps). Call $a$ the minimum of these finitely many positive probabilities. Then for $k \ge 0$, the Markov property applied at time $km$ yields $$P[\tau_m > (k+1)m\mid X_0=x] = E[1_{\tau_m > km}\,g(X_{km})\mid X_0=x], \text{ where } g(y):= P[\tau_m>m\mid X_0=y].$$ On the event $[\tau_m > km]$, the chain has not yet reached $m$, so $X_{km} \in [0,m-1]$ and $g(X_{km}) \le 1-a$, hence $$P[\tau_m > (k+1)m\mid X_0=x] \le (1-a) P[\tau_m > km\mid X_0=x].$$ Induction on $k$ yields $$P[\tau_m > km\mid X_0=x] \le (1-a)^k.$$ The conclusion follows.
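To make the geometric bound tangible, here is a small deterministic check of my own (the choice $m = 10$ and the helper names are arbitrary): state $m$ is made absorbing, which is legitimate for computing $\tau_m$ from a start in $[0,m]$ because the chain moves by nearest-neighbour steps, and $P[\tau_m > km \mid X_0 = 0]$ is computed exactly by iterating the transition matrix and compared with $(1-a)^k$.

```python
import numpy as np

def step_matrix(m):
    """Transition matrix on states 0..m with state m made absorbing."""
    P = np.zeros((m + 1, m + 1))
    P[0, 1] = 1.0
    for i in range(1, m):
        denom = 2 * i * i + 2 * i + 1
        P[i, i + 1] = (i + 1) ** 2 / denom
        P[i, i - 1] = i ** 2 / denom
    P[m, m] = 1.0  # absorb at m; states above m are never needed before tau_m
    return P

def tail_prob(P, x, n):
    """P[tau_m > n | X_0 = x]: mass not yet absorbed at m after n steps."""
    dist = np.zeros(P.shape[0])
    dist[x] = 1.0
    for _ in range(n):
        dist = dist @ P
    return 1.0 - dist[-1]

m = 10
P = step_matrix(m)
# a = min over starting states y in [0, m] of P[tau_m <= m | X_0 = y]
a = min(1.0 - tail_prob(P, y, m) for y in range(m + 1))
for k in range(1, 6):
    lhs = tail_prob(P, 0, k * m)
    print(f"k={k}: P[tau_m > km | X_0=0] = {lhs:.4f} <= (1-a)^k = {(1 - a) ** k:.4f}")
```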
