Random Walk conditioned to stay positive is an h-process

conditional probability, martingales, probability, probability theory, random walk

Let $X = (X_n)_{n \geq 0}$ be a simple symmetric random walk on $\mathbb{Z}$ starting from $X_0=0$, conditioned to stay positive for all $n > 0$.

I read in a paper that $X$ is the $h$-transform of a random walk, but I don't understand precisely what that means. Moreover, I read that, since $X$ is an $h$-process with $h(x)=x$, for any $x > 0$
$$
P(X_n = x ) = x P(S_n=x, S_k >0 \, \, \forall k = 1, \ldots, n ),
$$

where $S_k$ is a simple symmetric random walk (unconditioned).

Can the identity really be true? I understand that $P(X_n = x) = P(S_n=x \mid S_k > 0 \ \forall k =1, \ldots, n )$, which by definition of conditional probability equals the ratio $P(S_n=x, S_k > 0 \ \forall k =1, \ldots, n ) / P( S_k > 0 \ \forall k =1, \ldots, n )$; since the denominator of this ratio is not equal to $1/x$, I don't see how the identity above can be true. Probably I have misunderstood the definition of $(X_n)_{n \in \mathbb{N}}$. Does anyone have a hint?

Best Answer

The solution can be reached at various levels of rigor. This one here is meant to be a hands-on solution (skipping some fiddly details).

Let $$\tau_0:=\inf\{n>0: X_n=0\}.$$ Then, the event $\{\tau_0=\infty\}$ means the Random Walk never returns to $0$.

Now we regard "never reach $0$" as the conditioning event and, for the moment, ignore the fact that this event actually has probability zero. For convenience I'll write $$P_j(X_n=k):=P(X_n=k|X_0=j).$$ Using Bayes' rule and the Markov property (given $X_n=k$ and $\tau_0>n$, the probability of never hitting $0$ after time $n$ is $P_k(\tau_0=\infty)$) we get $$P_j(X_n=k|\tau_0=\infty)=\frac{P_k(\tau_0=\infty)}{P_j(\tau_0=\infty)}\cdot P_j(X_n=k,\ \tau_0>n)$$

If we now define $h(k)=P_k(\tau_0=\infty)$, we can derive the following system of equations (first step analysis): $$h(0)=0$$ $$h(1)=\frac12 h(0)+\frac12h(2) = \frac12h(2)$$ $$\ldots$$ $$h(k)=\frac12 h(k-1)+ \frac12 h(k+1), \ \ \ k>1$$

This system of equations is clearly satisfied by $h(k)=k$ (and by any constant multiple of it, but the constant cancels in the $h$-transform anyway). We do not prove uniqueness of the solution here.
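If you want to see the first step analysis in action, here is a minimal numerical sketch (Python, not part of the original argument): with the boundary value $h(0)=0$ and the normalization $h(1)=1$ (a choice, since the system only determines $h$ up to a constant factor), the recursion reproduces $h(k)=k$.

```python
# First-step analysis: h(k) = (h(k-1) + h(k+1)) / 2, rearranged to
# h(k+1) = 2*h(k) - h(k-1), with h(0) = 0 and the normalization h(1) = 1.
h = [0.0, 1.0]
for k in range(1, 20):
    h.append(2 * h[k] - h[k - 1])

assert all(abs(h[k] - k) < 1e-12 for k in range(len(h)))
print(h[:6])  # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
```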

Since the Random Walk has to step up in its very first move in order to be positive thereafter, we may as well start at $1$ and thus get: $$P_1(X_n=k|\tau_0=\infty)=\frac{h(k)}{h(1)} \cdot P_1(X_n=k,\ \tau_0>n)= k \cdot P_1(X_n=k,\ \tau_0>n),$$ which is the identity asked about, with the unconditioned walk now read from the starting point $1$.
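To make the identity concrete, here is a small self-contained check (Python; the helper names are mine, not from any paper). It evolves the walk started at $1$ and killed at $0$ to get $P_1(X_n=k,\ \tau_0>n)$, evolves the conditioned chain with the usual Doob transition probabilities $\hat p(x,x\pm1)=\frac{x\pm1}{2x}$ (that is, $p(x,y)\,h(y)/h(x)$ with $h(x)=x$), and compares the two.

```python
import numpy as np

def killed_walk_dist(n, x0=1):
    """P_x0(X_n = k, tau_0 > n) for the simple symmetric random walk:
    evolve the distribution and discard all mass that hits 0."""
    m = x0 + n + 1                      # the walk cannot get past this level
    p = np.zeros(m + 1)
    p[x0] = 1.0
    for _ in range(n):
        q = np.zeros(m + 1)
        for x in range(1, m):
            q[x - 1] += 0.5 * p[x]
            q[x + 1] += 0.5 * p[x]
        q[0] = 0.0                      # kill the paths that just hit 0
        p = q
    return p

def h_walk_dist(n, x0=1):
    """Distribution at time n of the Doob h-transform (h(x) = x):
    from x the chain steps to x+1 w.p. (x+1)/(2x) and to x-1 w.p. (x-1)/(2x)."""
    m = x0 + n + 1
    p = np.zeros(m + 1)
    p[x0] = 1.0
    for _ in range(n):
        q = np.zeros(m + 1)
        for x in range(1, m):
            q[x + 1] += (x + 1) / (2 * x) * p[x]
            q[x - 1] += (x - 1) / (2 * x) * p[x]   # zero when x = 1: the chain never hits 0
        p = q
    return p

n = 10
killed = killed_walk_dist(n)
hproc = h_walk_dist(n)
for k in range(1, n + 2):
    assert abs(hproc[k] - k * killed[k]) < 1e-12
print("P(X_n = k) = k * P_1(S_n = k, tau_0 > n) holds for n =", n)
```

Note that $\hat p$ is a genuine transition kernel precisely because $h$ is harmonic: $\frac{x+1}{2x}+\frac{x-1}{2x}=1$.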

Please note that the return time of a simple symmetric Random Walk is actually finite with probability one, so $\{\tau_0=\infty\}$ is a null event. To do things properly we would therefore have to go into finer detail: condition on the non-null event $\{\tau_0>\theta\}$, i.e. that the return to zero happens only after some fixed number $\theta$ of steps, and then let $\theta\to\infty$. In that limit the harmonic function shows up as a ratio of survival probabilities,

$$h(k) = \lim_{\theta\rightarrow\infty}\frac{P_k(\tau_0>\theta)}{P_1(\tau_0>\theta)} = k,$$

so the formulas above survive the limiting procedure. Conditioning on a late return and then passing to the limit is a common technique in probability.
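Here is a rough numerical illustration of that limit (Python; `survival_prob` is my own helper, not from the original argument): compute $P_k(\tau_0>\theta)$ by evolving the killed walk and watch the ratios $P_k(\tau_0>\theta)/P_1(\tau_0>\theta)$ approach $k$ as $\theta$ grows.

```python
import numpy as np

def survival_prob(theta, x0):
    """P_x0(tau_0 > theta) for the simple symmetric random walk started at x0 > 0."""
    m = x0 + theta + 1                 # the walk cannot get past this level
    p = np.zeros(m + 1)
    p[x0] = 1.0
    for _ in range(theta):
        q = np.zeros(m + 1)
        q[:-1] += 0.5 * p[1:]          # step down from every positive site
        q[2:]  += 0.5 * p[1:-1]        # step up from every positive site
        q[0] = 0.0                     # discard the mass that just hit 0
        p = q
    return p.sum()

for theta in (10, 100, 1000):
    base = survival_prob(theta, 1)
    ratios = [survival_prob(theta, k) / base for k in (1, 2, 3, 4)]
    print(theta, [round(r, 3) for r in ratios])
# The ratios move towards 1, 2, 3, 4 as theta grows, i.e. h(k) = k.
```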

Does this help a bit?

With regards, Peter