Details of the proof of the Exit Time Theorem

probability, probability-theory, stochastic-analysis, stochastic-processes

Theorem 1.29. Let $V_{A}=\inf \left\{n \geq 0: X_{n} \in A\right\}$. Suppose $C=S-A$ is finite, and that $P_{x}\left(V_{A}<\infty\right)>0$ for all $x \in C$. If $g(a)=0$ for all $a \in A$, and for $x \in C$ we have
$$
g(x)=1+\sum_{y} p(x, y) g(y)
$$

then $g(x)=E_{x}\left(V_{A}\right)$.
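As a sanity check, the theorem's linear system can be solved numerically. Here is a minimal sketch on a toy example of my own choosing (not from the book): a simple random walk on $S=\{0,\dots,4\}$ with $A=\{0,4\}$, for which the classical gambler's-ruin answer is $E_x V_A = x(4-x)$.

```python
# Toy example (hypothetical, not from Durrett): simple random walk on
# S = {0,...,4}, absorbed on A = {0, 4}.
import numpy as np

N = 4
C = [1, 2, 3]                      # non-absorbing states
P = np.zeros((N + 1, N + 1))
for x in C:
    P[x, x - 1] = P[x, x + 1] = 0.5
P[0, 0] = P[N, N] = 1.0            # states in A are absorbing

# The system g(x) = 1 + sum_y p(x,y) g(y) for x in C, with g = 0 on A,
# becomes (I - P_C) g_C = 1, where P_C is P restricted to C.
P_C = P[np.ix_(C, C)]
g_C = np.linalg.solve(np.eye(len(C)) - P_C, np.ones(len(C)))
print(g_C)                         # classical answer x*(N-x): [3, 4, 3]
```

The restriction to $C$ is what the boundary condition $g \equiv 0$ on $A$ buys: transitions into $A$ contribute nothing to $\sum_y p(x,y) g(y)$, so they can simply be dropped from the matrix.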

Proof. It follows from Lemma 1.3 that $E_{x} V_{A}<\infty$ for all $x \in C$. Equation (1.27) implies that $g(x)=1+E_{x} g\left(X_{1}\right)$ when $x \notin A$. The Markov property implies

$$
g(x)=E_{x}\left(V_{A} \wedge n\right)+E_{x} g\left(X_{V_{A} \wedge n}\right)
$$

We have to stop at time $V_{A} \wedge n$ because the equation is not valid for $x \in A$. Since $V_{A} \wedge n \uparrow V_{A}$, monotone convergence gives $E_{x}\left(V_{A} \wedge n\right) \uparrow E_{x} V_{A}$. Since $g$ is bounded, $g(a)=0$ for $a \in A$, and $V_{A}<\infty$ almost surely (because $E_{x} V_{A}<\infty$), we have $E_{x} g\left(X_{V_{A} \wedge n}\right) \rightarrow 0$.
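The two limits at the end of the proof can be illustrated numerically. A minimal sketch, again assuming a toy chain of my own choosing (simple random walk on $\{0,\dots,4\}$ absorbed at $\{0,4\}$, with $g(x)=x(4-x)$, not from the book); making $A$ absorbing means $X_{V_A \wedge n} = X_n$.

```python
# Illustrates E_x[V_A ∧ n] -> E_x[V_A] and E_x[g(X_{V_A ∧ n})] -> 0
# on a hypothetical toy chain (random walk on {0,...,4}, A = {0,4}).
import numpy as np

N = 4
C = [1, 2, 3]
P = np.zeros((N + 1, N + 1))
for x in C:
    P[x, x - 1] = P[x, x + 1] = 0.5
P[0, 0] = P[N, N] = 1.0                    # A absorbing, so X_{V_A ∧ n} = X_n

g = np.array([0.0, 3.0, 4.0, 3.0, 0.0])    # g(x) = x(N - x), zero on A

dist = np.zeros(N + 1); dist[2] = 1.0      # law of X_n under P_2
E_time = 0.0
for n in range(200):
    E_time += dist[C].sum()                # adds P_2(V_A > n)
    dist = dist @ P
print(E_time, dist @ g)                    # approaches 4.0 and 0.0
```

Here $E_x[V_A \wedge n] = \sum_{k=0}^{n-1} P_x(V_A > k)$ is accumulated one probability mass at a time, while $E_x[g(X_n)]$ decays because the mass remaining in $C$ shrinks geometrically.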

This Theorem 1.29 is from Richard Durrett's "Essentials of Stochastic Processes", Third Edition (pages 62-63). In this theorem, $g(x)$ is the expected exit time with initial state $x$. The hint is simply to use the Markov property, but I can't complete the proof. Could someone show me the details of how $g(x)=1+E_{x} g\left(X_{1}\right)$ yields
$$
g(x)=E_{x}\left(V_{A} \wedge n\right)+E_{x} g\left(X_{V_{A} \wedge n}\right)?
$$
Thank you very much!

Best Answer

One way to proceed is by induction on $n$. The case $n=0$ is clear, since $V_A\wedge 0=0$ and $X_0=x$. Suppose the stated identity has been shown for $n$, and let's check it for $n+1$. First,
$$
E_x g(X_{V_A\wedge (n+1)}) = E_x[g(X_{V_A}); V_A\le n]+E_x[g(X_{n+1}); V_A>n].
$$
The first term on the right vanishes because $X_{V_A}\in A$ on $\{V_A<\infty\}$ and $g=0$ on $A$. In dealing with the second, notice that $\{V_A>n\}$ is $\mathcal F_n$-measurable, and that $X_n\notin A$ on $\{V_A>n\}$, so the hypothesis $g(X_n)=1+\sum_y p(X_n,y)g(y)$ applies there. Here is where to use the Markov property:
$$
\begin{aligned}
E_x[g(X_{n+1}); V_A>n] &=E_x\left[ E_x\left[g(X_{n+1})\mid \mathcal F_n\right]; V_A>n\right]\\
&=E_x\Big[\sum_y p(X_n,y) g(y); V_A>n\Big]\\
&=E_x\left[g(X_n)-1; V_A>n\right]\\
&=E_x[g(X_{V_A\wedge n})]-E_x[g(X_{V_A}); V_A\le n]-P_x[V_A>n]\\
&=E_x[g(X_{V_A\wedge n})]-P_x[V_A>n].
\end{aligned}
$$
Turning to the time term, we can break the expectation in two as before:
$$
\begin{aligned}
E_x[V_A\wedge (n+1)] &=E_x[V_A; V_A\le n]+E_x[n+1; V_A>n]\\
&=E_x[V_A\wedge n]+P_x[V_A>n].
\end{aligned}
$$
Adding these two displays and using the induction hypothesis, you see that the identity holds for step $n+1$.
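The induction identity can also be checked exactly, step by step, for a concrete chain. A minimal sketch, assuming a toy example of my own choosing (simple random walk on $\{0,\dots,4\}$ with $A=\{0,4\}$ made absorbing, so $X_{V_A \wedge n} = X_n$, and $g(x)=x(4-x)$, not from the book):

```python
# Exact check of g(x) = E_x[V_A ∧ n] + E_x[g(X_{V_A ∧ n})] for every n,
# on a hypothetical toy chain (random walk on {0,...,4}, A = {0,4}).
import numpy as np

N = 4
C = [1, 2, 3]
P = np.zeros((N + 1, N + 1))
for x in C:
    P[x, x - 1] = P[x, x + 1] = 0.5
P[0, 0] = P[N, N] = 1.0                    # A absorbing, so X_{V_A ∧ n} = X_n

g = np.array([0.0, 3.0, 4.0, 3.0, 0.0])    # g(x) = x(N - x), zero on A

x0 = 2
dist = np.zeros(N + 1); dist[x0] = 1.0     # law of X_n under P_{x0}
E_stopped_time = 0.0                       # E_{x0}[V_A ∧ n]
for n in range(50):
    # identity at step n: E[V_A ∧ n] + E[g(X_{V_A ∧ n})] = g(x0)
    assert abs(E_stopped_time + dist @ g - g[x0]) < 1e-12
    E_stopped_time += dist[C].sum()        # E[V_A ∧ (n+1)] = E[V_A ∧ n] + P(V_A > n)
    dist = dist @ P
```

The update line for `E_stopped_time` is exactly the answer's computation $E_x[V_A\wedge (n+1)] = E_x[V_A\wedge n] + P_x[V_A>n]$, so each pass through the loop mirrors one induction step.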
