Gambler’s Ruin Problem: when Smith can never be broke

gambling, markov-chains, probability, random-walk, stochastic-processes

Smith gambles in the following way: he starts with $i \geq 0$ dollars. At each step, he wins a dollar with probability $\frac{1}{3}$ and loses a dollar with probability $\frac{2}{3}$. However, if he has 0 dollars and loses, he stays at 0 dollars and can keep gambling. For example, if Smith has 1 dollar, loses twice, and then wins, he will have 1 dollar again. We are interested in the Markov chain $(X_n)$ describing his fortune at time $n$.

  1. Give the transition matrix of $(X_n)$.
  2. Find, with proof, the limiting probability that Smith has $i$ dollars at time $n$, as $n \to \infty$.

For 1, I'm confused about how to find the matrix, since it seems like the state space is infinite.

For 2, I just have no idea how to proceed.

Please help.
Thank you!

Best Answer

The transition matrix is tridiagonal: the $(0,0)$ entry is $\frac{2}{3}$, the rest of the main diagonal is $0$, the superdiagonal (win a dollar) is $\frac{1}{3}$, and the subdiagonal (lose a dollar) is $\frac{2}{3}$. Yes, the matrix is infinite, since the state space $\{0, 1, 2, \dots\}$ is. The limiting distribution can be read off from $\lim_{n\to \infty} A^n$, where $A$ is the transition matrix (though this may not be the easiest way to find it).
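For part 2, a more direct route than computing $\lim_{n\to\infty} A^n$ is detailed balance, which this birth–death chain satisfies; sketching it (this derivation is not spelled out in the answer above):

```latex
% Detailed balance: probability flow i -> i+1 equals flow i+1 -> i
\pi_i \cdot \tfrac{1}{3} = \pi_{i+1} \cdot \tfrac{2}{3}
\quad\Longrightarrow\quad
\pi_{i+1} = \tfrac{1}{2}\,\pi_i
\quad\Longrightarrow\quad
\pi_i = \left(\tfrac{1}{2}\right)^{i} \pi_0 .

% Normalize: \sum_{i \ge 0} \pi_i = 2\pi_0 = 1
\pi_0 = \tfrac{1}{2},
\qquad
\pi_i = \left(\tfrac{1}{2}\right)^{i+1}, \quad i \ge 0 .
```

Since $\pi$ satisfies detailed balance it is stationary, and the chain is irreducible and aperiodic (it can hold at 0), so $\pi$ is also the limiting distribution.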

Here is MATLAB code to see an example (of course this example is finite, so the result will be slightly off, but it doesn't hurt to see some behavior):

m = 50; p = 1/3; n = 10000;                 % truncate the state space to m states
main = zeros(1,m); up = p*ones(1,m-1); low = (1-p)*ones(1,m-1);
A = diag(main) + diag(up,1) + diag(low,-1); % tridiagonal transition matrix
A(1,1) = 1-p;                               % reflecting barrier at state 0
L = A^n                                     % rows of A^n approach the limiting distribution
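The same truncated-matrix experiment can also be sketched in Python/NumPy. The truncation size and the reflecting top state (added so that every row sums to 1, unlike the MATLAB version, whose last row leaks a little mass) are assumptions of this sketch, not part of the original problem:

```python
import numpy as np

# Truncated version of the infinite chain: states 0..m-1.
m, p, n = 50, 1/3, 10000

A = np.zeros((m, m))
A[0, 0] = 1 - p              # reflecting barrier: lose at 0, stay at 0
for i in range(m - 1):
    A[i, i + 1] = p          # win a dollar with probability 1/3
    A[i + 1, i] = 1 - p      # lose a dollar with probability 2/3
A[m - 1, m - 1] = p          # reflecting top state (truncation artifact)

# Every row of A^n approaches the limiting distribution pi_i = (1/2)^(i+1).
P = np.linalg.matrix_power(A, n)
print(P[0, :4])              # approx [0.5, 0.25, 0.125, 0.0625]
```

With $m = 50$ the truncation error is on the order of $2^{-50}$, so the first few entries match $(1/2)^{i+1}$ to machine precision.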
