Gambler’s ruin for St Petersburg paradox

gambling, markov chains, probability, probability theory

Suppose that someone with an infinite amount of capital is charging $\$100$ to play the St Petersburg game. Each game consists of flipping a coin repeatedly until the result is tails, and your net winnings for that game are $2^{(\text{# flips})}-100$ in dollars. You can play this game as many times as you like, as long as you have $\$100$ to spend.

If you have an initial stake of one million dollars, is there a nonzero probability that you will never go broke playing this game forever?

This is just a problem I came up with out of curiosity. It is reminiscent of the gambler's ruin problem against an infinitely rich bank, where the bet is in your favor. It is well known that if you make a series of bets on the outcome of an unfair coin with $P(\text{heads})=p>1/2$, winning one dollar for heads and losing one dollar for tails on each bet, then the probability that you never go broke from an initial stake of $\$n$ is $$1-\left({1-p\over p}\right)^n.$$ This probability is nonzero, and it approaches $1$ as your initial stake $n\to\infty$. It is easy to prove this by setting up a linear recurrence: if you let $a_n$ be the probability that you never go broke starting from $\$n$, then $a_n=pa_{n+1}+(1-p)a_{n-1}$ with boundary value $a_0=0$, whose solution is exactly $1-((1-p)/p)^n$.
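As a quick numerical sanity check (taking $p=0.6$ purely as an arbitrary example value), the closed form does satisfy this recurrence:

```python
p = 0.6  # arbitrary example value of p > 1/2

def a(n: int) -> float:
    """The closed form 1 - ((1-p)/p)^n from the text."""
    return 1 - ((1 - p) / p) ** n

# Check the linear recurrence a_n = p*a_{n+1} + (1-p)*a_{n-1}
# for a range of stakes n.
for n in range(1, 50):
    assert abs(a(n) - (p * a(n + 1) + (1 - p) * a(n - 1))) < 1e-12
print("recurrence verified for n = 1..49")
```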

However, in the St Petersburg case, the recurrence has infinitely many terms with unbounded steps. Namely, if we let $b_n$ be the probability you eventually go broke (meaning you can no longer afford a game) starting with $\$n$, then since a game ending after $k$ flips occurs with probability $2^{-k}$ and changes your bankroll by $2^k-100$, the recurrence you get is
$$
b_n=\tfrac12 b_{n-98}+\tfrac14 b_{n-96}+\tfrac18 b_{n-92}+\tfrac1{16}b_{n-84}+\dots+\tfrac1{128} b_{n+28}+\tfrac1{256}b_{n+156}+\tfrac1{512}b_{n+412}+\dots
$$

where $b_n=1$ whenever $n<100$. I have no idea how to solve this, as it seems too complicated to attack with generating functions. I still strongly suspect there is a nonzero chance you never go broke, since the St Petersburg game has infinite expectation in your favor. Does anyone know how to rigorously prove that $b_n<1$ for all $n\ge 100$?
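For what it's worth, the coefficients can be enumerated mechanically from the game definition (a net change of $2^k-100$ with probability $2^{-k}$); this is just a bookkeeping script, nothing deep:

```python
from fractions import Fraction

# Each game ends on flip k with probability 2^-k, changing the bankroll
# by 2^k - 100; that gives the coefficient of b_{n + (2^k - 100)}.
terms = [(Fraction(1, 2) ** k, 2 ** k - 100) for k in range(1, 10)]
for prob, delta in terms:
    print(f"{prob} * b_(n{delta:+d})")
```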

Best Answer

Let $X_n$ be the net profit to the player from the $n^{\text{th}}$ game, let $S_n=X_1+\dots+X_n$, and $$ \tau=\inf\{n:S_n<0\}. $$ My problem is equivalent to asking whether $P(\tau<\infty)<1$. Indeed, if $P(\tau<\infty)=1$, then you will almost surely eventually be at least a dollar below your starting capital; iterating this argument a million times shows you would almost surely eventually be a million dollars below your start, i.e., broke.

If it were true that $P(\tau<\infty)=1$, then with probability one there would exist an infinite sequence of indices $\tau(1)<\tau(2)<\tau(3)<\dots$ with $S_{\tau(1)}>S_{\tau(2)}>S_{\tau(3)}>\dots$. Specifically, let $\tau(1)=\tau$, and inductively define $\tau(k)=\inf\{n:n>\tau(k-1),\,S_n<S_{\tau(k-1)}\}$; by the strong Markov property, $\tau(k+1)-\tau(k)$ has the same distribution as $\tau$ for every $k\in \Bbb N$, so each $\tau(k)$ is finite almost surely. But the law of large numbers implies $S_n\to +\infty$ with probability $1$, so with probability one no such infinite decreasing sequence of values can exist. This contradiction shows $P(\tau<\infty)<1$.
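To spell out the law-of-large-numbers step: a single game has infinite expectation, since
$$
E[X_1]=\sum_{k=1}^{\infty}2^{-k}\left(2^{k}-100\right)=\sum_{k=1}^{\infty}\left(1-100\cdot 2^{-k}\right)=+\infty.
$$
Since $X_1\ge 2-100=-98$ is bounded below, the one-sided version of the strong law of large numbers (valid whenever $E[X_1^-]<\infty$ and $E[X_1^+]=\infty$) gives $S_n/n\to+\infty$ almost surely, and in particular $S_n\to+\infty$.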