Flip a coin, but you lose when tails appears

probability

I have the following game: you flip a coin that lands heads with probability $p$. Each time you get heads you earn $1000$ dollars and may decide whether to flip again; if you stop, you keep the money. However, the moment you get tails, the game is over and you go home with nothing.

I am having trouble calculating an optimal strategy for this game: I would like to maximize my expected winnings by stopping at turn $f(p)$, but… what is $f(p)$?

If you did not lose everything when tails appears, the number of flips would just follow a standard geometric distribution, with expected value $\sum_{i=1}^\infty i\,p^{i-1}(1-p) = \frac{1}{1-p}$, but I am having trouble evaluating the risk at each turn.
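
For reference, that closed form follows from the usual derivative trick on the geometric series:
$$\sum_{i=1}^{\infty} i\,p^{i-1}(1-p) \;=\; (1-p)\,\frac{d}{dp}\sum_{i=0}^{\infty} p^{i} \;=\; (1-p)\cdot\frac{1}{(1-p)^{2}} \;=\; \frac{1}{1-p}.$$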

Clearly never stopping is a good strategy only when $p=1$, but intuitively, even when $p$ is very close to $1$, stopping after the first win is not the best strategy either. I am confused about how to model this trade-off.

Best Answer

You have to calculate the expected value of continuing. Suppose you currently have $x$ dollars in earnings. If you flip again, you gain $1000$ dollars with probability $p$ and lose your $x$ dollars with probability $1-p$, so the expected change in your earnings is $1000p-(1-p)x$. If this value is positive, carry on; if it is not, quit. In this case that means you should stop as soon as $x\geq 1000\frac{p}{1-p}$.
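
As a quick sanity check, here is a minimal Python sketch of this stop-at-threshold rule (the helper names and the choice $p=0.9$ are just for illustration, not part of the question):

```python
import random

PRIZE = 1000  # dollars earned per head

def threshold(p, prize=PRIZE):
    # One-step rule from above: keep flipping while the expected change
    # in earnings, prize*p - (1 - p)*x, is positive, i.e. while
    # x < prize * p / (1 - p).
    return prize * p / (1 - p)

def play(p, prize=PRIZE):
    # Play one game with the stop-at-threshold strategy.
    x = 0
    while x < threshold(p, prize):
        if random.random() < p:   # heads: earn the prize, decide again
            x += prize
        else:                     # tails: lose everything accumulated
            return 0
    return x

p = 0.9            # illustrative value
trials = 100_000
avg = sum(play(p) for _ in range(trials)) / trials
print(f"stop once x >= {threshold(p):.0f}; simulated average winnings ~ {avg:.0f}")
```

For $p=0.9$ the rule says stop once $x \geq 9000$, i.e. after the ninth head, and the simulated average should come out near $9000\cdot 0.9^{9}\approx 3487$ dollars.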
