Expected value game theory paradox

Tags: game-theory, probability

Suppose there is a magical slot machine at a casino. You can start with a bet of any size, and on each pull of the slot machine you have a 99% chance of doubling your money and a 1% chance of losing it all.

The game has the following rules:

1) You can only play this game once in your life. You keep pulling until you quit the game or lose your money.

2) You must bet everything you currently have: your starting bet plus all the money you have gained since you started. For example, if you start with a \$10 bet, then on the first pull you have a 1% chance of losing the money and a 99% chance of getting to \$20. If you want to pull again, you must bet the full \$20, in which case you have a 1% chance of losing the money and a 99% chance of getting to \$40. This continues until you lose or quit the game.
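To make the rules concrete, here is a minimal Python sketch of one play-through under the strategy "pull at most `max_pulls` times, then quit" (the function name and parameters are my own, just for illustration):

```python
import random

def play(start_bet, max_pulls, p_win=0.99):
    """One play-through: pull up to max_pulls times, doubling the
    stake on each win and losing everything on the first loss."""
    money = start_bet
    for _ in range(max_pulls):
        if random.random() < p_win:
            money *= 2      # win: the whole stake doubles
        else:
            return 0.0      # lose: the whole stake is gone
    return money            # quit voluntarily after max_pulls wins
```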

This seems like a very favorable bet: the expected value of a single pull is almost double your stake. Game theory would tell you that whenever the expected "return on investment" is greater than 0, you should take the bet. However, if you always take the bet, you keep playing until you eventually lose both your profit and your original bet, making your return on investment for the whole game negative.

It also doesn't seem to make sense to quit after any particular pull, because the pulls are independent of each other.

I tried calculating the expected value after $n$ pulls and found that it diverges, which suggests you should never stop pulling.

Here is what I got when I tried to calculate the expected value:

Starting with an initial bet of $x$,

EV $= (0.99 \cdot 2)^n x = 1.98^n x$

after $n$ pulls on the machine, since each pull multiplies your expected holdings by $0.99 \cdot 2 = 1.98$. To maximize the expected value (the goal of the game), it seems you should let $n \rightarrow \infty$. This obviously isn't right, because in the limit of infinitely many pulls you are guaranteed to lose all your money and have a negative ROI.
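A quick Monte Carlo check (again my own sketch, reusing the `play` function above) agrees with this: the sample mean tracks $1.98^n x$, while the fraction of runs that survive all $n$ pulls shrinks like $0.99^n$.

```python
def estimate(start_bet=10.0, max_pulls=10, trials=100_000):
    """Monte Carlo estimate of the mean payoff and survival
    probability for the 'pull max_pulls times' strategy."""
    results = [play(start_bet, max_pulls) for _ in range(trials)]
    mean_payoff = sum(results) / trials
    survival = sum(r > 0 for r in results) / trials
    print(f"n={max_pulls}: mean ~ {mean_payoff:.4g} "
          f"(theory {start_bet * 1.98 ** max_pulls:.4g}), "
          f"survival ~ {survival:.4f} (theory {0.99 ** max_pulls:.4f})")

estimate(max_pulls=10)    # survival ~ 0.904; mean tracks 10 * 1.98**10 ~ 9.26e3
estimate(max_pulls=500)   # survival ~ 0.0066, yet the mean is astronomical
```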

Best Answer

If you just keep playing, you will almost surely lose all your money eventually. But at every decision point, the winning move is to play one more time. So your conclusion is correct: the expected winnings from playing $n$ times (or until you lose, whichever comes first) just continue to grow exponentially as $n$ increases.

One way to think about it is this: if you decide from the start that you want to play $n$ times for some fixed $n$, then you have a $0.99^n$ probability of winning $2^n - 1$ times your original bet, and a $1 - 0.99^n$ probability of just losing your original bet. The amount you win in the first case is so much larger than the amount you stand to lose, and the winning probability is high enough, that the expected value is enormous.
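For completeness (this step is my own, not the answerer's), multiplying out that expectation confirms the question's formula for the expected profit:

$$\mathbb{E}[\text{profit}] = 0.99^n (2^n - 1)\,x - (1 - 0.99^n)\,x = \left(0.99^n \cdot 2^n - 1\right)x = (1.98^n - 1)\,x,$$

which grows without bound as $n \to \infty$, even though the probability $0.99^n$ of surviving all $n$ pulls tends to $0$.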