When gambling, do I get the money’s worth? (Or: Does the amount I lose per bet determine the number of bets until I lose all the money?)

Tags: expected value, gambling, probability

This question came up when I asked on Puzzling.SE, "How long will my money last at roulette?" The basic question is: if I take $\$20$ to a roulette table which has a house edge of $1/37$, and I bet $\$10$ on each spin until I run out of money, what's the average amount of time before I lose all of my money?

The reasoning used in a couple of the answers is:

  • I start with $\$20$ and I bet $\$10$ per spin.
  • The expected value of the amount I lose on each spin is $\frac{\$10}{37}$ (spelled out just after this list).
  • Therefore, the expected value of the number of spins required in order to lose all my money is $\$20 / \left (\frac{\$10}{37} \right) = 74$.
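For concreteness, the $\frac{\$10}{37}$ figure is what you get by treating the wager as an even-money bet that wins with probability $\frac{18}{37}$ and loses with probability $\frac{19}{37}$ (the same probabilities used in the answer below):

$$E(\text{amount I lose in one spin}) = \$10\cdot\frac{19}{37} - \$10\cdot\frac{18}{37} = \frac{\$10}{37}.$$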

However, one of the commenters pointed out that this argument needs some additional justification, writing:

It looks to me as if you're assuming that "it takes an average of N turns to lose \$10" and "on average you lose \$10/N per turn" are equivalent — the first is what's obvious and the second is what you're using — but that seems like a thing that needs proving. Or am I missing the point somehow? – Gareth McCaughan♦

I'm pretty sure that the two statements are equivalent. In other words, for any "roulette-like" bet, the following equality holds:

$$E(\text{number of bets I make before losing $\$N$}) = \frac{\$N}{E(\text{amount of money I lose in one bet})}$$

However, I also know that probability often behaves in counterintuitive ways, so I wouldn't be very surprised if this equality turned out to be wrong.

Is the equality correct, and if so, how can it be proved?

Best Answer

We may as well say that you start with a bankroll of $2$ and bet $1$ each time.

Let $E_B$ be the expected number of spins until the bankroll is gone, if the current bankroll is $B$. We have $$ E_B=1+{18\over 37}E_{B+1}+{19\over37}E_{B-1}$$ or $$18E_{B+1}-37E_B+19E_{B-1}=-37\tag{1}$$ This is a straightforward second-order linear difference equation with constant coefficients, but we only have one initial value: $E_0 = 0.$ However, we know that $E_B=BE_1$ by linearity of expectation. Suppose I have a bankroll $B$. I put $B-1$ in my pocket and play with the remaining $1$ until it is gone. Then I take another $1$ out of my pocket and play until it is gone, and so on. The expected number of plays until the whole bankroll is gone is clearly $B\cdot E_1.$
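In symbols (with $T_i$ introduced here only for this argument), let $T_i$ be the number of plays spent exhausting the $i$-th unit in this pocketing scheme. Each $T_i$ is distributed like the number of plays needed to lose a bankroll of $1$, so

$$E_B = E\!\left(\sum_{i=1}^{B} T_i\right) = \sum_{i=1}^{B} E(T_i) = B\,E_1.$$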

Substituting $E_B=tB$ into $(1)$ gives $18t(B+1)-37tB+19t(B-1)=-37$. The terms in $B$ cancel, leaving $-t=-37$, so $t=37$ and $E_B=37B$. In particular, with a bankroll of $2$ the expected number of spins is $E_2=74$, so the suggested answer is correct.
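As a sanity check, here is a minimal Monte Carlo sketch (not part of the original answer) under the same assumptions: a bankroll of $2$ units and $1$-unit even-money bets that win with probability $18/37$. The sample mean of the number of spins should come out close to $74$.

```python
import random

def spins_until_broke(bankroll=2, p_win=18/37):
    """Play 1-unit even-money bets until the bankroll hits 0; return the spin count."""
    spins = 0
    while bankroll > 0:
        spins += 1
        bankroll += 1 if random.random() < p_win else -1
    return spins

trials = 200_000
average = sum(spins_until_broke() for _ in range(trials)) / trials
print(f"Average spins over {trials} trials: {average:.1f}")  # typically close to 74
```

The number of trials is only a suggestion; the spin count is highly variable from run to run, so a large sample is needed before the average settles near $74$.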