A winning wager that loses over time

conditional-expectation, expected-value, finance, gambling

This problem was posted in Scientific American (vol. 321.5, Nov 2019, p. 73), and it was troubling.

The game:

We flip a fair coin.
If we flip heads, we gain 20% of our bet.
If we flip tails, we lose 17% of our bet.
Starting bankroll: $100

Stipulations: we must bet all the chips we have, we cannot reload, and we must play at least 10 flips.

Note: there's no minimum bet, in the sense that we can keep playing this game with an infinitesimally small chip stack. Since we can only ever lose a fraction of our bankroll on any flip, there's no risk of ruin.


The expected value is net positive. Using the first flip as an example:
EV = (0.5 * $20) - (0.5 * $17) = +$1.50

For 10 flips considering all outcomes:
EV = +16.05%
https://docs.google.com/spreadsheets/d/15nXStFnsEHFU938erWaCKDVMmFdaOVeHNJ-taQZLcJs/edit?usp=drivesdk
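
If you'd rather not open the spreadsheet, the same figure falls out of a few lines of Python (a sketch assuming nothing beyond the rules above):

```python
from math import comb

# Sketch: expected bankroll after 10 all-in flips, summing over the number
# of heads h (fair coin, +20% on heads, -17% on tails, $100 start).
start = 100.0
ev = sum(comb(10, h) * 0.5**10 * start * 1.2**h * 0.83**(10 - h)
         for h in range(11))
print(f"expected bankroll: ${ev:.2f}")              # ~116.05
print(f"expected gain:     {ev / start - 1:+.2%}")  # ~+16.05%
```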


However, there's another way to look at this. When we win, our bankroll is multiplied by 1.2; when we lose, it is multiplied by 0.83.

Let's say we win one, lose one: 1.2*0.83 = 0.996

If we win 5 flips and lose 5 flips in any order:
1.2 x 1.2 x 1.2 x 1.2 x 1.2 x 0.83 x 0.83 x 0.83 x 0.83 x 0.83 ≈ 0.9802, a net loss of about $1.98.

Losses "hit harder" than wins. From this perspective it looks like a disadvantageous game.

Put technically, the geometric mean of the growth factors is less than 1, which means the typical (median) bankroll shrinks over time, even though the arithmetic mean grows.
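
A quick sketch of this view in Python, using nothing beyond the 1.2 and 0.83 multipliers:

```python
from math import exp, log, sqrt

up, down = 1.2, 0.83            # per-flip growth factors from the rules above

# Geometric mean of one win and one loss: the "typical" growth per flip.
print(sqrt(up * down))          # ~0.998, below 1

# Equivalently, the expected log growth per flip is negative...
g = 0.5 * log(up) + 0.5 * log(down)
print(g)                        # ~-0.0020
# ...so the typical ten-flip factor matches the 5-win/5-loss case above.
print(exp(10 * g))              # ~0.9802
```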

In this game the Kelly criterion says we should risk no more than 7.5% of our bankroll; however, we're forced to risk more than double that, 17%. It's a well-known principle that betting more than twice Kelly results in a shrinking bankroll, which is just another way of saying that the geometric mean of growth is less than 1.
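
For reference, here's a small sketch of where the 7.5% figure comes from: maximize the expected log growth over the fraction of bankroll wagered, then report the worst-case loss at that stake (the variable names and the closed-form expression are mine, not from the article):

```python
from math import log

p, b, a = 0.5, 0.20, 0.17   # win probability, win fraction, loss fraction

# Expected log growth when wagering a fraction f of the bankroll.
def growth(f):
    return p * log(1 + b * f) + (1 - p) * log(1 - a * f)

# Closed-form maximizer of growth(f), i.e. the full-Kelly stake.
f_star = (p * b - (1 - p) * a) / (a * b)
print(f"Kelly stake:       {f_star:.1%} of bankroll")  # ~44.1%
print(f"amount at risk:    {a * f_star:.1%}")          # ~7.5% (worst-case loss)
print(f"log growth all-in: {growth(1.0):+.4f}")        # negative: forced over-betting
```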


Scientific American claims the casino will make money over time, and that this holds true even if you flip 100, 10,000, or more times.

I can't wrap my head around the fact that a +EV bet can lose over time in a game with no risk of ruin. I suspect this has to do with "volatility drag": I can follow the math, but the result still isn't intuitive to me.
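
A rough simulation (the flip count and trial count below are arbitrary choices of mine, not from the article) makes the drag visible: the average of many bankrolls grows, while the typical one shrinks.

```python
import random

random.seed(0)
flips, trials = 100, 100_000    # arbitrary choices for illustration
finals = []
for _ in range(trials):
    bankroll = 100.0
    for _ in range(flips):
        bankroll *= 1.2 if random.random() < 0.5 else 0.83
    finals.append(bankroll)

finals.sort()
print(f"mean final bankroll:   ${sum(finals) / trials:,.2f}")  # roughly $443
print(f"median final bankroll: ${finals[trials // 2]:,.2f}")   # roughly $82
```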

So my question is, should you play this game?

Best Answer

Here are all the possible outcomes if you stop after ten tosses of the coin. I have rounded the numbers for display, but the calculations were at the full precision of the software on which they were computed. In the table, $H$ is a random variable equal to the number of heads in the sequence of flips, and $X$ is a random variable equal to the amount of your money on the table at the end of the sequence. The last number in the fifth or seventh column is the sum of the numbers in the column above it, so the last number in the fifth column is the expected value of the money on the table at the end of the game, computed in the usual way.

$$\begin{array}{ccccccc}
h & P(H=h) & P(H\leq h) & X & XP(H=h) & \log_{10}X & (\log_{10}X)P(H=h) \\
\phantom{0} 0 & 0.000977 & 0.000977 &\phantom{0} 15.52 &\phantom{00} 0.02 & 1.1908 & 0.0012 \\
\phantom{0} 1 & 0.009766 & 0.010742 &\phantom{0} 22.43 &\phantom{00} 0.22 & 1.3509 & 0.0132 \\
\phantom{0} 2 & 0.043945 & 0.054688 &\phantom{0} 32.43 &\phantom{00} 1.43 & 1.5110 & 0.0664 \\
\phantom{0} 3 & 0.117188 & 0.171875 &\phantom{0} 46.89 &\phantom{00} 5.50 & 1.6711 & 0.1958 \\
\phantom{0} 4 & 0.205078 & 0.376953 &\phantom{0} 67.79 &\phantom{0} 13.90 & 1.8312 & 0.3755 \\
\phantom{0} 5 & 0.246094 & 0.623047 &\phantom{0} 98.02 &\phantom{0} 24.12 & 1.9913 & 0.4900 \\
\phantom{0} 6 & 0.205078 & 0.828125 & 141.71 &\phantom{0} 29.06 & 2.1514 & 0.4412 \\
\phantom{0} 7 & 0.117188 & 0.945313 & 204.88 &\phantom{0} 24.01 & 2.3115 & 0.2709 \\
\phantom{0} 8 & 0.043945 & 0.989258 & 296.21 &\phantom{0} 13.02 & 2.4716 & 0.1086 \\
\phantom{0} 9 & 0.009766 & 0.999023 & 428.26 &\phantom{00} 4.18 & 2.6317 & 0.0257 \\
10 & 0.000977 & 1.000000 & 619.17 &\phantom{00} 0.60 & 2.7918 & 0.0027 \\
 & & & & 116.05 & & 1.9913
\end{array}$$

So we see you have about a $62\%$ chance of losing money, though the chance of losing more than two dollars is less than $38\%.$ On the other hand, you could win over $\$500$; the chance of winning at least $\$100$ is over $17\%,$ while the chance of losing $\$100$ is zero. Add up the ordinary expected value of the game, and it comes out to an expected gain of about $\$16.05.$

But if we take the expected base-ten logarithm of the amount of your money on the table after ten tosses (the number at the bottom of the last column), we see that it is only about $1.9913,$ whereas the base-ten logarithm of your starting amount is exactly $2.$
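
If you want to check these totals yourself, here is a minimal sketch in Python, assuming only the rules of the game:

```python
from math import comb, log10

# Sketch: reproduce the two column totals in the table above.
probs = [comb(10, h) * 0.5**10 for h in range(11)]
money = [100 * 1.2**h * 0.83**(10 - h) for h in range(11)]

print(sum(p * x for p, x in zip(probs, money)))          # E[X]       ~116.05
print(sum(p * log10(x) for p, x in zip(probs, money)))   # E[log10 X] ~1.9913
```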

What does this mean? It means that when you measure your outcomes according to a utility function that is not necessarily identical to the raw outcomes (the number of dollars on the table), the expected utility of the game can be positive or negative, depending on your utility function. If your utility function is $f(x) = x$, this game has positive expected utility, but if your utility function is $f(x) = \log_{10} x,$ this game has negative expected utility. That is, $E(X - 100) > 0$ but $E(\log_{10} X - \log_{10} 100) < 0.$

Note that using a different base of the logarithm will apply a different scaling factor to the utility, but it will still be negative. I used base $10$ because it gives a nice round number for the initial utility, so it is easy to see when you gain and when you lose.

Note that if we apply logarithmic utility in this way to a game where you put $\$100$ on the table and then flip a fair coin once, double or nothing, you have a $\frac12$ chance to increase your utility by about $0.3$ and a $\frac12$ chance to decrease your utility by $\infty.$ This is an absurd result, and is due to the absurdity of using "logarithm of money on the table" as a utility function.

A somewhat more reasonable utility function for a casino game is to assume that you have some reserves of some kind somewhere that you do not put on the table, and take the logarithm of the sum of your reserves plus the money on the table. Suppose we only count the money in your bank account as these "reserves", and assume you had only $\$1000$ at the start of the day and withdrew $\$100$ in order to play the game. Then the utility of your initial state is $\log_{10}1000 = 3,$ and the utility of your outcome is $\log_{10}(900 + X).$ We get the following results:

$$\begin{array}{cccccc}
h & P(H=h) & X & 900 + X & \log_{10}(900+X) & (\log_{10}(900+X))P(H=h) \\
\phantom{0} 0 & 0.000977 & \phantom{0} 15.52 &\phantom{0}915.52 & 2.9617 & 0.0029 \\
\phantom{0} 1 & 0.009766 & \phantom{0} 22.43 &\phantom{0}922.43 & 2.9649 & 0.0290 \\
\phantom{0} 2 & 0.043945 & \phantom{0} 32.43 &\phantom{0}932.43 & 2.9696 & 0.1305 \\
\phantom{0} 3 & 0.117188 & \phantom{0} 46.89 &\phantom{0}946.89 & 2.9763 & 0.3488 \\
\phantom{0} 4 & 0.205078 & \phantom{0} 67.79 &\phantom{0}967.79 & 2.9858 & 0.6123 \\
\phantom{0} 5 & 0.246094 & \phantom{0} 98.02 &\phantom{0}998.02 & 2.9991 & 0.7381 \\
\phantom{0} 6 & 0.205078 & 141.71 & 1041.71 & 3.0177 & 0.6189 \\
\phantom{0} 7 & 0.117188 & 204.88 & 1104.88 & 3.0433 & 0.3566 \\
\phantom{0} 8 & 0.043945 & 296.21 & 1196.21 & 3.0778 & 0.1353 \\
\phantom{0} 9 & 0.009766 & 428.26 & 1328.26 & 3.1233 & 0.0305 \\
10 & 0.000977 & 619.17 & 1519.17 & 3.1816 & 0.0031 \\
 & & & & & 3.0059
\end{array}$$

Since your starting utility was $3,$ this is an expected gain.
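
The bottom line of this second table can be checked the same way, under the same assumption of $\$900$ kept in reserve:

```python
from math import comb, log10

reserve = 900.0     # assumed reserves from the example above
expected_utility = sum(
    comb(10, h) * 0.5**10 * log10(reserve + 100 * 1.2**h * 0.83**(10 - h))
    for h in range(11)
)
print(expected_utility)   # ~3.0059, above the starting utility log10(1000) = 3
```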

Note that one of the assumptions of the research on which the Scientific American article is based is that you're not just taking $\$100$ out of your bank account to play the game; you are effectively putting your entire wealth on the table. That's what makes the losses in that game so devastating. Compare this to a lottery where you have a $0.1\%$ chance to win a large multiple of the ticket price and a $99.9\%$ chance to win nothing. A lottery like this where the price of a single ticket is "everything you own" is a very different matter from a lottery where a ticket costs a dollar.