Probability – Kelly Criterion with More Than Two Outcomes

gambling, probability

I want to calculate the Kelly bet for an event with more than two possible outcomes. Suppose the following game:

A jar contains $10$ jelly beans: $7$ black, $2$ blue, and $1$ red. The player wagers $x$ and grabs a single jelly bean at random from the jar. The payouts are as follows:

  • Black Jelly Bean: no payout (i.e. simply lose wager amount $x$)
  • Blue Jelly Bean: net odds received on the wager = $10$
  • Red Jelly Bean: net odds received on the wager = $30$

In essence, the only way to lose the bet is to grab a black jelly bean (i.e. $q = 0.7$), but the net odds received on the wager still depend on whether the player grabs a blue ($b = 10$) or a red ($b = 30$) jelly bean.

How would I calculate the Kelly bet for this game?


Is it correct to simply calculate the Kelly bet for each positive outcome and then find the weighted average for the final wager? For example:

$$x_b = \frac{10\times0.2 - 0.8}{10} = 0.12$$

$$x_r = \frac{30\times0.1 - 0.9}{30} = 0.07$$

$$x = \frac{0.12\times0.2 + 0.07\times0.1}{0.2 + 0.1} \approx 0.103$$

So the amount to wager would be 10.3% of the bankroll.

Or should I have instead found the weighted average of the net odds received on the wager and then calculated the Kelly bet based on the winning outcomes as a whole (i.e. $p = 0.1 + 0.2 = 0.3$)? For example:

$$b = \frac{10\times0.2 + 30\times0.1}{0.2 + 0.1} \approx 16.7$$

$$x = \frac{16.7\times0.3 - 0.7}{16.7} \approx 0.258$$

So the amount to wager would be 25.8% of the bankroll.
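
For reference, here is a small Python sketch that just reproduces the two calculations above (the variable names are my own labels, not standard notation):

```python
# Quick arithmetic check of the two candidate approaches above.
p_blue, b_blue = 0.2, 10   # blue bean: win probability and net odds
p_red,  b_red  = 0.1, 30   # red bean: win probability and net odds

# Approach 1: Kelly fraction per winning outcome, then a probability-weighted average.
x_blue = (b_blue * p_blue - (1 - p_blue)) / b_blue
x_red  = (b_red * p_red - (1 - p_red)) / b_red
x_approach1 = (x_blue * p_blue + x_red * p_red) / (p_blue + p_red)

# Approach 2: probability-weighted average odds, then a single Kelly fraction.
p_win = p_blue + p_red
b_avg = (b_blue * p_blue + b_red * p_red) / p_win
x_approach2 = (b_avg * p_win - (1 - p_win)) / b_avg

print(x_blue, x_red, x_approach1)  # 0.12 0.07 ~0.103
print(b_avg, x_approach2)          # ~16.67 ~0.258
```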

Best Answer

Return to the derivation of the Kelly criterion: Suppose you have $n$ outcomes, which happen with probabilities $p_1$, $p_2$, ..., $p_n$. If outcome $i$ happens, you multiply your bet by $b_i$ (and get back the original bet as well). So for you, $(p_1, p_2, p_3) = (0.7, 0.2, 0.1)$ and $(b_1, b_2, b_3) = (-1, 10, 30)$.

If you have $M$ dollars and bet $xM$ dollars, then the expected value of the log of your bankroll at the next step is $$\sum p_i \log((1-x) M + x M + b_i x M) = \sum p_i \log (1+b_i x) + \log M.$$ You want to maximize $\sum p_i \log(1+b_i x)$. (See most discussions of the Kelly criterion for why this is the right quantity to maximize.)

So we want $$\frac{d}{dx} \sum p_i \log(1+b_i x) =0$$ or $$\sum \frac{p_i b_i}{1+b_i x} =0.$$

I don't see a simple formula for the root of this equation, but any computer algebra system will get you a good numeric answer. In your example, we want to maximize $$f(x) = 0.7 \log(1-x) + 0.2 \log(1+10 x) + 0.1 \log (1+30 x).$$

[Plot of $f(x)$.]
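
For example, a minimal Python/SciPy sketch (assuming SciPy is available; the function name is just my label) that maximizes $f$ over $[0, 1)$:

```python
import numpy as np
from scipy.optimize import minimize_scalar  # bounded 1-D optimizer

def f(x):
    """Expected log-growth per bet for the jelly-bean game."""
    return 0.7 * np.log(1 - x) + 0.2 * np.log(1 + 10 * x) + 0.1 * np.log(1 + 30 * x)

# The Kelly fraction lies in [0, 1); maximize f by minimizing -f on that interval.
res = minimize_scalar(lambda x: -f(x), bounds=(0.0, 1.0 - 1e-9), method="bounded")
print(res.x, f(res.x))  # ~0.248, ~0.263
```

Equivalently, handing the first-order condition above to a one-dimensional root finder such as `scipy.optimize.brentq` gives the same $x$.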

I get that the optimum occurs at $x \approx 0.248$, with $f(0.248) \approx 0.263$. In other words, if you bet a little under a quarter of your bankroll, your bankroll should grow on average by a factor of $e^{0.263} \approx 1.30$ per bet.
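
As an illustrative sanity check (the simulation below, including its seed and sample sizes, is my own choice and not part of the criterion itself), repeatedly playing the game while betting $24.8\%$ of the bankroll each round gives an average per-bet log growth close to $0.263$:

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.7, 0.2, 0.1])      # black, blue, red probabilities
b = np.array([-1.0, 10.0, 30.0])   # net odds for each outcome
x = 0.248                          # Kelly fraction found above

n_paths, n_bets = 10_000, 1_000
# Draw an outcome index for every bet on every simulated path.
outcomes = rng.choice(len(p), size=(n_paths, n_bets), p=p)
# Each bet multiplies the bankroll by (1 + b_i * x).
log_growth = np.log1p(b[outcomes] * x).sum(axis=1)

avg_per_bet = log_growth.mean() / n_bets
print(avg_per_bet, np.exp(avg_per_bet))  # ~0.263, ~1.30
```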