Two-horse race probabilities

probability, risk-assessment

I have a rather basic question about probabilities, and this is a reference request.

Assume you have 100 units of something, and you have to bet all 100 units on two events, A or B. You can distribute the money between them, for instance 50-50.

If event $A$ happens, you win $r_A$ times your stake on $A$, and if $B$ happens, you win $r_B$ times your stake on $B$. Think of $A$ and $B$ as a two-horse race on which you have to place your 100 units (money laundering or whatever).

My question is: what is the best choice here? If one writes the expected reward as
$$x r_A P(A) + (100-x) r_B P(B),$$
where $x$ is the amount placed on $A$, then maximizing it tells you to put everything either on $A$ or on $B$, because the expression is linear in $x$. Intuitively that seems wrong: if $A$ paid an enormous amount but had only a tiny probability of happening, I wouldn't put everything on $A$; I would place some insurance on $B$ so as not to lose everything.
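
For concreteness, here is a minimal numerical sketch of that linearity; the payouts and probabilities below are invented purely for illustration.

```python
# A minimal sketch; r_A, r_B, P(A), P(B) below are invented for illustration.
r_A, r_B = 3.0, 1.5          # payout multipliers for A and B
p_A, p_B = 0.3, 0.7          # probabilities of A and B

def expected_reward(x, total=100):
    """Expected reward when x units are placed on A and the rest on B."""
    return x * r_A * p_A + (total - x) * r_B * p_B

# The expression is affine in x, so it changes at a constant rate and the
# maximum is always at an endpoint, x = 0 or x = 100, depending on the
# sign of r_A * P(A) - r_B * P(B).
for x in range(0, 101, 25):
    print(f"x = {x:3d}  expected reward = {expected_reward(x):6.1f}")
```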

The intuition is to minimize the risk. Is there a standard way of doing this?

Thanks

Best Answer

One way, which I think is standard in some applications, is to maximise the expected logarithm of the ratio of the outcome to the original amount (which comes to the same thing as maximising the expected logarithm of the outcome, of course). The rationale is that if you have to make similar bets several times, each time wagering all of your current capital, then the logarithm of the ratio of the final outcome to the starting capital is the sum of the logarithms of the ratios for the individual bets, so its expectation is the sum of the individual expected log-ratios. Losing all your money is very bad because you can never recover from it.
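
Spelled out: writing $C_i$ for your capital after the $i$-th bet and $\rho_i = C_i/C_{i-1}$ for the ratio achieved by that bet (notation introduced here just for this step),
$$\log\frac{C_n}{C_0}=\sum_{i=1}^{n}\log\rho_i,$$
so by linearity of expectation the expected log-ratio over the whole sequence is the sum of the expected log-ratios of the individual bets.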

However, doing this here gives another, perhaps unexpected, answer. The expected logarithm of the ratio is $$P(A)\log x+P(A)\log r_A+P(B)\log(100-x)+P(B)\log r_B-\log 100.$$ Most of these terms do not depend on $x$, and Gibbs' inequality implies that the best thing to do is to set $$x=100\times\frac{P(A)}{P(A)+P(B)},$$ which doesn't depend on the payouts at all! (In a genuine two-horse race with $P(A)+P(B)=1$, this is simply $x=100\,P(A)$: bet in proportion to the probabilities.)
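
As a quick sanity check, one can maximise this expected logarithm numerically on a grid and compare with the closed form; the values of $P(A)$, $P(B)$, $r_A$, $r_B$ below are hypothetical.

```python
# Quick numerical check; P(A), P(B), r_A, r_B below are hypothetical.
import numpy as np

p_A, p_B = 0.3, 0.7
r_A, r_B = 5.0, 1.2

x = np.linspace(1, 99, 9801)          # grid over (0, 100), avoiding log(0)
expected_log = (p_A * (np.log(x) + np.log(r_A))
                + p_B * (np.log(100 - x) + np.log(r_B))
                - np.log(100))

print("numerical optimum:", x[np.argmax(expected_log)])
print("closed form      :", 100 * p_A / (p_A + p_B))
```

Both printed values agree (about 30 with these numbers), and changing $r_A$ or $r_B$ shifts the expected logarithm but not the optimal split.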

This method applied to betting is called the "Kelly criterion".
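
To see why "losing everything is unrecoverable" matters, here is a rough simulation sketch in the same spirit (the probabilities and payouts are again hypothetical), comparing a bettor who always goes all-in on $A$ with one who splits capital in proportion to the probabilities.

```python
# A short simulation of repeated races; the probabilities and payouts below
# are hypothetical, chosen only to contrast the two strategies.
import random

random.seed(0)
p_A, r_A = 0.5, 3.0
p_B, r_B = 0.5, 1.5

def run(fraction_on_A, n_races=200, capital=100.0):
    """Bet the whole current capital each race, split by a fixed fraction."""
    for _ in range(n_races):
        stake_A = capital * fraction_on_A
        stake_B = capital - stake_A
        capital = stake_A * r_A if random.random() < p_A else stake_B * r_B
        if capital == 0.0:
            break                     # ruined; no way to recover
    return capital

print("all on A          :", run(1.0))
print("proportional split:", run(p_A / (p_A + p_B)))   # the log-optimal fraction
```

The all-in bettor is wiped out the first time $B$ wins, while the proportional bettor never hits zero and, with these particular payouts, tends to grow over many races.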