Wager x dollars on a biased coin flip

probability, probability-theory

You have 100 dollars. You are playing a game where you wager x dollars on a biased coin flip with a 90 percent probability of heads. You make 2x if it’s heads and lose the x dollars if it’s tails. How much do you think you should bet on each flip if you are going to play for 100 flips?

My approach:
If I wager x dollars, then 90% of the time (heads) I would win 2x – x = x dollars, and 10% of the time (tails) I would lose x dollars.
Hence my expected profit is E = 0.9(x) + 0.1(−x) = 0.8x, which is less than the x dollars I wager.
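As a quick numerical sanity check on that arithmetic, here is a minimal Monte Carlo sketch (Python; the function name and parameters are just illustrative choices) that estimates the expected profit of a single flip for a wager of x dollars:

```python
import random

def expected_profit_single_flip(x, p_heads=0.9, trials=1_000_000):
    """Estimate the expected profit of wagering x on one biased flip:
    heads pays out 2x (net gain +x), tails loses the wager (net -x)."""
    total = 0.0
    for _ in range(trials):
        total += x if random.random() < p_heads else -x
    return total / trials

print(expected_profit_single_flip(10))  # should be close to 0.8 * 10 = 8.0
```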

Best Answer

If you just want to maximize your expected winnings, then the best strategy is to bet your entire stake every time. This is because when you bet $x$, your expected gain is $0.9\cdot x+0.1\cdot (-x)=0.8x$, which is monotonically increasing in $x$. This strategy works like a lottery: with overwhelming probability you lose your initial $\$100$, but with the small probability $(0.9)^{100}$ you win a huge prize of $\$(2^{100}-1)\cdot 100$.
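To put numbers on that lottery comparison, here is a short sketch (Python; variable names are mine) of the bet-everything strategy over 100 flips:

```python
p_heads, flips, stake = 0.9, 100, 100

prob_win_everything = p_heads ** flips                    # ~2.66e-5: every flip must be heads
wealth_if_you_do = stake * 2 ** flips                     # wealth doubles on each win
expected_final_wealth = stake * (2 * p_heads) ** flips    # 100 * 1.8^100

print(f"P(survive all 100 flips)  = {prob_win_everything:.2e}")
print(f"final wealth in that case = {wealth_if_you_do:.3e}")
print(f"expected final wealth     = {expected_final_wealth:.3e}")
```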

Most people would agree this strategy is stupid. Instead, we should try to maximize our "worst case" winnings, so we can be confident we will win a lot of money. We cannot guarantee making money, but we can try to maximize the number $t$ such that $P(\text{winnings}\ge t)=0.99$. It turns out the best strategy in this case is given by the Kelly criterion; you should bet $80\%$ of your current wealth each time.
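For reference, this matches the closed-form Kelly fraction: for a bet paying net odds $b$ to $1$ with win probability $p$, the Kelly criterion prescribes betting the fraction $f^*=p-\frac{1-p}{b}$ of your current wealth; here $b=1$ and $p=0.9$, so $f^*=0.9-0.1=0.8$. The derivation below shows where this comes from.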

Let $X_0,X_1,\dots,X_{100}$ be your current wealth at each stage of the game, so $X_0=100$, while $X_{100}$ is your final wealth. Furthermore, let $r_1,r_2,\dots,r_{100}$ be the proportion of your current wealth that you place on each bet, so you bet $r_iX_{i-1}$ on the $i^\text{th}$ bet. Note that each $0\le r_i\le 1$, and
$$
X_{i}/X_{i-1}=\begin{cases} 1+r_i & \text{with probability }0.9\\ 1-r_i & \text{with probability }0.1 \end{cases}
$$
That is, the ratio of our wealth at consecutive stages is a simple, bounded random variable. We can then write
$$
X_{100}=100\cdot (X_1/X_0)\cdot (X_2/X_1)\cdots (X_{100}/X_{99})
$$
so our final wealth is a product of these independent random variables. We now take logarithms of both sides, since sums are nicer to work with than products:
$$
\ln X_{100}=\ln 100+\sum_{i=1}^{100}\ln(X_{i}/X_{i-1})
$$
We have written $\ln X_{100}$ as a sum of independent random variables. The central limit theorem then implies that $\ln X_{100}$ is approximately normally distributed*, with mean equal to $\ln 100$ plus the sum of the means of $\ln (X_i/X_{i-1})$, and variance equal to the sum of the variances of $\ln(X_i/X_{i-1})$. This normal variable is tightly concentrated around its mean, because as you add variables the mean grows linearly while the standard deviation grows only like the square root. This implies that, in order to maximize the $99\%$ security level of your overall profit, you should maximize the expected value of $\ln (X_i/X_{i-1})$ for each $i$, which means maximizing
$$
0.9\ln (1+r_i)+0.1 \ln(1-r_i)
$$
Setting the derivative $\frac{0.9}{1+r_i}-\frac{0.1}{1-r_i}$ to zero and solving shows this is maximized at $r_i=0.8$, which is exactly the Kelly criterion.
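If you would rather check that maximization numerically than by calculus, here is a minimal sketch (Python; the grid search and function name are my own illustration) of the per-flip expected log-growth:

```python
import math

def log_growth(r, p_heads=0.9):
    """Expected log-growth of wealth per flip when betting fraction r."""
    return p_heads * math.log(1 + r) + (1 - p_heads) * math.log(1 - r)

# Grid search over fractions r in [0, 1); the maximum should land at r = 0.8.
best_r = max((i / 1000 for i in range(1000)), key=log_growth)
print(best_r, log_growth(best_r))  # ~0.8, ~0.368 nats of growth per flip
```

At $r=0.8$ the expected log-growth is about $0.368$ per flip, so over 100 flips the typical (median) final wealth is on the order of $100\,e^{36.8}\approx 10^{18}$ dollars.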

* Since the variables are not identically distributed in the case where the $r_i$ are unequal, you need something like the Lyapunov central limit theorem here. Basically, since each $\ln(X_i/X_{i-1})$ contributes a small amount to the total variance, we still get a normal distribution.
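To see the normal approximation and the resulting $99\%$ security level in action, here is a small simulation sketch (Python; names, sample sizes, and seed are arbitrary choices of mine) that bets a fixed $80\%$ of current wealth on each of the 100 flips:

```python
import math
import random
import statistics

def final_wealth(r=0.8, flips=100, start=100.0, p_heads=0.9):
    """Final wealth after `flips` bets of a fixed fraction r of current wealth."""
    wealth = start
    for _ in range(flips):
        wealth *= (1 + r) if random.random() < p_heads else (1 - r)
    return wealth

random.seed(0)
log_wealth = sorted(math.log(final_wealth()) for _ in range(20_000))

# Sample mean and stdev of ln(X_100); the CLT predicts
# mean = ln(100) + 100*(0.9*ln(1.8) + 0.1*ln(0.2)) and stdev = sqrt(100 * per-flip variance).
print(statistics.mean(log_wealth), statistics.stdev(log_wealth))

# Empirical 1% quantile of final wealth, i.e. the t with P(X_100 >= t) ≈ 0.99.
print(math.exp(log_wealth[len(log_wealth) // 100]))
```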