Probability – Understanding the Magic Money Tree Problem in Econometrics

Tags: econometrics, martingale, mathematical-statistics, probability, random-walk

I thought of this problem in the shower; it was inspired by investment strategies.

Let's say there was a magic money tree. Every day, you can offer an amount of money to the money tree, and it will either triple it or destroy it with 50/50 probability. You immediately notice that on average you will gain money by doing this and are eager to take advantage of the money tree. However, if you offered all your money at once, you would have a 50% chance of losing all of it. Unacceptable! You are a pretty risk-averse person, so you decide to come up with a strategy. You want to minimize the odds of losing everything, but you also want to make as much money as you can! You come up with the following: every day, you offer 20% of your current capital to the money tree. Assuming the lowest you can offer is 1 cent, it would take a 31-loss streak to lose all your money if you started with 10 dollars. What's more, the more cash you earn, the longer the losing streak needs to be for you to lose everything. Amazing! You quickly start earning loads of cash. But then an idea pops into your head: you could just offer 30% each day and make way more money! But wait, why not offer 35%? 50%? One day, with big dollar signs in your eyes, you run up to the money tree with all your millions and offer up 100% of your cash, which the money tree promptly burns. The next day you get a job at McDonald's.
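(A quick check of that figure: 10 dollars is 1000 cents, and each loss leaves 80% of your capital, so after $n$ straight losses you hold $1000\times 0.8^n$ cents. Since $1000\times 0.8^{30}\approx 1.24$ cents and $1000\times 0.8^{31}\approx 0.99$ cents, the 31st consecutive loss is the one that takes you below a single cent.)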

Is there an optimal percentage of your cash you can offer without losing it all?

Sub-questions:

If there is an optimal percentage to offer, is it static (e.g., 20% every day), or should the percentage grow as your capital increases?

By offering 20% every day, do the odds of losing all your money decrease or increase over time? Is there a percentage above which the odds of losing everything increase over time?

Best Answer

This is a well-known problem; it is called a Kelly bet. The answer, by the way, is 1/3. It is equivalent to maximizing the log utility of wealth.

Kelly began by taking time to infinity and then solving backward. Since you can always express returns in terms of continuous compounding, you can also reverse the process and express them in logs. I am going to use the log-utility explanation, but the log utility is a convenience: if you are maximizing wealth as $n\to\infty$, you end up with a function that works out to be the same as log utility. If $b$ is the net payout odds, $p$ is the probability of winning, and $X$ is the fraction of wealth invested, then the derivation goes as follows.

For a binary bet, the expected log of wealth $W$ after a single period, starting from unit wealth, is $$E[\log(W)]=p\log(1+bX)+(1-p)\log(1-X).$$

$$\frac{d}{dX}E[\log(W)]=\frac{d}{dX}\left[p\log(1+bX)+(1-p)\log(1-X)\right]$$ $$=\frac{pb}{1+bX}-\frac{1-p}{1-X}$$

Setting the derivative to zero to find the extrema,

$$\frac{pb}{1+bX}-\frac{1-p}{1-X}=0$$

Cross multiplying, you end up with $$pb(1-X)-(1-p)(1+bX)=0$$ $$pb-pbX-1-bX+p+pbX=0$$ $$bX=pb-1+p$$ $$X=\frac{bp-(1-p)}{b}$$

In your case, with $b=3$ and $p=\frac{1}{2}$, $$X=\frac{3\times\frac{1}{2}-(1-\frac{1}{2})}{3}=\frac{1}{3}.$$ (This takes $b=3$ as the net payout odds, i.e. a win returns your offer plus three times as much. If "triple" instead means your offer comes back as $3X$ in total, the net odds are $b=2$ and the same formula gives $X=\frac{1}{4}$.)
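As a sanity check (not part of the original derivation), here is a minimal Python sketch that grid-searches the expected log growth per bet under the same $b=3$, $p=\frac{1}{2}$ assumptions; the maximizer lands on the closed-form answer:

```python
import numpy as np

p, b = 0.5, 3.0                      # win probability and net payout odds
X = np.linspace(0.0, 0.99, 10_000)   # candidate fractions of wealth to offer

# Expected log growth per bet: p*log(1 + b*X) + (1 - p)*log(1 - X)
growth = p * np.log1p(b * X) + (1 - p) * np.log1p(-X)

print(X[np.argmax(growth)])          # ~0.3333, matching X = 1/3
```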

You can readily expand this to multiple or continuous outcomes by solving the expected utility of wealth over a joint probability distribution, choosing the allocations and subject to any constraints. Interestingly, if you perform it in this manner, by including constraints, such as the ability to meet mortgage payments and so forth, then you have accounted for your total set of risks and so you have a risk-adjusted or at least risk-controlled solution.
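To illustrate that extension, here is a sketch (under assumed inputs, not the original author's code) of maximizing expected log wealth over discrete joint scenarios with a budget constraint; the scenario matrix `R` and probabilities `probs` are made-up numbers:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example: 3 assets, 4 equally likely joint return scenarios.
# R[s, a] is the net return of asset a in scenario s (made-up numbers).
R = np.array([[ 0.20, -0.10, 0.05],
              [-0.15,  0.25, 0.05],
              [ 0.10,  0.10, 0.05],
              [-0.05, -0.05, 0.05]])
probs = np.full(4, 0.25)

def neg_expected_log_wealth(x):
    # Wealth factor in each scenario is 1 + sum_a x[a] * R[s, a].
    return -probs @ np.log1p(R @ x)

# Constraints: non-negative allocations, at most fully invested.
res = minimize(neg_expected_log_wealth,
               x0=np.full(3, 0.1),
               bounds=[(0.0, 1.0)] * 3,
               constraints=[{"type": "ineq", "fun": lambda x: 1.0 - x.sum()}])
print(res.x)   # Kelly-style allocations under these constraints
```

Adding further constraints (e.g., a floor on scenario wealth to cover obligations like mortgage payments) is just more entries in the `constraints` list, which is how the risk-controlled solution described above comes about.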

Desiderata

The actual purpose of the original research had to do with how much to gamble based on a noisy signal. In the specific case, the question was how much to gamble on a noisy electronic signal indicating the launch of nuclear weapons by the Soviet Union. There have been several near-launches by both the United States and Russia, obviously in error. How much do you gamble on a signal?
