[Math] How to buy a car optimally in this case

probability

X has a car. Its value is unknown to you, but it is uniformly distributed between 0 and 1000.

You offer a price to buy the car.


If price < value, you can’t buy.


If price >= value, you can buy; you pay the offered price to X.

(for example, if the value is 200 and you offer 300, you can buy, but you must pay 300 to X)


If your purchase is successful, Y will pay you 1.5 * value to buy the car from you.

(for example, if the value is 200 and you offer 300, you buy the car for 300, then Y pays you 1.5*200 = 300 and takes the car)

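The rules above can be sketched as a quick Monte Carlo simulation (a minimal sketch; `play_round` and the variable names are my own, not from the original question):

```python
import random

def play_round(offer, rng=random):
    """One round of the game: the car's value is uniform on [0, 1000]."""
    value = rng.uniform(0, 1000)
    if offer < value:
        return 0.0                  # price < value: you can't buy
    return 1.5 * value - offer      # you pay the offer, Y pays 1.5 * value

# average profit of offering 300, over many rounds
random.seed(0)
n = 100_000
avg = sum(play_round(300) for _ in range(n)) / n
```

Running this for any positive offer gives a negative average, which previews the answer below.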

I think first of all I need to define the goal: in the end I should have more money than I started with.

Assume value is 200.

If I offer too low a price, nothing happens, which doesn't make much sense.

If I offer 300, I buy the car and then sell it to Y for 300; in the end I still have 300, unchanged, which also doesn't make much sense.

If I offer 250, then I earn 50 in the end; this makes sense.

If I offer 400, then I actually lose 100, which is even worse.
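The worked examples above (value fixed at 200) can be checked with a few lines (a sketch; `profit` is my own name):

```python
def profit(offer, value=200):
    """Profit when the car's true value is known in advance."""
    if offer < value:
        return 0.0                 # offer too low: nothing happens
    return 1.5 * value - offer     # buy at `offer`, resell to Y at 1.5 * value

# profit(250) ==  50.0   -> a gain
# profit(300) ==   0.0   -> break even
# profit(400) == -100.0  -> a loss
```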

What should I do?


Update

Here is my thinking.

Let m be the price I am going to offer and v be the value of the car.

What I want is that after everything (including the case where I fail to buy the car), the money in my hand is still at least m.

There are three cases:

1. I can't buy the car, i.e., v > m

Because 0 <= v <= 1000, the probability of v > m is (1000-m)/1000.

2. I buy the car and I get less than m in the end, i.e., m > 1.5v

Since m > 1.5v means v < m/1.5, the probability is (m/1.5)/1000 = m/1500.

3. I buy the car and I get more than m in the end, i.e., 1.5v >= m >= v

The probability is 1 - (1000-m)/1000 - m/1500 = m/3000.

I wish to be in case 1 or case 3, so (1000-m)/1000 + m/3000 > m/1500. Solving this gives 0 <= m < 750.

As long as I offer a price between 0 and 750, the probability that I don't lose money is bigger than the probability that I do (imagine playing this 10000 times).
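The three case probabilities can be tabulated and checked numerically (a sketch using the formulas above; `case_probs` is my own name):

```python
def case_probs(m):
    """Probabilities of the three cases for an offer 0 <= m <= 1000."""
    p_no_buy   = (1000 - m) / 1000   # case 1: v > m
    p_buy_lose = m / 1500            # case 2: v <= m and 1.5v < m
    p_buy_win  = m / 3000            # case 3: m/1.5 <= v <= m
    return p_no_buy, p_buy_lose, p_buy_win

# the cases are exhaustive, so the probabilities sum to 1
p1, p2, p3 = case_probs(600)       # below 750: p1 + p3 > p2
q1, q2, q3 = case_probs(800)       # above 750: q1 + q3 < q2
```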

Best Answer

Your optimal strategy is to offer 0, i.e. don't buy the car at all. Intuitively, you can see this because if you pay $p$ to buy the car from X, there's a 50% chance that the car's value $v$ is less than half of $p$, but Y will only pay you $1.5v$. You would need Y to pay you at least $2v$ to break even on average.

You can make this more formal by defining $V$ to be the price of the car, uniformly distributed between 0 and 1 (working in units of 1000s is easier). Let $p$ be the price you offer.

Then in case (1), the probability of the price being at least the value is $$P(p \ge V) = p$$ by the definition of the uniform distribution. The profit in this case is the money Y pays you, $1.5V$, less the price you paid, $p$. If we let $Q$ be a random variable for the value of the car given that we know the value is $\le p$, then $Q$ is uniformly distributed on $(0,p)$ and its expectation is therefore $p/2$. Hence the expected profit is $E(R \mid \mathrm{bought}) = 1.5p/2 - p$.

In case (2), the probability of the price being less than the value is $$P(p < V) = 1-p$$ and the profit $R$ in this case has an expectation of 0, because we're neither buying nor selling the car.

Hence by the law of total expectation, the expected profit is the profit in each case times the probability of that case $$ \begin{align} E(R) &= E(R|\mathrm{bought})P(\mathrm{bought}) + E(R|\neg\mathrm{bought})P(\neg\mathrm{bought}) \\ &= (1.5p/2 - p) \times p \;+\; 0 \times (1-p) \\ &= -0.25p^2 \end{align}$$

This is negative, so you wouldn't want to make an offer at all! With a price $d$ in your original units of currency, your expected loss would be $0.25(d/1000)^2\times1000 = d^2/4000$.
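The closed form $E(R) = -0.25p^2$ can be checked by numerically integrating the profit over $V \sim U(0,1)$ (a sketch using the midpoint rule; `expected_profit` is my own name):

```python
def expected_profit(p, steps=200_000):
    """Midpoint-rule estimate of E[(1.5*V - p) on {V <= p}] for V ~ U(0, 1)."""
    total = 0.0
    for i in range(steps):
        v = (i + 0.5) / steps      # midpoint sample of V
        if p >= v:                 # bought: Y pays 1.5v, we paid p
            total += 1.5 * v - p
    return total / steps

# expected_profit(0.3) is close to -0.25 * 0.3**2 = -0.0225
```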


Re your update: the probabilities look correct. However, it is the amount of money you expect to make that matters, not the probability of making it. For example, given these two games, which one would you play?

  • You have a 90% chance of winning \$1, and a 10% chance of losing \$1 million.

  • You have a 10% chance of winning \$1 million, and a 90% chance of losing \$1.

I hope you can see from this that the probability of ending up with more money isn't the most important thing: it's how much more money you end up with on average (the expectation) that's important.
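A two-line check of the expectations of those two games (my own variable names):

```python
# game A: 90% chance of winning $1, 10% chance of losing $1 million
game_a = 0.90 * 1 + 0.10 * (-1_000_000)

# game B: 10% chance of winning $1 million, 90% chance of losing $1
game_b = 0.10 * 1_000_000 + 0.90 * (-1)

# game A wins more often, but game B is the one worth playing
```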

In your case, the expectation associated with cases 1 and 3 is relatively low, because you've included case 1, where no money changes hands. If you assumed that the payoffs for winning and losing are similar, and wanted to look at probabilities alone, you would compare case 2 (buy the car and get less) with case 3 (buy the car and get more), and notice that $$P(\mathrm{case\ 2,\ lose}) = m/1500 > m/3000 = P(\mathrm{case\ 3,\ win})$$
