Expected number of minutes for two ants to fall off rope

expected value, independence, probability, uniform distribution

This question is based on this post.

Let's say we have a $1$-foot rope and we place $2$ ants on it uniformly at random. Each ant moves at $1$ ft per minute towards a randomly chosen end and switches direction upon collision. What is the expected length of time until the last ant falls off the rope?

On average the ants are placed at $\frac13$ and $\frac23$ of a foot along the rope (since the positions are i.i.d. uniform, these are the expected positions of the two order statistics). With probability $\frac12$ the ants walk in the same direction, in which case it takes $\frac23$ of a minute for the last ant to fall. If they walk towards each other (probability $\frac14$), it again takes $\frac23$ of a minute for the last ant to fall. If they walk away from each other (probability $\frac14$), it takes $\frac13$ of a minute.

This gives $\frac34\cdot\frac23 + \frac14\cdot\frac13 = \frac7{12}$. However, based on the solution, I believe the correct answer is $\frac23$. Why is this approach invalid?
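
As a sanity check, here is a minimal Monte Carlo sketch. It relies on the standard observation for this puzzle that a collision merely swaps the two ants' identities, so the time at which the last ant falls off equals the later exit time of two non-interacting ants:

```python
import random

def last_exit_time():
    """One trial: two ants placed uniformly on [0, 1] with random directions.

    Since a collision only swaps the ants' identities, the time the last
    ant falls off equals the maximum exit time of two non-interacting ants.
    """
    times = []
    for _ in range(2):
        x = random.random()                # uniform position on the rope
        goes_left = random.random() < 0.5  # each direction with probability 1/2
        times.append(x if goes_left else 1 - x)
    return max(times)

trials = 1_000_000
estimate = sum(last_exit_time() for _ in range(trials)) / trials
print(estimate)  # ~0.667 (i.e. 2/3), not 7/12 ~ 0.583
```

The estimate comes out near $0.667$, consistent with $\frac23$ rather than $\frac7{12}\approx 0.583$.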

Best Answer

There's no reason you should be able to replace a random variable by its expected value and still obtain a correct result. That only works when the answer depends linearly on the random variable, in which case linearity of expectation justifies it. Here, however, the answer is the maximum of two random variables, which is not linear in them, so there is no reason for taking the maximum and taking the expectation to commute.
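
As a sketch of the correct computation: using the same identity-swap observation as above, each ant's exit time $T_i$ is an independent $\operatorname{Unif}(0,1)$ random variable, so

$$\mathbb E\bigl[\max(T_1,T_2)\bigr]=\int_0^1\Pr\bigl(\max(T_1,T_2)>t\bigr)\,dt=\int_0^1\bigl(1-t^2\bigr)\,dt=\frac23,$$

whereas fixing the positions at their expected values before taking the maximum is what produced $\frac7{12}$.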
