Calculating Upper Bounds for Probability when Extrapolating

Tags: expected value, probability, probability theory, probability-limit-theorems

This question is from the MIT 6.042J 2005 Final exam.
The problem asks for an upper bound on the probability that a program's runtime is $\geq 60$ seconds, given that the expected runtime is 10 seconds. I know this can be answered using Markov's bound:
$$
\Pr(T\geq 60) \leq \frac{E[T]}{60} = \frac{10}{60} = \frac 16
$$
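
For intuition, here is a minimal simulation sketch. It assumes, purely for illustration, that runtimes are exponentially distributed with mean 10 seconds; that distribution is my own hypothetical choice, since Markov's bound itself requires nothing beyond $T \geq 0$ and $E[T] = 10$:

```python
import random

random.seed(0)

MEAN = 10.0     # given: expected runtime in seconds
CUTOFF = 60.0   # threshold from the problem
N = 100_000     # number of simulated runs

# Hypothetical distribution, for illustration only: exponential with mean 10.
# Markov's bound holds for any nonnegative T with E[T] = 10.
runs = [random.expovariate(1 / MEAN) for _ in range(N)]
p_hat = sum(t >= CUTOFF for t in runs) / N

print(f"empirical Pr(T >= 60) ~ {p_hat:.4f}")          # ~ e^(-6) ~ 0.0025
print(f"Markov upper bound    = {MEAN / CUTOFF:.4f}")  # 1/6 ~ 0.1667
```

Under this assumed distribution the true probability is far below $1/6$, which is consistent with the bound: Markov's inequality is an upper bound, not an estimate.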

The follow-up part then asks:

"Suppose I decide to run the algorithm for 1 minute and if I don’t get an answer by that time, I stop what I am doing, and completely restart from scratch. Each time that I stop and restart the algorithm gives me an independent run of the algorithm. So, what is an upper bound on the probability that my algorithm takes longer than 5 minutes to get an answer?"

I'm not sure how to solve this part. The given answer is $\frac{1}{6^5}$, which I'm guessing has to do with the extension of Markov's theorem to higher moments:
$$
\Pr(T\geq x) \leq {E[T^k] \over x^k}
$$

However, I'm not sure whether this is the right tool, and I'm also not clear on how to apply this formula here. In particular, as I understand it, we are trying to bound the probability that the total runtime exceeds $5 \times 60 = 300$ seconds, even though each individual run is cut off at 60 seconds, and extrapolating beyond the length of a single run is something I haven't encountered before.

Would anyone have an explanation for this question?

Best Answer

The total time exceeds 5 minutes if and only if each of your first 5 independent trials fails to finish within 60 seconds. By the Markov bound from the first part, each trial fails to finish within 60 seconds with probability at most $1/6$, and the trials are independent, so
$$
\Pr(\text{total} > 300) = \prod_{i=1}^{5} \Pr(T_i \geq 60) \leq \left(\frac 16\right)^5 = \frac{1}{6^5}.
$$
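
To see the restart argument numerically, here is a short sketch using the extremal distribution for Markov's bound (a hypothetical choice of mine, not part of the problem): $T = 60$ with probability $1/6$ and $T = 0$ otherwise, so $E[T] = 10$ and $\Pr(T \geq 60)$ is exactly $1/6$, making the bound tight:

```python
import random

random.seed(1)
N = 1_000_000  # number of simulated "5-minute budget" experiments

# Extremal distribution for Markov's bound (hypothetical, for illustration):
# T = 60 s with probability 1/6, T = 0 s otherwise, so E[T] = 10.
def attempt_times_out() -> bool:
    return random.random() < 1 / 6

def takes_over_five_minutes() -> bool:
    # Total time exceeds 300 s iff the first 5 independent attempts
    # all hit the 60-second cutoff and force a restart.
    return all(attempt_times_out() for _ in range(5))

p_hat = sum(takes_over_five_minutes() for _ in range(N)) / N
print(f"empirical Pr(total > 300 s) ~ {p_hat:.2e}")
print(f"bound (1/6)^5               = {(1/6) ** 5:.2e}")  # ~ 1.29e-4
```

For any other distribution with mean 10 the timeout probability is smaller, so $(1/6)^5$ remains an upper bound; no higher-moment version of Markov's inequality is needed.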
