Coin Toss Expected Value Against Intuition

expected value, intuition, probability

I have the following two coin toss games:

Game 1:
A and B toss a coin. At first the coin is fair. Throughout the game, if heads comes up, A wins and the game stops. If tails comes up, the coin is swapped for a biased coin for which the probability of heads is 1/3, and the game continues. In general, on the kth turn the probability of heads is 1/(k+1); if tails comes up, the coin is swapped for another biased coin for which the probability of heads becomes 1/(k+2). What is the expected number of turns until the game ends, i.e. until heads shows up?

Game 2:
The game is the same, but the biases of the coins are different. This time, if the probability of heads on the kth turn is 1/2^k, then on the next turn it is 1/2^(k+1). So the probabilities go 1/2, 1/4, 1/8, 1/16, ... . What is the expected number of turns until the game ends?

After calculating, I found the expected value in game 1 to be divergent and in game 2 to be smaller than 2, which feels very counter-intuitive, since A seems to have a greater chance of winning in game 1.
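A minimal Monte Carlo sketch of the two games, just as a sanity check (the `simulate` helper and its parameters are my own; runs are capped at `max_turns` because a run of game 2 may never end):

```python
import random

def simulate(p_heads, trials=100_000, max_turns=10_000):
    """Play many runs of a game where turn k shows heads with probability p_heads(k).
    Returns (mean turns among finished runs, fraction of runs that hit the cap)."""
    lengths, unfinished = [], 0
    for _ in range(trials):
        for k in range(1, max_turns + 1):
            if random.random() < p_heads(k):
                lengths.append(k)
                break
        else:
            unfinished += 1
    return sum(lengths) / len(lengths), unfinished / trials

# Game 1: heads with probability 1/(k+1).  Game 2: heads with probability 1/2^k.
print("game 1:", simulate(lambda k: 1 / (k + 1)))
print("game 2:", simulate(lambda k: 0.5 ** k, max_turns=60))
```

With this cap, the game 1 sample mean keeps creeping up as `max_turns` grows (roughly like the logarithm of the cap), while the game 2 mean stays below 2 and roughly 29% of its runs never finish.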

So my question is: what does a divergent expected value actually mean? Do you think I made a mistake in the calculations, or does the problem really go against intuition?

Best Answer

Some quick calculations, to showcase (presumably) what the OP has done.

For the first game, the probability that the game ends on the $k$th turn is $\frac{(k-1)!}{(k+1)!}=\frac{1}{k(k+1)}$ (the first $k-1$ tosses must be tails and the $k$th must be heads). Thus when $X$ is the number of turns, $$E(X)=\sum_{k=1}^\infty k\cdot\frac{1}{k(k+1)}=\sum_{k=1}^\infty\frac{1}{k+1}=\infty,$$ since the harmonic series diverges.
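A throwaway numeric check of this divergence (the cutoffs below are arbitrary): the partial sums just keep growing, roughly like $\ln N$.

```python
# Partial sums of E(X) = sum_k k * P(X = k) = sum_k 1/(k+1) for game 1.
for N in (10, 1_000, 100_000, 1_000_000):
    print(N, sum(1 / (k + 1) for k in range(1, N + 1)))
```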

For game two, the probability that the game ends on the $k$th turn is strictly less than $\frac{1}{2^k}$ (well, equal for $k=1$, but strictly less otherwise). Thus when $X$ is the number of turns, $$E(X)<\sum_{k=1}^\infty \frac{k}{2^k}=2$$
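A small numeric check of both sides of this bound (the 200-term cutoff is arbitrary; the tail is negligible), using $P(X=k)=\left(\prod_{j<k}(1-1/2^j)\right)\cdot\frac{1}{2^k}$:

```python
# Compare the bounding series sum_k k/2^k (which sums to 2) with the
# actual sum over the finite outcomes, sum_k k * P(X = k).
bound = finite_part = 0.0
survive = 1.0          # probability the game is still running when turn k starts
for k in range(1, 200):
    bound += k / 2**k
    finite_part += k * survive / 2**k
    survive *= 1 - 1 / 2**k
print(bound, finite_part)   # roughly 2.0 and roughly 1.09
```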

as you said.

Now why is this?

Well, think about what we assumed: we assumed that the game would end. In game 1, the probability that the game lasts at least $k$ turns is $\frac{1}{k}$, which tends to $0$, so the game ends with probability $1$. But for game 2, the probability that the game lasts forever is...

$$\prod_{i=1}^\infty\frac{2^i-1}{2^i}\approx 0.288$$
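Numerically, a truncated version of this product already settles down (factors beyond $i=60$ differ from $1$ by less than double-precision machine epsilon):

```python
# Truncated version of the infinite product prod_{i>=1} (1 - 1/2^i).
p_never = 1.0
for i in range(1, 61):
    p_never *= 1 - 1 / 2**i
print(p_never)   # about 0.288788
```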

(Oh, and there is an OEIS entry for this product too, because why not?)

Thus, the second game actually has a chance that you'll never finish! What does this tell you about the calculated expected value?
