[Math] Coin flipping games – dependent trials

probability, sequences-and-series

I'm still trying to learn probability, and to that end I have posed myself two questions about coin-flipping games. I don't know how to answer these questions because these aren't independent trials of flips. I have run Monte Carlo simulations of these games, so I know what the answers are, but I don't have analytical solutions.

Consider the following "games":

Easy: A fair coin is flipped until it lands on tails. Each time the coin is flipped, the player receives one point. What is the expected number of points the player will receive in a game?

Harder: An unfair coin is flipped until it lands on tails. The coin lands on tails with probability $p$. ($p$ is an independent variable – it does not change throughout the series.) Each time that the coin is flipped, the player receives twice as many points as the last time that the coin was flipped. For example, the first flip gives 1 point, the second 2, the third 4. What is the expected number of points the player will receive in a game?

Since the purpose of answering these questions is to learn the underlying math, I would appreciate any links to relevant reference material. If my terminology is unclear or nonstandard please correct it. Thanks in advance and I hope this question is appropriate to this forum – this is my first post here.

Edit: in answer to Mark's comment, here is what I have.

For the easier game:

I know how to calculate the probability that the game ends on flip $x$.

It is equal to the probability that the game did not end on any previous flip times the probability of getting tails on flip $x$.

The probability of getting the first tails on the first flip is $.5$.

The probability of getting the first tails on the second flip is the probability of getting heads on the first flip times the probability of getting tails on the second flip, i.e. $.5\times.5$.

The probability of getting the first tails on the third flip is $P(\text{heads first}) \times P(\text{heads second}) \times P(\text{tails third})$.

Since the probability of getting heads or tails on any particular flip is $.5$ in the first example, I think this reduces nicely to this:

$1 + p^1 + p^2 + p^3 + \cdots + p^x + \cdots$

So the total EV is the sum of this infinite series (which I don't know how to compute).

For the harder game, I guess it's not really that much harder.

The probability of the game ending on any particular flip is a similar calculation. If the probability of tails is $p$, then the probability of heads is $1-p$.

So the probability function that tells me the probability that the game ended at flip $x$ is

$P(x) = (1-p)^{x-1} \times p$

The value of the game is something like

$2^x(P(x))$

summed for all $x$ from $1$ to infinity. Again, I don't know how to sum that.
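As a quick numerical sanity check on that $P(x)$, here is a rough sketch (the example value $p = 0.4$ and the cutoff are arbitrary choices on my part):

```python
# Check that P(x) = (1-p)^(x-1) * p behaves like a probability distribution:
# the terms are non-negative and should sum to (approximately) 1.
p = 0.4                      # example tails probability; any 0 < p <= 1 works
prob_mass = [(1 - p) ** (x - 1) * p for x in range(1, 200)]
print(sum(prob_mass))        # approximately 1.0
print(prob_mass[:3])         # P(1), P(2), P(3) = 0.4, 0.24, 0.144
```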

Best Answer

First, let's answer your first question.

"A fair coin is flipped until it lands on tails. Each time the coin is flipped, the player receives one point. What is the expected number of points the player will receive in a game?"

Let us compute the probability that the player receives exactly $k$ points. This happens when the first $k$ coin flips land on heads and the $(k+1)^{th}$ flip lands on a tail. The probability of this event is hence $$\mathbb{P}(\text{Player receiving $k$ points}) = \underbrace{\frac12 \times \frac12 \times \cdots \frac12}_{\text{The first $k$ flips land on heads}} \times \underbrace{\frac12}_{\text{The $(k+1)^{th}$ flip lands on a tail}} = \frac1{2^{k+1}}$$

Just as a sanity check, we can see that $$\sum_{k=0}^{\infty} \mathbb{P}(\text{Player receiving $k$ points}) = \sum_{k=0}^{\infty} \frac1{2^{k+1}} = 1$$

The expected number of points the player wins is hence $$\sum_{k=0}^{\infty} \frac{k}{2^{k+1}} = 1$$
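In case that last sum is unclear, one standard way to evaluate it is to differentiate the geometric series term by term. For $|x| < 1$, $$\sum_{k=0}^{\infty} x^k = \frac{1}{1-x} \implies \sum_{k=1}^{\infty} k x^{k-1} = \frac{1}{(1-x)^2} \implies \sum_{k=0}^{\infty} k x^k = \frac{x}{(1-x)^2},$$ and setting $x = \frac12$ gives $\sum_{k=0}^{\infty} \frac{k}{2^{k+1}} = \frac12 \cdot \frac{1/2}{(1/2)^2} = 1$.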

Now let's move on to the second question.

"An unfair coin is flipped until it lands on tails. The coin lands on tails with probability $p$. ($p$ is an independent variable - it does not change throughout the series). Each time that the coin is flipped, the player receives twice as many points as the last time that the coin was flipped. For example, the first flip gives $1$ point, the second $2$, the third $4$. What is the expected number of points the player will receive in a game?"

Note that the player can only receive point totals of the form $$2^{k}-1$$ which happens when the coin flips yield $k$ consecutive heads followed by a tail on the $(k+1)^{th}$ flip. Let $q = 1-p$ be the probability of landing on heads.

As before, the probability that the player gets $2^{k} - 1$ points is $q^k \times (1-q)$, i.e.

\begin{align} \mathbb{P}(\text{Player receiving $2^k-1$ points}) & = \mathbb{P}(\text{Heads the first $k$ times and tail the $(k+1)^{th}$ time})\\ & = \underbrace{q \times q \times \cdots q}_{\text{The first $k$ flips land on heads}} \times \underbrace{(1-q)}_{\text{The $(k+1)^{th}$ flip lands on a tail}}\\ & = q^k ( 1-q) \end{align}

Hence, the expected number of points he will win is $$\sum_{k=0}^{\infty} (2^k-1) \times q^k \times (1-q) = \left( \sum_{k=0}^{\infty} (2q)^k (1-q) \right) - 1 = \begin{cases} \frac{q}{1-2q} & \text{if $q < \frac12$ }\\ \infty & \text{else} \end{cases}$$
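As a quick numerical sanity check of this closed form, here is a rough sketch comparing it with a truncated version of the series and a direct simulation (the choice $q = 0.2$, the truncation point, and the trial count are all arbitrary):

```python
import random

q = 0.2                       # heads probability; must be < 1/2 for a finite expectation
closed_form = q / (1 - 2 * q)

# Truncate the series sum_{k>=0} (2^k - 1) * q^k * (1 - q) after 200 terms.
partial_sum = sum((2 ** k - 1) * q ** k * (1 - q) for k in range(200))

def play():
    # Count heads until the first tails, then score 2^k - 1 points for k heads.
    heads = 0
    while random.random() < q:    # heads with probability q
        heads += 1
    return 2 ** heads - 1

trials = 200_000
simulated = sum(play() for _ in range(trials)) / trials

print(closed_form, partial_sum, simulated)   # all roughly 1/3 for q = 0.2
```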

The fact that the expectation is $\infty$ if $q \geq \frac12$ is essentially the well-known St. Petersburg paradox.

EDIT: If you have trouble summing geometric series, then I would suggest you look at the following post, Value of $\sum\limits_n x^n$, for details on how to evaluate such summations.
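For quick reference, the basic identity, valid for $|x| < 1$, comes from the usual telescoping trick: $$S = \sum_{n=0}^{\infty} x^n \implies S - xS = 1 \implies S = \frac{1}{1-x}.$$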
