[Math] Expected value of a biased coin toss

probability

Please help me calculate the expected value.

Consider a biased coin such that the probability of tails is $p$ and the probability of heads is $1-p$. The coin is tossed repeatedly until it shows heads, and each toss costs 1. Listing each outcome with its contribution (number of tosses times probability) to the expectation:

H: $1-p$

TH: $2p(1-p)$

TTH: $3p^2(1-p)$

$\vdots$

In sum, truncating after $n$ terms, $E[X] = 1 + p + p^2 + \cdots + p^{n-1} - np^n = \frac{1-p^n}{1-p} - np^n$.

Since $p^n$ goes to zero exponentially as $n \to \infty$, the term $np^n$ vanishes in the limit, and the $p^n$ in the numerator vanishes as well:

$E[X]= \frac{1}{1-p}$
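
As a quick check, a fair coin ($p=\tfrac12$) gives $E[X] = \frac{1}{1-\frac12} = 2$, i.e. two tosses on average until the first head.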

I am not sure whether my derivation is correct. Also, it seems this kind of experiment has a standard name, something like a Bernoulli trial. Does anybody know the name of this game and how to derive the expected value?

Thank you in advance.

Best Answer

Okay. Let's see.

$X$ is the count of tosses until the first head. This is a random variable with some distribution. I wonder which?

We have that $\mathsf P(X=x) = p^{x-1}(1-p)$, where $p$ is the probability of getting a tail on any particular toss, for $x \in \{1, 2, 3, \ldots\}$.

Then $\mathsf E(X) = (1-p)\sum_{x=1}^\infty x\,p^{x-1}$, and that series is the derivative of a geometric series. (This may be a clue as to the name of our mystery distribution. Hmmm.)
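
Differentiating the geometric series term by term (valid for $0 \le p < 1$) gives

$$\sum_{x=1}^\infty x\,p^{x-1} = \frac{d}{dp}\sum_{x=0}^\infty p^{x} = \frac{d}{dp}\,\frac 1{1-p} = \frac 1{(1-p)^2}\,,$$

so that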

$$\mathsf E(X) = \frac 1 {1-p}$$


An alternative derivation uses the Law of Iterated Expectation (a.k.a. the Law of Total Expectation): partition on the result of the first toss. If we get a tail, we are up one toss and the experiment starts over; otherwise the experiment ends and the count is $1$. So the expectation satisfies a recursive equation:

$$\mathsf E(X) = p \cdot (1+\mathsf E(X)) + (1-p)\cdot 1 = 1 + p\,\mathsf E(X) \\[2ex] \therefore \mathsf E(X) = \frac 1{1-p}$$
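
As a numerical sanity check, here is a minimal Monte Carlo sketch (Python with the standard `random` module; the tail probability $p = 0.3$ and the trial count are arbitrary choices):

```python
import random

def tosses_until_heads(p_tail):
    """Play one game: toss until heads shows; return the number of tosses."""
    count = 1
    while random.random() < p_tail:  # tails with probability p, so toss again
        count += 1
    return count

p = 0.3           # arbitrary tail probability
trials = 200_000  # arbitrary number of simulated games
avg = sum(tosses_until_heads(p) for _ in range(trials)) / trials
print(f"simulated E[X] = {avg:.4f}, formula 1/(1-p) = {1 / (1 - p):.4f}")
```

For $p = 0.3$ the simulated mean should land near $1/0.7 \approx 1.43$.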