Expectation of a geometric random variable conditioned on the outcome of the first toss

conditional-expectation, expected-value, probability

I'm working through Blitzstein and Hwang's Intro to probability.

There is an example (9.1.8) in the conditional expectation chapter.
$X$ is a geometric random variable tracking the first Heads in a sequence of coin tosses.
They use the law of total expectation to get $E[X]$ by conditioning on the outcome of the first toss.

$$ E[X] = E[X|\text{first toss Heads}]p + E[X|\text{first toss Tails}](1-p)$$
which results in
$$E[X] = \frac{1-p}{p}$$
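
Spelling out the intermediate step as I understand it (with $X$ counting the Tails before the first Heads, as the example does), the two conditional expectations are $E[X|\text{first toss Heads}] = 0$ and $E[X|\text{first toss Tails}] = 1 + E[X]$, so

$$E[X] = 0 \cdot p + (1 + E[X])(1-p) \implies pE[X] = 1-p \implies E[X] = \frac{1-p}{p}.$$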

The calculation makes sense, but I'm confused by the result.
Why doesn't $E[X]$ come out to $\frac{1}{p}$, the expectation of a geometric random variable?

Thank you!

The book linked above is made freely available by the authors.

Best Answer

Thanks for reading my book! The issue is that, unfortunately, there are 4 different conventions on how the Geometric distribution is defined: number of failures before the first success, number of trials until the first success (including the first success), number of successes before the first failure, number of trials until the first failure (including the first failure).

In the book we define the Geometric as the number of failures before the first success (so the support starts at $0$), and the First Success as the number of trials until the first success (including the first success, so the support starts at $1$). Then a Geom($p$) r.v. has mean $\frac{1-p}{p}$ and an FS($p$) r.v. has mean $\frac{1-p}{p}+ 1 = \frac{1}{p}$.
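
As a quick sanity check on the two conventions, here is a minimal simulation sketch (not from the book; the function name and parameters are my own) comparing the mean of the book's Geom($p$), which counts failures before the first success, with the mean of FS($p$), which counts trials including the first success:

```python
import random

def sim_means(p, n_sims=100_000):
    """Estimate E[Geom(p)] (failures before the first success) and
    E[FS(p)] (trials including the first success) by simulation."""
    geom_total = 0
    fs_total = 0
    for _ in range(n_sims):
        tosses = 0
        while True:
            tosses += 1
            if random.random() < p:   # success (Heads) with probability p
                break
        fs_total += tosses            # trials including the first success
        geom_total += tosses - 1      # failures before the first success
    return geom_total / n_sims, fs_total / n_sims

p = 0.3
geom_mean, fs_mean = sim_means(p)
print(geom_mean, (1 - p) / p)  # both close to 2.33...
print(fs_mean, 1 / p)          # both close to 3.33...
```

Since an FS($p$) variable is just a Geom($p$) variable plus one (the extra trial being the success itself), the two estimated means should always differ by about $1$.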
