[Math] Expected value of a geometric distribution with first step analysis.

expectation, probability, probability-distributions

I am trying to understand the "story proof" found in this lecture.

I am a bit confused as to how the expected value of a random variable differs from the random variable itself when considering indicator functions.

Say there is a geometric distribution. $X$ counts the number of the failures before the first success, and $E(X)$ is the expected number of failures.

Now I want to compute the expected value in terms of $p$, the probability of success, and $q=1-p$, the probability of failure.
Let $c=E(X)$.

Now I do first-step analysis,
$$c=0\cdot p+(1+c)q.$$

In this step, I don't understand the coefficient of $q$. In $(1+c)$, $1$ makes sense but why $c$?
When computing the expected value directly, it is $\sum_k k\,q^k p$, so the coefficient $k$ is the value of the random variable. But in my example it is the expected value $c$ at the next step, which confuses me.

Best Answer

Let $c=E(X)$. We use a conditional expectation argument, conditioning on the result of the first trial.

Either (i) we have a success on the first trial, or (ii) we have a failure on the first trial.

Case (i) has probability $p$.

Case (ii) has probability $q=1-p$. In that case, we have had a failure, and the expected number of additional failures before the first success is $c$. So the conditional expectation of $X$, given that there was a failure on the first trial, is $1+c$.

By the Law of Total Expectation, we therefore have $$E(X)=c=(p)(0)+(q)(1+c).$$ Finally, solve this linear equation for $c$. We get $c(1-q)=q$, and therefore $c=\frac{q}{1-q}=\frac{q}{p}=\frac{1}{p}-1$.
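The first-step identity and its solution $c=q/p$ can be checked numerically. Below is a minimal Monte Carlo sketch (not part of the original answer); the value $p=0.3$ and the trial count are arbitrary illustrative choices.

```python
import random

def failures_before_success(p, rng):
    """Simulate one geometric trial: count failures until the first success."""
    failures = 0
    while rng.random() >= p:  # each draw fails with probability q = 1 - p
        failures += 1
    return failures

def estimate_mean(p, trials=200_000, seed=0):
    """Estimate E(X) by averaging many simulated trials."""
    rng = random.Random(seed)
    total = sum(failures_before_success(p, rng) for _ in range(trials))
    return total / trials

p = 0.3
q = 1 - p
print(estimate_mean(p))  # should be close to q/p = 0.7/0.3 ≈ 2.33
```

The simulated mean agrees with $c=q/p$ to within sampling error, which also confirms the recursion $c = q(1+c)$.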
