Expectation of a conditioned geometric random variable

expected value, probability, random variables, statistics

Let $X$ be the number of independent coin tosses until the first head, where $P(H)=p$. Suppose we have observed that the first coin toss is tails; what is the expectation $E[X-1|X>1]$?

Intuitively this is just equal to $E[X]$, which makes sense to me. I tried to derive this more formally:

$E[X-1|X>1]=\sum_xg(x)p_{X-1|X>1}(x)=\sum_x(x-1)p_{X-1|X>1}(x)$

Since a geometric random variable is memoryless, we have:

$E[X-1|X>1]=\sum_x(x-1)p_X(x)=E[X]-1$

I seem to be off by 1. I suspect it has something to do with the fact that I am still summing over all $x$ (as denoted by $\sum_x$), but I am not quite sure how to fix this.
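For what it's worth, a quick Monte Carlo sketch does seem to back up the intuitive answer (the choice $p = 0.3$ and the sample size here are arbitrary, just for illustration):

```python
import random

# Monte Carlo check of E[X-1 | X > 1] against E[X], where X is the
# number of independent tosses until the first head and P(head) = p.
p, n = 0.3, 200_000  # arbitrary parameter and sample size

def sample_geometric(p):
    """Sample X: count tosses until the first head."""
    tosses = 1
    while random.random() >= p:  # tails with probability 1 - p
        tosses += 1
    return tosses

samples = [sample_geometric(p) for _ in range(n)]
conditioned = [x - 1 for x in samples if x > 1]  # values of X - 1, given X > 1

print("E[X]          ≈", sum(samples) / len(samples))
print("E[X-1 | X>1]  ≈", sum(conditioned) / len(conditioned))
```

Both estimates come out near $1/p \approx 3.33$, so the intuition looks right; the gap must be in my formal derivation.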

Best Answer

It seems to me that the issue here is in applying the law of the unconscious statistician (LOTUS) with a mismatch in the random variable. Namely, conditioned on $X>1$, the random variable $Y := X-1$ has the same distribution as $X$, so indeed $p_Y(k) = p_X(k)$ for all $k\geq 1$. This gives
$$ \mathbb{E}[X-1\mid X>1] = \mathbb{E}[Y] = \sum_{k=1}^\infty k\, p_Y(k) = \sum_{k=1}^\infty k\, p_X(k) = \mathbb{E}[X]\,. $$
However, what you wrote is
$$ \mathbb{E}[X-1\mid X>1] = \sum_{k=1}^\infty (k-1)\,p_Y(k)\,, $$
which is different. The function $g$ you applied the LOTUS to is fine, but you then paired it with $p_{X-1|X>1}$ (i.e., $p_{g(X)|X>1}$) instead of $p_{X|X>1}$: in effect, the transformation $g$ got applied twice, once inside the sum and once in the PMF (this is not very formal as written, but it appears to be the crux of the issue).
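For completeness, the memorylessness claim can be checked directly from the PMF $p_X(k) = (1-p)^{k-1}p$, $k \geq 1$: for every $k \geq 1$,
$$ p_Y(k) = \mathbb{P}(X = k+1 \mid X > 1) = \frac{(1-p)^{k}p}{1-p} = (1-p)^{k-1}p = p_X(k)\,. $$
And applying the LOTUS with the correct PMF $p_{X\mid X>1}$ recovers the same answer without the off-by-one:
$$ \mathbb{E}[X-1\mid X>1] = \sum_{k=2}^\infty (k-1)\,p_{X\mid X>1}(k) = \sum_{k=2}^\infty (k-1)\,\frac{(1-p)^{k-1}p}{1-p} = \sum_{j=1}^\infty j\,(1-p)^{j-1}p = \mathbb{E}[X]\,. $$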