Conditional expectation with multiple random variables

bayes-theorem · conditional-probability · conditional-expectation · probability · probability-theory

I'm trying to understand the solution to a problem from A First Course in Probability (Ross):

There are two misshapen coins in a box; their probabilities for landing on heads when they are flipped are, respectively, .4 and .7. One of the coins is to be randomly chosen and flipped 10 times. Given that two of the first three flips landed on heads, what is the conditional expected number of heads in the 10 flips?

Here is the solution provided in the book:
[the book's solution, included as an image in the original post]

Intuitively, the solution makes sense to me. We want to use the information that two of the first three flips were heads to update our beliefs about which coin was chosen. However, I don't understand what's happening under the hood. IIUC, $N_7$ and $C$ are random variables, but is $T$ a random variable as well, given that we know it occurred? It seems like the underlying formula used by the solution is:

(1) $E[N_7|T] = E[E[N_7|T, C]]$, where the outer expectation is over $C$. Is that correct?

If so, I would have expected (1) to decompose into:

$E[N_7|T] = \sum_c E[N_7|T, C=c]\,p(c)$, but this seems different from what the solution is proposing.

I think I'm overlooking some key properties of expectations/RVs/Bayes' theorem here. Any insight would be appreciated, thanks!

Best Answer

It's the law of total expectation (the conditional-expectation form of the law of total probability): take the partition $\{C_1, C_2\}$; then

$$ \mathbb{E}[N_7|T] = \sum_{i=1}^{2} \mathbb{E}[N_7|T \cap C_i]\, \mathbb{P}[C_i|T]$$
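
Note that the weights are the posterior probabilities $\mathbb{P}[C_i|T]$, not the prior $p(c)$ from the decomposition in the question: because we condition on $T$, the outer expectation over $C$ is taken with respect to the conditional distribution of $C$ given $T$, and that is exactly where the Bayes update enters. Concretely, with each coin equally likely to be chosen,

$$\mathbb{P}[C_1|T] = \frac{\binom{3}{2}(.4)^2(.6)}{\binom{3}{2}(.4)^2(.6) + \binom{3}{2}(.7)^2(.3)} = \frac{.096}{.096 + .147} \approx .395.$$

Here is a minimal numeric check in Python. It assumes $N_7$ counts the heads in the final 7 flips (so the answer to the original problem is $2 + \mathbb{E}[N_7|T]$) and a uniform prior over the two coins; the variable names are just for illustration.

```python
from math import comb

p = [0.4, 0.7]        # P(heads) for each coin, from the problem
prior = [0.5, 0.5]    # each coin equally likely to be chosen

# Likelihood of T (exactly 2 heads in the first 3 flips) under each coin
lik = [comb(3, 2) * q**2 * (1 - q) for q in p]

# Posterior P(C_i | T) by Bayes' theorem
evidence = sum(l * pr for l, pr in zip(lik, prior))
post = [l * pr / evidence for l, pr in zip(lik, prior)]

# Given the coin, the last 7 flips are independent of T, so
# E[N_7 | T ∩ C_i] = 7 * p_i, and the partition formula gives:
e_n7 = sum(7 * q * w for q, w in zip(p, post))

print(post)       # ≈ [0.395, 0.605]
print(2 + e_n7)   # expected total heads ≈ 6.07
```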