The sequence of tosses will result in either $H \ldots HT$ or $T \ldots TH,$ where $H \ldots H$ and $T \ldots T$ are (respectively) a sequence of one or more heads and a sequence of one or more tails.
You can compute the conditional expectation directly (as a weighted sum over the infinitely many possible outcomes), or you can use a method like yours: compute the expected number of heads given that you flipped a sequence of zero or more heads followed by a tail, and use that result to evaluate $E[N] = E[N | \text{first one is a head} ]p + E[N | \text{first one is a tail}](1-p).$
More explicitly, if $E[N']$ is the expected number of heads in a sequence of flips that ends at the first flip that comes up tails, then
$$\begin{eqnarray}
E[N | \text{first one is a head} ] &=& 1 + E[N'], \\
E[N | \text{first one is a tail}] &=& 1.
\end{eqnarray}$$
Edit: For the first equation, if my first flip is heads then the expected total number of heads is the one I just flipped plus the expected additional heads I will flip if I keep flipping until I get tails.
For the second equation, consider the number of $H$s in the sequence $T \ldots TH$ where $T \ldots T$ is a sequence of zero or more $T$s.
Edit: The value $E[N']$ is actually easier to obtain than it might appear from the preceding description. If we say that a coin toss is a "success" if it comes up
tails, then $N'$ is the number of failures before the first success, and for a sequence
of i.i.d. Bernoulli trials (such as our coin tosses) this has a well-known distribution,
the geometric distribution, which has a well-known expected value: the mean number of failures before the first success is $\frac{1-q}{q}$. Since the probability
of "success" is $q = 1 - p$ in each trial, we have $E[N'] = \frac{1-q}{q} = \frac{p}{1-p},$ and hence $E[N | \text{first one is a head}] = 1 + \frac{p}{1-p} = \frac{1}{1-p}.$
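These conditional expectations can be checked with a short Monte Carlo simulation (a sketch; the function name `run`, the choice $p=0.3$, and the sample size are arbitrary illustrations). Per the equations above, $E[N \mid \text{first one is a head}] = 1 + E[N'] = \frac{1}{1-p}$ and $E[N \mid \text{first one is a tail}] = 1$:

```python
import random

def run(p, rng):
    """One experiment: flip a p-biased coin until the outcome differs
    from the first flip.  Returns (first_was_heads, number_of_heads)."""
    first = rng.random() < p      # True = heads
    heads = int(first)
    while True:
        flip = rng.random() < p
        heads += int(flip)
        if flip != first:         # sequence ends at the first "switch"
            return first, heads

rng = random.Random(42)
p = 0.3
results = [run(p, rng) for _ in range(300_000)]

heads_runs = [n for first, n in results if first]       # H...HT runs
tails_runs = [n for first, n in results if not first]   # T...TH runs

print(sum(heads_runs) / len(heads_runs))            # theory: 1/(1-p) ≈ 1.4286
print(sum(tails_runs) / len(tails_runs))            # theory: exactly 1
print(sum(n for _, n in results) / len(results))    # theory: p/(1-p) + (1-p)
```

A $T \ldots TH$ run contains exactly one head (the closing flip), so the second average is $1$ with no sampling error at all.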
For part $a)$,
Let $K_1$ denote the event that coin $1$ is chosen and $K_2$ the event that coin $2$ is chosen.
Let $H$ denote the event that the coin lands on heads on exactly $7$ of the $10$ flips. Then by the law of total probability,
$$P(H)=P(H|K_1)P(K_1)+P(H|K_2)P(K_2)$$
$$P(H)=\dbinom{10}{7}(0.4)^7(0.6)^3(0.5)+\dbinom{10}{7}(0.7)^7(0.3)^3(0.5)$$
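As a numeric sanity check of this expression (a sketch using Python's `math.comb`; the helper name `binom_pmf` is just for illustration):

```python
from math import comb

# Part (a): a fair choice between coin 1 (P(head) = 0.4) and
# coin 2 (P(head) = 0.7), then 10 flips; H = "exactly 7 heads".
def binom_pmf(n, k, p):
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_H = 0.5 * binom_pmf(10, 7, 0.4) + 0.5 * binom_pmf(10, 7, 0.7)
print(p_H)  # ≈ 0.1546
```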
For part $b)$,
When the first flip is heads, exactly $6$ of the remaining $9$ flips must be heads. Note, however, that observing a head on the first flip also changes how likely each coin is, so rather than weighting the two coins equally we compute the conditional probability with the definition below.
Let $T$ denote the event that the first flip is heads.
$$P(H|T)=\dfrac{P(H\cap T)}{P(T)}$$
$P(T)=P(T|K_1)P(K_1)+P(T|K_2)P(K_2)=(0.4)(0.5)+(0.7)(0.5)=0.55$
$P(H\cap T)=P(H\cap T|K_1)P(K_1)+P(H\cap T|K_2)P(K_2)$
$$=(0.4)\dbinom96(0.4)^6(0.6)^3(0.5)+(0.7)\dbinom96(0.7)^6(0.3)^3(0.5)=\dbinom96(0.4)^7(0.6)^3(0.5)+\dbinom96(0.7)^7(0.3)^3(0.5)$$
where each term carries a leading factor for the first flip coming up heads, since $H\cap T$ requires a head on the first flip and $6$ more heads among the remaining $9$ flips.
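A numeric check of part $b)$ under this decomposition (a sketch; the helper name `joint` is illustrative, and each joint term includes the factor for the first flip being heads):

```python
from math import comb

# Part (b): T = "first flip is heads", H = "exactly 7 heads in 10 flips".
# Given a coin with head probability p, the joint event H ∩ T needs a head
# first (factor p) and exactly 6 heads among the remaining 9 flips.
def joint(p):
    return p * comb(9, 6) * p**6 * (1 - p)**3

p_T = 0.5 * 0.4 + 0.5 * 0.7                 # P(T) = 0.55
p_HT = 0.5 * joint(0.4) + 0.5 * joint(0.7)  # P(H ∩ T)
print(p_HT / p_T)  # P(H | T) ≈ 0.197
```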
Using the Chernoff bound (as stated here as (2) for reference):
We want $(1+\gamma)np = \frac{n}{2}$, and since $p=\frac{1}{3}$ this leads to setting $\gamma\stackrel{\rm def}{=} \frac{1}{2}$ in $(\dagger)$. Thus, we have an upper bound on the probability of $$ e^{-\gamma^2 \frac{np}{3}} = e^{-\left(\frac{1}{2}\right)^2 \frac{n}{9}} = e^{-\frac{n}{36}}\,. $$ To make sure the probability is at most $\frac{1}{1000}$, it suffices to choose $n$ such that our upper bound on the probability is less than $\frac{1}{1000}$, i.e. $ e^{-\frac{n}{36}} < \frac{1}{1000} $, or equivalently $$ n > 36\ln 1000 \simeq 248.7\,. $$ Therefore, choosing $\boxed{n = 249}$ is sufficient.
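A quick numeric check of the arithmetic and, since the Chernoff bound is conservative, a comparison with the exact binomial tail (a sketch; the variable names are illustrative):

```python
from math import comb, exp, log

n = 249
bound = exp(-n / 36)          # the Chernoff upper bound e^{-n/36}
print(36 * log(1000))         # ≈ 248.68, so n = 249 is the smallest integer
print(bound)                  # ≈ 0.000991 < 1/1000

# The bound is far from tight: the exact tail P(X >= n/2) for
# X ~ Binomial(n, 1/3) is much smaller.  Since n/2 = 124.5, the
# event X >= n/2 is X >= 125.
k_min = (n + 1) // 2          # = 125
exact = sum(comb(n, k) * (1 / 3)**k * (2 / 3)**(n - k)
            for k in range(k_min, n + 1))
print(exact)
```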