The sequence of tosses will result in either $H \ldots HT$ or $T \ldots TH,$ where $H \ldots H$ and $T \ldots T$ are (respectively) a sequence of one or more heads and a sequence of one or more tails.
You can compute the conditional expectation directly (by a weighted sum of an infinite number of possible outcomes), or you can use a method like yours to compute the expected number of heads under the condition that you flipped a sequence of zero or more heads followed by a tail, and use that result to evaluate $E[N | \text{first one is a head} ]p + E[N | \text{first one is a tail}](1-p).$
More explicitly, if $E[N']$ is the expected number of heads in a sequence of flips that ends at the first flip that comes up tails, then
$$\begin{eqnarray}
E[N | \text{first one is a head} ] &=& 1 + E[N'], \\
E[N | \text{first one is a tail}] &=& 1.
\end{eqnarray}$$
Edit: For the first equation, if my first flip is heads then the expected total number of heads is the one I just flipped plus the expected additional heads I will flip if I keep flipping until I get tails.
For the second equation, consider the number of $H$s in the sequence $T \ldots TH$ where $T \ldots T$ is a sequence of zero or more $T$s.
Edit: The value $E[N']$ is actually easier to obtain than it might appear from the preceding description. If we say that a coin toss is a "success" if it comes up
tails, then $N'$ is the number of failures before the first success, and for a sequence
of i.i.d. Bernoulli trials (such as our coin tosses) this has a well-known distribution,
the geometric distribution, which has a well-known expected value. Since the probability
of "success" is $q = 1 - p$ in each trial, we have $E[N'] = \frac 1q = \frac{1}{1-p}.$
In the following we write $n$ for the number of people instead of the specific value $10$.
The indices $j,k$ are taken modulo $n$ (so expressions like $j\pm1$ are also reduced modulo $n$). The following works for any $n\ge 6$.
Let $X_k$ be the random variable on $\{0,1\}^n$ which is $1$ if the components $k-1,k,k+1$ are all heads, else $0$.
The computation of $\Bbb E X_k = \frac 1{2^3}= \frac 18$ is ok, so
$$\Bbb E X =\Bbb E\sum_k X_k =\sum_k \Bbb E X_k = \sum_k \frac 18 = \frac n8\ .$$
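This first moment is easy to confirm by full enumeration. The sketch below (plain Python; the sizes tried are illustrative) sums $X$ over all $2^n$ head/tail sequences:

```python
from itertools import product

def expected_X(n):
    """E X by brute force: average, over all 2^n sequences, of the number
    of positions k whose neighbors k-1, k, k+1 (mod n) are all heads."""
    total = sum(
        sum(seq[(k - 1) % n] and seq[k] and seq[(k + 1) % n] for k in range(n))
        for seq in product([0, 1], repeat=n)
    )
    return total / 2**n

for n in (6, 7, 8):
    print(n, expected_X(n), n / 8)  # the two values should agree
```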
Now we compute explicitly for some fixed $k$:
$$
\begin{aligned}
\Bbb E X_k^2 &=\frac 1{2^3}\ ,\text{ positions $k-1,k,k+1$ are heads,}\\
\Bbb E X_kX_{k\pm 1} &=\frac 1{2^4}\ ,\text{ positions $k-1,k,k+1$ and also $k\pm2$ are heads,}\\
\Bbb E X_kX_{k\pm 2} &=\frac 1{2^5}\ ,\text{ positions $k-1,k,k+1$ and also $k\pm2,k\pm 3$ are heads,}\\
\Bbb E X_kX_j &=\frac 1{2^6}\ ,\text{ positions $k-1,k,k+1$ and also $j-1,j,j+1$ are heads,}
\end{aligned}
$$
where the index $j$ is not among the neighbors at distance $\le 2$ from $k$.
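These four cases can be confirmed by brute force. The following sketch (plain Python; $n=8$ is an illustrative choice, any $n\ge 6$ gives the same four values) enumerates all $2^n$ sequences and computes $\Bbb E X_kX_j$ for $j = k, k+1, k+2, k+3$:

```python
from itertools import product

n = 8  # illustrative size; any n >= 6 works

def X(c, k):
    # X_k = 1 iff positions k-1, k, k+1 (mod n) are all heads (= 1)
    return int(c[(k - 1) % n] == 1 and c[k % n] == 1 and c[(k + 1) % n] == 1)

def E_XkXj(k, j):
    # uniform weight 1/2^n on each of the 2^n sequences
    return sum(X(c, k) * X(c, j) for c in product([0, 1], repeat=n)) / 2**n

print(E_XkXj(0, 0))  # should be 1/2^3
print(E_XkXj(0, 1))  # should be 1/2^4
print(E_XkXj(0, 2))  # should be 1/2^5
print(E_XkXj(0, 3))  # should be 1/2^6 (distance >= 3)
```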
So
$$
\begin{aligned}
\Bbb EX^2
&=
\Bbb E \sum_{k,j}X_kX_j\\
&=
\sum_k\sum_j\Bbb E X_kX_j\\
&=\sum_k\left(
\frac 1{2^3}
+\frac 1{2^4}+\frac 1{2^4}
+\frac 1{2^5}+\frac 1{2^5}
+(n-5)\frac 1{2^6}
\right)
\\
&=
\sum_k\frac 1{2^6}(8+4+4+2+2+(n-5))
=
\frac {n(n+15)}{64}\ .
\end{aligned}
$$
So the variance of $X$ is
$$
\sigma^2:=
\operatorname{Var}[X]
=\Bbb E[X^2]-\Bbb E[X]^2
=
\frac {n(n+15)}{64}
-
\left(\frac n8\right)^2
=
\frac {15n}{64} \ .
$$
So the standard deviation is $\sigma = \frac{\sqrt{15}}{8}\sqrt{n}$, a fixed constant times $\sqrt n$.
Now we apply Chebyshev's inequality:
$$
\Bbb{P}(\ |X-\Bbb{E}(X)| \geq c \sqrt{n}\ )
=
\Bbb{P}\left(\ |X-\Bbb{E}(X)| \geq c \cdot\frac 8{\sqrt {15}}\sigma\ \right)
\le
\left(\frac {\sqrt{15}}{8c}\right)^2
=\frac {15}{64c^2}
\ .
$$
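The bound can be sanity-checked by exhaustive enumeration for a small $n$; here $n=8$ and $c=1$ are illustrative choices (the exact tail probability must come out below $15/(64c^2)$, since Chebyshev's inequality is a theorem):

```python
from itertools import product
from math import sqrt

# Check P(|X - E X| >= c*sqrt(n)) <= 15/(64 c^2) by full enumeration.
# n = 8 and c = 1 are illustrative choices.
n, c = 8, 1.0

def X_total(seq):
    # X = number of positions k with heads at k-1, k, k+1 (mod n)
    return sum(seq[(k - 1) % n] and seq[k] and seq[(k + 1) % n] for k in range(n))

mean = n / 8
tail = sum(abs(X_total(seq) - mean) >= c * sqrt(n)
           for seq in product([0, 1], repeat=n)) / 2**n
bound = 15 / (64 * c**2)
print(tail, bound)  # the exact tail probability is at most the bound
```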
To be safe, I wanted to verify the above; the following rather simple Sage code confirms the results:
    for n in [6..12]:
        R = [0, 1]
        C = cartesian_product([R for _ in range(n)])
        p = 1/2^n  # weight of each element in the probability space C
        M1 = 0
        M2 = 0
        for c in C:
            count = len([k for k in range(n)
                         if c[k] == 1
                         and c[(k-1) % n] == 1
                         and c[(k+1) % n] == 1])
            M1 += p * count
            M2 += p * count^2
        V = M2 - M1^2
        print("n = %s" % n)
        print("\t1st moment = %s" % M1)
        print("\t2nd moment = %s" % M2)
        print("\tVariance = %s" % V)

Results:

    n = 6
        1st moment = 3/4
        2nd moment = 63/32
        Variance = 45/32
    n = 7
        1st moment = 7/8
        2nd moment = 77/32
        Variance = 105/64
    n = 8
        1st moment = 1
        2nd moment = 23/8
        Variance = 15/8
    n = 9
        1st moment = 9/8
        2nd moment = 27/8
        Variance = 135/64
    n = 10
        1st moment = 5/4
        2nd moment = 125/32
        Variance = 75/32
    n = 11
        1st moment = 11/8
        2nd moment = 143/32
        Variance = 165/64
    n = 12
        1st moment = 3/2
        2nd moment = 81/16
        Variance = 45/16
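As one more consistency check (plain Python with exact rational arithmetic), the variances tabulated above all match the closed form $\frac{15n}{64}$:

```python
from fractions import Fraction

# The variances computed above should all equal 15*n/64.
listed = {
    6: Fraction(45, 32), 7: Fraction(105, 64), 8: Fraction(15, 8),
    9: Fraction(135, 64), 10: Fraction(75, 32), 11: Fraction(165, 64),
    12: Fraction(45, 16),
}
for n, v in listed.items():
    assert v == Fraction(15 * n, 64), n
print("all variances match 15*n/64")
```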
Best Answer
If the number of heads is $\geq n/2$, then there is some subset $A \subset \{1,2,\ldots,n\}$ with $|A| = n/2$ such that $i \in A$ implies flip $i$ was a head.
That is, $$\{\#\text{heads} \geq n/2\} \subseteq \bigcup_{|A| = n/2} \{\forall i \in A, \text{ flip } i \text{ is a head}\}$$
By monotonicity and subadditivity of $P$, we have $$P[\#\text{heads} \geq n/2] \leq \sum_{|A|=n/2}P[\forall i \in A, \text{ flip } i \text{ is a head}] = \sum_{|A|=n/2} p^{n/2} = \binom{n}{n/2}p^{n/2}$$
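A quick numeric check of this bound (plain Python; $n=10$ and $p=0.3$ are illustrative values, with $n$ even so that $n/2$ is an integer) compares the exact binomial tail with $\binom{n}{n/2}p^{n/2}$:

```python
from math import comb

# Check P[#heads >= n/2] <= C(n, n/2) * p^(n/2) for illustrative n, p.
n, p = 10, 0.3

# exact binomial tail: sum over k >= n/2 of C(n,k) p^k (1-p)^(n-k)
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2, n + 1))
bound = comb(n, n // 2) * p**(n // 2)
print(exact, bound)  # the exact tail probability is at most the bound
```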