In the following we write $n$ instead of the concrete value $10$ (the number of people); the argument works for any $n\ge 6$. The indices $j,k$ are taken modulo $n$, so expressions such as $j\pm1$ are also reduced modulo $n$.
Let $X_k$ be the random variable on $\{0,1\}^n$ which is $1$ if the components $k-1,k,k+1$ are all heads, else $0$.
The computation of $\Bbb E X_k = \frac 1{2^3}= \frac 18$ is correct, so
$$\Bbb E X =\Bbb E\sum_k X_k =\sum_k \Bbb E X_k = \sum_k \frac 18 = \frac n8\ .$$
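As a quick brute-force check in plain Python (independent of the Sage verification further down), one can average the count over all $2^n$ sequences; the helper name `expected_runs` is introduced here for illustration:

```python
from itertools import product

def expected_runs(n):
    """Exact E[X]: average, over all 2^n head/tail sequences, of the
    number of positions k whose neighbours k-1, k, k+1 (mod n) are all heads."""
    total = 0
    for c in product([0, 1], repeat=n):
        total += sum(1 for k in range(n)
                     if c[(k - 1) % n] == c[k] == c[(k + 1) % n] == 1)
    return total / 2**n

print([expected_runs(n) for n in (6, 7, 8)])  # matches n/8: 0.75, 0.875, 1.0
```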
Now we compute explicitly for some fixed $k$:
$$
\begin{aligned}
\Bbb E X_k^2 &=\frac 1{2^3}\ ,\text{ positions $k-1,k,k+1$ are heads,}\\
\Bbb E X_kX_{k\pm 1} &=\frac 1{2^4}\ ,\text{ positions $k-1,k,k+1$ and also $k\pm2$ are heads,}\\
\Bbb E X_kX_{k\pm 2} &=\frac 1{2^5}\ ,\text{ positions $k-1,k,k+1$ and also $k\pm2,k\pm 3$ are heads,}\\
\Bbb E X_kX_j &=\frac 1{2^6}\ ,\text{ positions $k-1,k,k+1$ and also $j-1,j,j+1$ are heads,}
\end{aligned}
$$
where the index $j$ is not among the positions at distance $\le 2$ from $k$.
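These four values can be confirmed by brute-force enumeration; the sketch below (with `EXkXj` as a hypothetical helper name) computes $\Bbb E X_kX_j$ exactly for a fixed $n$:

```python
from fractions import Fraction
from itertools import product

def EXkXj(n, k, j):
    """Exact E[X_k X_j], averaging over all 2^n head/tail sequences."""
    def X(c, i):
        # X_i = 1 iff positions i-1, i, i+1 (mod n) are all heads
        return int(c[(i - 1) % n] == c[i % n] == c[(i + 1) % n] == 1)
    total = sum(X(c, k) * X(c, j) for c in product([0, 1], repeat=n))
    return Fraction(total, 2**n)

n = 10
# distances 0, 1, 2, and >= 3 from k = 0
print(EXkXj(n, 0, 0), EXkXj(n, 0, 1), EXkXj(n, 0, 2), EXkXj(n, 0, 4))
# -> 1/8 1/16 1/32 1/64
```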
So
$$
\begin{aligned}
\Bbb EX^2
&=
\Bbb E \sum_{k,j}X_kX_j\\
&=
\sum_k\sum_j\Bbb E X_kX_j\\
&=\sum_k\left(
\frac 1{2^3}
+\frac 1{2^4}+\frac 1{2^4}
+\frac 1{2^5}+\frac 1{2^5}
+(n-5)\frac 1{2^6}
\right)
\\
&=
\sum_k\frac 1{2^6}(8+4+4+2+2+(n-5))
=
\frac {n(n+15)}{64}\ .
\end{aligned}
$$
So the variance of $X$ is
$$
\sigma^2:=
\operatorname{Var}[X]
= \Bbb E[X^2]-(\Bbb E X)^2
=
\frac {n(n+15)}{64}
-
\left(\frac n8\right)^2
=
\frac {15n}{64} \ .
$$
So the standard deviation is $\sigma=\frac{\sqrt{15n}}8$,
a specific constant times $\sqrt n$.
So we apply Chebyshev's inequality:
$$
\Bbb{P}(\ |X-\Bbb{E}(X)| \geq c \sqrt{n}\ )
=
\Bbb{P}\left(\ |X-\Bbb{E}(X)| \geq c \cdot\frac 8{\sqrt {15}}\sigma\ \right)
\le
\left(\frac {\sqrt{15}}{8c}\right)^2
=\frac {15}{64c^2}
\ .
$$
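The bound $\left(\frac{\sqrt{15}}{8c}\right)^2=\frac{15}{64c^2}$ can be sanity-checked by simulation; the values of $n$, $c$ and the trial count below are arbitrary illustrative choices:

```python
import random

# Monte Carlo check of the Chebyshev bound 15/(64 c^2) on
# P(|X - n/8| >= c sqrt(n)); n, c and trials are illustrative choices.
random.seed(0)
n, c, trials = 100, 2.0, 10000
hits = 0
for _ in range(trials):
    coins = [random.randint(0, 1) for _ in range(n)]
    X = sum(1 for k in range(n)
            if coins[(k - 1) % n] == coins[k] == coins[(k + 1) % n] == 1)
    if abs(X - n / 8) >= c * n**0.5:
        hits += 1
print(hits / trials, "<=", 15 / (64 * c**2))
```

As usual with Chebyshev, the empirical frequency is far below the bound.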
To be safe, I wanted to verify the above; the following rather simple Sage code confirms the results:
for n in [6..12]:
    R = [0, 1]
    C = cartesian_product([R for _ in range(n)])
    p = 1/2^n  # weight of each element in the probability space C
    M1 = 0
    M2 = 0
    for c in C:
        count = len([k for k in range(n)
                     if c[k] == 1
                     and c[(k-1) % n] == 1
                     and c[(k+1) % n] == 1])
        M1 += p * count
        M2 += p * count^2
    V = M2 - M1^2
    print("n = %s" % n)
    print("\t1. st moment = %s" % M1)
    print("\t2. nd moment = %s" % M2)
    print("\tVariance = %s" % V)
Results:
n = 6
1. st moment = 3/4
2. nd moment = 63/32
Variance = 45/32
n = 7
1. st moment = 7/8
2. nd moment = 77/32
Variance = 105/64
n = 8
1. st moment = 1
2. nd moment = 23/8
Variance = 15/8
n = 9
1. st moment = 9/8
2. nd moment = 27/8
Variance = 135/64
n = 10
1. st moment = 5/4
2. nd moment = 125/32
Variance = 75/32
n = 11
1. st moment = 11/8
2. nd moment = 143/32
Variance = 165/64
n = 12
1. st moment = 3/2
2. nd moment = 81/16
Variance = 45/16
How is the expectation of the sample variance equivalent to the variance?
Not exactly. Rather, the expectation of the sample variance is equal to the population variance:
$$\mathbb{E}[S^2]=\mathbb{E}\left[\frac{1}{n-1} \sum_i(X_i-\overline{X}_n)^2 \right]=\frac{1}{n-1}\mathbb{E}\left[ \sum_i(X_i-\mu)^2-n(\overline{X}_n-\mu)^2 \right]=$$
$$=\frac{1}{n-1}\left[ \sum_i\mathbb{E}(X_i-\mu)^2-n\,\mathbb{E}(\overline{X}_n-\mu)^2 \right]=\frac{1}{n-1}\left[n\sigma^2-n\,\mathbb{V}[\overline{X}_n]\right]=$$
$$=\frac{1}{n-1}\left[n\sigma^2-n\cdot\frac{\sigma^2}{n}\right]=\sigma^2\ .$$
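This unbiasedness can be illustrated numerically. The sketch below uses Python's `statistics.variance`, which divides by $n-1$; the sample size and repetition count are illustrative choices, and the population here is standard normal, so $\sigma^2=1$:

```python
import random
import statistics

# Average the 1/(n-1) sample variance over many samples: it should be
# close to the population variance sigma^2 = 1 of standard normal draws.
random.seed(1)
n, reps = 5, 20000
mean_S2 = statistics.mean(
    statistics.variance([random.gauss(0, 1) for _ in range(n)])
    for _ in range(reps)
)
print(mean_S2)  # close to 1
```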
Best Answer
Your formulae are correct. Here is a proof.
Let $Y$ denote the result of the coin draw, with heads denoted by $Y=1$ and tails by $Y=0$, and let $U \sim U(a,b)$ and $Z \sim \exp(\lambda)$. Thus, formally:
$$X=\mathbf 1_{Y=1}U+\mathbf 1_{Y=0}Z$$
Thus:
$$E(X)=E(\mathbf 1_{Y=1}U+\mathbf 1_{Y=0}Z)$$
The indicators $\mathbf 1_{Y=1}$ and $\mathbf 1_{Y=0}$ are independent of $U$ and $Z$: the outcome of the draw decides whether you look at $U$ or at $Z$, but the values of these random variables are not affected. Thus, by linearity and independence:
$$E(X)=E(\mathbf 1_{Y=1})E(U)+E(\mathbf 1_{Y=0})E(Z)$$
Since $E(\mathbf 1_{Y=1}) =P(Y=1)=p$ and $E(\mathbf 1_{Y=0})=1-p$, we arrive at your formula. For the variance we can extend this reasoning.
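The identity $E(X)=p\,E(U)+(1-p)\,E(Z)$ can be checked by simulation; the parameter values `p`, `a`, `b`, `lam` below are illustrative assumptions, not taken from the question:

```python
import random

# Simulate the mixture X = 1_{Y=1} U + 1_{Y=0} Z and compare the sample
# mean with p E(U) + (1-p) E(Z); all parameter values are illustrative.
random.seed(2)
p, a, b, lam = 0.3, 0.0, 2.0, 1.5
trials = 200_000
total = 0.0
for _ in range(trials):
    if random.random() < p:      # Y = 1: observe U ~ U(a, b)
        total += random.uniform(a, b)
    else:                        # Y = 0: observe Z ~ Exp(lam)
        total += random.expovariate(lam)
mean_X = total / trials
theory = p * (a + b) / 2 + (1 - p) / lam   # p E(U) + (1-p) E(Z)
print(mean_X, theory)
```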