The underlying stochastic recursion might help.
Let $X_N$ denote the number of tosses needed to get $N$ heads in a row. At the time when one first gets $N$ heads in a row, either one gets a new head, and this yields $N+1$ heads in a row, or one gets a tail and then everything starts anew. Thus, for every $N\geqslant0$, one gets the key-identity
$$\color{red}{X_{N+1}=X_N+1+B\bar X_{N+1},\quad B\sim\text{Bernoulli},\quad \bar X_{N+1}\sim X_{N+1}},$$
where $(B,\bar X_{N+1},X_N)$ is independent, $P(B=0)=P(B=1)=\frac12$, $\bar X_{N+1}$ is distributed like $X_{N+1}$, and the correct initialization is $\color{red}{X_0=0}$.
This stochastic recursion fully encodes the distribution of every $X_N$ and allows one to compute their characteristics recursively.
1. Expectations

Taking expectations of both sides of the key-identity, one gets
$$
E(X_{N+1})=E(X_N)+1+\tfrac12E(X_{N+1}),
$$
hence
$$
E(X_{N+1})=2E(X_N)+2.
$$
This is solved easily since $$E(X_{N+1})+2=2(E(X_N)+2),
$$
hence
$$
E(X_N)=2^{N}(E(X_0)+2)-2=2\cdot(2^N-1).$$
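As a quick sanity check of this closed form, one can simulate the experiment directly. This is a sketch in Python; the helper name `tosses_until_run` is just for illustration.

```python
import random

def tosses_until_run(N, rng):
    """Flip a fair coin until N consecutive heads appear; return the flip count."""
    run = tosses = 0
    while run < N:
        tosses += 1
        run = run + 1 if rng.random() < 0.5 else 0
    return tosses

rng = random.Random(0)
trials = 200_000
mean = sum(tosses_until_run(5, rng) for _ in range(trials)) / trials
print(mean)  # should be close to E(X_5) = 2 * (2**5 - 1) = 62
```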
2. Variances

The same representation yields the variances since
$$
X_{N+1}-E(X_{N+1})=X_N-E(X_N)+B\bar X_{N+1}-\tfrac12E(X_{N+1}),$$
hence
$$
\mathrm{var}(X_{N+1})=\mathrm{var}(X_N)+E(Z_N),
$$
where, since $B^2=B$,
$$
Z_N=B\bar X_{N+1}^2-BE(X_{N+1})\bar X_{N+1}+\tfrac14E(X_{N+1})^2,
$$
hence
$$
E(Z_N)=\tfrac12E(X_{N+1}^2)-\tfrac12E(X_{N+1})^2+\tfrac14E(X_{N+1})^2=\tfrac12E(X_{N+1}^2)-\tfrac14E(X_{N+1})^2,
$$
and, substituting $E(X_{N+1}^2)=\mathrm{var}(X_{N+1})+E(X_{N+1})^2$ and solving for $\mathrm{var}(X_{N+1})$,
$$
\mathrm{var}(X_{N+1})=2\mathrm{var}(X_N)+\tfrac12E(X_{N+1})^2,
$$
from which one gets
$$
\mathrm{var}(X_N)=2\cdot(2\cdot2^{2N}-(2N+1)\cdot2^N-1),
$$
as one can check against $\mathrm{var}(X_1)=2$ (the geometric case) and $\mathrm{var}(X_2)=22$.
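Iterating the variance recursion with exact rational arithmetic confirms the closed form. The sketch below assumes the formula $\mathrm{var}(X_N)=2\,(2\cdot2^{2N}-(2N+1)\cdot2^N-1)$, which reproduces $\mathrm{var}(X_1)=2$ and $\mathrm{var}(X_2)=22$.

```python
from fractions import Fraction

# var(X_{N+1}) = 2 var(X_N) + (1/2) E(X_{N+1})^2, with var(X_0) = 0.
v = Fraction(0)
for N in range(12):
    # Compare with the closed form 2 * (2 * 2^(2N) - (2N+1) * 2^N - 1).
    assert v == 2 * (2 * 4**N - (2 * N + 1) * 2**N - 1)
    m_next = 2 * (2**(N + 1) - 1)            # E(X_{N+1}) from part 1
    v = 2 * v + Fraction(1, 2) * m_next**2   # apply the recursion
print(v)  # var(X_12)
```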
3. Full distributions

The key-identity also yields the full distribution of every $X_N$ since, for every $|s|\leqslant1$,
$$
E(s^{X_{N+1}})=E(s^{X_N})\cdot s\cdot E(s^{B\bar X_{N+1}}),
$$
that is,
$$
E(s^{X_{N+1}})=E(s^{X_N})\cdot s\cdot \tfrac12(1+E(s^{X_{N+1}})),
$$
hence
$$
E(s^{X_{N+1}})=\frac{s\cdot E(s^{X_N})}{2-s\cdot E(s^{X_N})}.
$$
This can be rewritten as
$$
\frac1{E(s^{X_{N+1}})}-\frac{s}{2-s}=\frac2s\left(\frac1{E(s^{X_{N}})}-\frac{s}{2-s}\right).
$$
Furthermore, $E(s^{X_0})=1$. Finally,
$$
E(s^{X_N})=\frac{(2-s)s^N}{2^{N+1}(1-s)+s^{N+1}}.
$$
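The closed form for the generating function can likewise be checked against the recursion $E(s^{X_{N+1}})=sE(s^{X_N})/(2-sE(s^{X_N}))$ using exact rational arithmetic. A sketch; the function names are illustrative.

```python
from fractions import Fraction

def pgf_recursive(N, s):
    """E[s^{X_N}] via G_{N+1} = s*G_N / (2 - s*G_N), starting from G_0 = 1."""
    g = Fraction(1)
    for _ in range(N):
        g = s * g / (2 - s * g)
    return g

def pgf_closed(N, s):
    """Closed form (2 - s) * s^N / (2^{N+1} * (1 - s) + s^{N+1})."""
    return (2 - s) * s**N / (2**(N + 1) * (1 - s) + s**(N + 1))

s = Fraction(1, 3)
for N in range(8):
    assert pgf_recursive(N, s) == pgf_closed(N, s)
print(pgf_closed(3, Fraction(1, 2)))  # 1/43
```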
From this point, it is relatively straightforward to show that, for every $t\geqslant0$,
$$
\lim_{N\to\infty}E(\mathrm e^{-tX_N/2^N})=\frac1{1+2t},
$$
which shows that $2^{-N}X_N$ converges in distribution to an exponential random variable of parameter $\frac12$, in symbols,
$$
\color{red}{\frac{X_N}{2^N}\stackrel{\text{dist.}}{\longrightarrow}2\cdot\Xi},\qquad\color{red}{\Xi\sim\mathcal E(1)}.
$$
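Numerically, this convergence of the Laplace transform is easy to watch by plugging $s=\mathrm e^{-t/2^N}$ into the closed form for $E(s^{X_N})$. A sketch:

```python
import math

def pgf(N, s):
    """Closed-form E[s^{X_N}] = (2 - s) s^N / (2^{N+1} (1 - s) + s^{N+1})."""
    return (2 - s) * s**N / (2**(N + 1) * (1 - s) + s**(N + 1))

# E[exp(-t X_N / 2^N)] = pgf(N, exp(-t / 2^N)) should approach 1 / (1 + 2t).
t = 1.0
for N in (5, 10, 20):
    print(N, pgf(N, math.exp(-t / 2**N)), 1 / (1 + 2 * t))
```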
And, to fully complete this circle of ideas, note that $\Theta=2\cdot\Xi$ solves the identity
$$
\Theta\stackrel{\text{dist.}}{=}\tfrac12\cdot\Theta+B\cdot\bar\Theta,
$$
with the obvious notations, and that the nondegenerate solutions of this identity are the exponential distributions.
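A small simulation illustrates the fixed-point identity: with $\Theta=2\,\Xi$ for $\Xi\sim\mathcal E(1)$, the first two moments of $\tfrac12\Theta+B\bar\Theta$ match those of $\Theta$. A Monte Carlo sketch with arbitrarily chosen sample sizes:

```python
import random

rng = random.Random(42)
n = 200_000

# Theta ~ 2 * Exp(1) (equivalently, exponential with rate 1/2): mean 2, variance 4.
theta = [2 * rng.expovariate(1.0) for _ in range(n)]
theta_bar = [2 * rng.expovariate(1.0) for _ in range(n)]

# Right-hand side of the identity: (1/2) * Theta + B * Theta_bar, B a fair Bernoulli.
mixed = [0.5 * t + (tb if rng.random() < 0.5 else 0.0)
         for t, tb in zip(theta, theta_bar)]

mean_t = sum(theta) / n
mean_m = sum(mixed) / n
m2_t = sum(x * x for x in theta) / n
m2_m = sum(x * x for x in mixed) / n
print(mean_t, mean_m)  # both close to 2
print(m2_t, m2_m)      # both close to 8
```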
I would welcome additional explanation of the above; meanwhile, I'll post here another approach.
Denote by $\tau_i^5$ the random variable that counts the number of flips needed to get five consecutive heads, starting from a current streak of $i$ heads.
What we want is exactly $E[\tau_0^5]$.
Now, you can evaluate $E[\tau_0^5]$ by conditioning on the first flip:
$$
E[\tau_0^5] = \frac{E[\tau_0^5]}{2} + \frac{E[\tau_1^5]}{2} +1
$$
Explaining the equation above: with probability $1/2$ the first flip is a tail, so the expected time to get five heads starts over, because you are back to having no heads. With the same probability you get a head, and the expected number of additional flips needed is then $E[\tau_1^5]$, because you now have one head. The $+1$ is because the first flip itself has to be counted.
Repeating the argument above you get
$$
E[\tau_1^5] = \frac{E[\tau_0^5]}{2} + \frac{E[\tau_2^5]}{2} +1
$$
Proceeding this way, and remembering that $E[\tau_5^5]=0$, you get
$$
E[\tau_0^5] = 62
$$
This may seem more complicated at first sight, but the idea of conditioning on what happens at the first step (or move) solves a wide variety of problems.
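The system of equations for the $E[\tau_i^5]$ can be solved mechanically by writing each $E[\tau_i^5]=a_i+b_i\,E[\tau_0^5]$ and back-substituting from $E[\tau_5^5]=0$. A sketch; the function name is illustrative.

```python
from fractions import Fraction

def expected_time_to_run(N):
    """Solve E[tau_i] = 1 + E[tau_0]/2 + E[tau_{i+1}]/2 with E[tau_N] = 0,
    by writing E[tau_i] = a + b * E[tau_0] and back-substituting."""
    a, b = Fraction(0), Fraction(0)          # start from E[tau_N] = 0
    for _ in range(N):                       # i = N-1 down to 0
        a, b = 1 + a / 2, Fraction(1, 2) + b / 2
    return a / (1 - b)                       # solve E[tau_0] = a + b * E[tau_0]

print(expected_time_to_run(5))  # 62
```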
Best Answer
On any one flip, the probability that all three coins come up the same is $\frac{2}{8}=\frac14$ and the probability that they don't is $\frac34$. Hence, the probability that the three coins come up the same for the first time at exactly the $n$-th flip is ${\left(\frac34\right)}^{n-1}\frac14$.
The expectation is therefore given by
$$\sum_{n\geq 1}n{\left(\frac34\right)}^{n-1}\frac14=\frac14\cdot\sum_{n\geq 1}n{\left(\frac34\right)}^{n-1}.$$
Now, consider $f(x)=\sum_{n\geq0}x^n=\frac{1}{1-x}$, $|x|<1$. We have that
$$f'(x)=\sum_{n\geq1}nx^{n-1}=\frac{1}{{(1-x)}^2}.$$
It follows that the expectation equals
$$\frac14\cdot f'\left(\frac34\right)=\frac14\cdot\frac{1}{{\left(\frac14\right)}^2}=\frac14\cdot16=4$$
This was checking the whole thing via definitions. But you could have skipped it all by noting that the chance of "success" at each flip is $1/4$, so the waiting time is geometric and the expected value, naturally, is $4$.
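A direct simulation agrees; this is a sketch, and the helper name is illustrative.

```python
import random

def flips_until_all_same(rng):
    """Flip three fair coins repeatedly until all three match; return the flip count."""
    n = 0
    while True:
        n += 1
        coins = [rng.random() < 0.5 for _ in range(3)]
        if coins[0] == coins[1] == coins[2]:
            return n

rng = random.Random(1)
trials = 200_000
mean = sum(flips_until_all_same(rng) for _ in range(trials)) / trials
print(mean)  # should be close to 4
```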