There's a nice discussion of Penney's game in Section 8.4 of Concrete Mathematics. Using the techniques described there, the answers I get (confirming joriki's) are
(a) $(1 + p + p^2)q$, which is $0.657$ when $p = 0.7$, $q = 0.3$.
(b) $\frac{1-pq}{2-p}$, which is $\frac{79}{130} \approx 0.608$ when $p = 0.7, q = 0.3$.
(c) $\frac{p}{1+pq}$, which is $\frac{70}{121} \approx 0.579$ when $p = 0.7, q = 0.3$.
I'll work through part (b) to show you how the techniques in Concrete Mathematics work.
Suppose Player A chooses HTH and Player B chooses THH. Let $S_A$ be the sum of the winning configurations for Player A, so that
$$S_A = \text{HTH + HHTH + THTH + HHHTH + TTHTH} + \cdots$$
Similarly, the sum of the winning configurations for Player B is
$$S_B = \text{THH + TTHH + HTTHH + TTTHH} + \cdots$$
One advantage of doing this is that if we let $H = 0.7$ and $T = 0.3$ in these two equations, then $S_A$ and $S_B$ give the probabilities that Player A and Player B win, respectively.
Then, let $N$ denote the sum of the sequences in which neither player has won so far:
$$N = 1 + \text{H + T + HH + HT + TH + TT + HHH + HHT + HTT + THT + TTH + TTT} + \cdots$$
Now, we look for a set of equations relating $S_A, S_B,$ and $N$.
First, we can write the sum of all configurations in two different ways, so they must be equal:
$$1 + N(\text{H + T}) = N + S_A + S_B.$$
Adding HTH to any configuration in $N$ results in a win for $A$, a win for $A$ followed by TH, or a win for $B$ followed by TH, so
$$N \text{ HTH} = S_A + S_A \text{ TH} + S_B \text{ TH}.$$
Finally, adding THH to a configuration in $N$ results in a win for $A$ followed by an H or a win for $B$,
so we have $$N \text{ THH} = S_A \text{ H} + S_B.$$
Letting $H = p$ and $T = q$ and solving the last three equations, I get $S_A = \frac{1-pq}{2-p}$ and $S_B = \frac{1-p+pq}{2-p}$. With $p = 0.7$ and $q = 0.3$, this yields $S_A = \frac{79}{130} \approx 0.608$ and $S_B = \frac{51}{130} \approx 0.392$.
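As a sanity check, the closed form for part (b) can be verified numerically by tracking the probability distribution over the last two tosses (a small Markov chain) and accumulating the probability that each player's pattern completes first. This is only a sketch; the function name and state encoding are my own.

```python
from collections import defaultdict

def penney_win_probs(pat_a, pat_b, p, steps=400):
    """Probability that pat_a (resp. pat_b) appears first in i.i.d.
    tosses with P(H) = p, for distinct three-letter patterns, by
    iterating the distribution over the last two tosses."""
    q = 1 - p
    dist = {"": 1.0}                      # fewer than two tosses so far
    win_a = win_b = 0.0
    for _ in range(steps):                # leftover mass decays geometrically
        new = defaultdict(float)
        for suffix, prob in dist.items():
            for c, pc in (("H", p), ("T", q)):
                s = suffix + c
                if s == pat_a:
                    win_a += prob * pc    # A's pattern just completed
                elif s == pat_b:
                    win_b += prob * pc    # B's pattern just completed
                else:
                    new[s[-2:]] += prob * pc
        dist = new
    return win_a, win_b
```

With $p = 0.7$, `penney_win_probs("HTH", "THH", 0.7)` agrees with $\frac{79}{130} \approx 0.608$ and $\frac{51}{130} \approx 0.392$ to numerical precision.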
For another example of the use of this technique, see a question related to two competing patterns in coin tossing.
Consider a sequence of $n$ coin tosses.
Let $X_i = 1$ if tosses numbered $i$ through $i+2$ are HTT,
$X_i = 0$ otherwise.
Then $E(X_i) = \frac18.$
Moreover, although $X_i$ and $X_{i+1}$ are not independent, the expectation of their sum is still the sum of their expectations.
We can extend this to show that
$$E(X_1 + \cdots + X_{n-2}) = E(X_1) + \cdots + E(X_{n-2}) = \frac{n-2}{8}.$$
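The linearity computation is easy to confirm by brute force for small $n$: average the number of (possibly overlapping) occurrences of HTT over all $2^n$ equally likely sequences. A quick sketch; the function name is mine.

```python
from itertools import product

def expected_pattern_count(pattern, n):
    """Average number of (possibly overlapping) occurrences of `pattern`
    over all 2**n equally likely H/T sequences of length n."""
    k = len(pattern)
    total = 0
    for tosses in product("HT", repeat=n):
        s = "".join(tosses)
        total += sum(s[i:i + k] == pattern for i in range(n - k + 1))
    return total / 2 ** n
```

For example, `expected_pattern_count("HTT", 10)` returns $\frac{10-2}{8} = 1.0$ exactly, and the same holds for any three-letter pattern, overlapping or not.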
In the limit as $n\to\infty$, a fraction $\frac18$ of the three-toss windows are HTT, and each new coin toss adds one three-toss window to the list, so the mean waiting time between occurrences of HTT is $8$ coin tosses.
The expected waiting time between HTTs is $8$ even though, after observing HTT, the next two coin tosses have zero probability of completing another HTT. Because HTT cannot overlap itself, the process after an occurrence looks exactly like the process at the start, so the expected waiting time after no coin tosses is also $8.$
To make this argument formal we might need to get into something like your Markov chain. But as long as we consider it "obvious" that the mean waiting time exists (perhaps because we know we can turn this into a Markov chain), I think it is also "obvious" that the mean is $8$ when there have been at least two previous coin tosses.
Note that the argument presented above does not apply to every three-letter sequence of H and T, but only to those sequences $\sigma$ for which the statement "after observing $\sigma,$ the next two coin tosses have zero probability of producing another $\sigma$" is true.
In other cases, the argument must be modified to take into account
the possibility of overlapping observations of the sequence.
For example, after observing HTH there is a $\frac14$ probability of observing HTH again after exactly $2$ more flips. The expected waiting time between repetitions of HTH is therefore $8 = \frac14\cdot2 + \frac34 W_{HTH},$ where $W_{HTH}$ is the expected waiting time for HTH starting from no previous coin tosses, and therefore $W_{HTH} = 10.$
For HHH, there is a $\frac12$ chance to observe the sequence again after just $1$ toss, and a $\frac12$ chance that the next toss will be T, after which the expected waiting time for the next HHH is $W_{HHH},$ the same as the expected waiting time with no previous tosses, so the expected time between repetitions is $8 = \frac12\cdot1 + \frac12(1+W_{HHH}),$
from which we get $W_{HHH} = 14.$
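The values $W_{HTT} = 8$, $W_{HTH} = 10$, and $W_{HHH} = 14$ can be checked by computing $E(T) = \sum_{t \ge 0} P(T > t)$ over a chain whose state is the longest suffix of the tosses so far that is also a prefix of the target pattern. A sketch under my own naming and truncation tolerance:

```python
def mean_wait(pattern, tol=1e-13):
    """Expected number of fair-coin tosses until `pattern` first appears,
    via E(T) = sum over t >= 0 of P(T > t)."""
    dist = {"": 1.0}            # probability mass by current suffix state
    expected = 0.0
    alive = 1.0
    while alive > tol:
        expected += alive       # adds P(T > t) for the current t
        new = {}
        for suffix, prob in dist.items():
            for c in "HT":
                s = suffix + c
                if s == pattern:
                    continue    # pattern completed: absorbed
                while not pattern.startswith(s):
                    s = s[1:]   # fall back to the longest suffix that is a prefix
                new[s] = new.get(s, 0.0) + prob * 0.5
        dist = new
        alive = sum(dist.values())
    return expected
```

`mean_wait("HTT")`, `mean_wait("HTH")`, and `mean_wait("HHH")` come out within numerical tolerance of $8$, $10$, and $14$, matching the argument above.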
Best Answer
I think you are just missing what the book means by a "run". E.g.
HHHHHHHHHHHHTTTTTTHHHHHTTTTTT
has 4 runs: a run of heads, followed by a run of tails, followed by a run of heads, followed by a run of tails. All the H's or T's that are next to each other get lumped together to form a single "run", regardless of how many there are. Thus HHH and TTT have 1 run, HTT, HHT, THH, TTH have 2 runs, and HTH and THT have 3 runs.
Thus $P(X=1) = 2/8, P(X=2) = 4/8, P(X=3) = 2/8$ is the distribution of $X$.
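This distribution can be confirmed by counting runs in each of the eight equally likely three-toss sequences (a quick sketch):

```python
from itertools import product

def count_runs(seq):
    # A new run starts at each position where the toss differs from the previous one.
    return 1 + sum(a != b for a, b in zip(seq, seq[1:]))

# Tally the number of runs over all 8 equally likely three-toss sequences.
tally = {}
for seq in product("HT", repeat=3):
    r = count_runs(seq)
    tally[r] = tally.get(r, 0) + 1
```

`tally` comes out to `{1: 2, 2: 4, 3: 2}`, i.e. $P(X=1)=\frac28$, $P(X=2)=\frac48$, $P(X=3)=\frac28$.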