I couldn't follow your derivation, but I believe the correct way to solve the problem is as follows.
First, some notation: take $x_i$ to be the expected number of flips to reach the end state ($n$ consecutive heads or $n$ consecutive tails), given that we're on a "run" of exactly $i$ consecutive flips of heads. Similarly, take $y_i$ to be the expected number of flips to reach the end state, given that we're on a run of exactly $i$ consecutive flips of tails. Clearly, $x_n = y_n = 0$, since if we've already had $n$ consecutive heads or tails, we're done.
Then, we're interested in finding $x_0$ ($= y_0$). The relevant equations are (for $i < n$):
\begin{equation}
\begin{split}
x_i & = (1-p)(1 + y_1) + p(1 + x_{i+1}) \\
y_i & = p(1 + x_1) + (1-p)(1 + y_{i+1})
\end{split}
\end{equation}
Let's derive the first equation (the second one is analogous). Suppose we're on a run of $i$ heads, and we want the expected number of flips to end the game, which is $x_i$. Consider the next flip. With probability $p$ it comes up heads, giving us a run of $i+1$ heads. That's the second term on the right-hand side of the equation: with probability $p$ we move from the state with $i$ consecutive heads to the state with $i+1$ consecutive heads, and it costs 1 flip to do so. With probability $1-p$ it comes up tails, in which case we have a run of only 1 tail. That's the first term on the right-hand side: with probability $1-p$ we move from the state with $i$ consecutive heads to the state with 1 consecutive tail, again at a cost of 1 flip.
With these equations you can solve the problem for any $n$. I tried implementing this in Mathematica, and if $p = 0$ or $p = 1$, I indeed get that $x_0 = y_0 = n$, as expected. I'm not sure if there's any closed form solution to the equations for arbitrary $n$; that's probably a question for the Math StackExchange. (Edit: see derivation below for the solution for arbitrary $n$).
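For readers without Mathematica, the same computation can be sketched in Python (a rough equivalent of what I describe, not the original code; the function name `expected_flips` and the ordering of the unknowns are my own choices):

```python
import numpy as np

def expected_flips(n, p):
    """Expected flips until a run of n heads or n tails, head probability p.

    Solves the 2(n-1) linear equations for the unknowns, ordered
    [x_1, ..., x_{n-1}, y_1, ..., y_{n-1}]; x_n = y_n = 0 implicitly.
    """
    m = n - 1
    A = np.zeros((2 * m, 2 * m))
    b = np.ones(2 * m)
    for i in range(1, n):            # x_i = 1 + p*x_{i+1} + (1-p)*y_1
        row = i - 1
        A[row, row] = 1.0
        if i + 1 < n:                # x_n = 0, so drop that term for i = n-1
            A[row, i] -= p
        A[row, m] -= 1 - p           # coefficient of y_1
    for i in range(1, n):            # y_i = 1 + (1-p)*y_{i+1} + p*x_1
        row = m + i - 1
        A[row, row] = 1.0
        if i + 1 < n:
            A[row, m + i] -= 1 - p
        A[row, 0] -= p               # coefficient of x_1
    v = np.linalg.solve(A, b)
    x1, y1 = v[0], v[m]
    return 1 + p * x1 + (1 - p) * y1  # the i = 0 equation gives x_0
```

For $p=0$ or $p=1$ this returns $n$, as expected, and for a fair coin it reproduces the classic values (e.g. 3 flips for $n=2$).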
What I've sketched out here is a general approach for solving many types of coin flipping problems (and other problems). It essentially constructs a Markov chain that has states (e.g., 3 consecutive heads), and transition probabilities for moving between states (e.g., a probability of $1-p$ of moving from 3 consecutive heads to 1 tail). The absorbing states are the end states of the game. See, for instance, the answer to this question.
EDIT: For the particular case of $n=5$, I get (using Mathematica):
\begin{equation}
x_0 = \frac{(5 - 10p + 10p^2 - 5p^3 + p^4)(1 + p + p^2 + p^3 + p^4)}{1 - 4p + 6p^2 - 4p^3 + p^4 + 4p^5 - 6p^6 + 4p^7 - p^8}
\end{equation}
which seems to have the right limits (although it's possible I have a typo somewhere).
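One quick check beyond the limits: for a fair coin ($p=1/2$), solving the recurrence directly gives 31 expected flips to see 5 identical tosses in a row, and the expression above does reduce to exactly that. A small sketch evaluating it with exact rational arithmetic:

```python
from fractions import Fraction

# Evaluate the n = 5 closed form at p = 1/2 exactly.
p = Fraction(1, 2)
num = (5 - 10*p + 10*p**2 - 5*p**3 + p**4) * (1 + p + p**2 + p**3 + p**4)
den = (1 - 4*p + 6*p**2 - 4*p**3 + p**4
       + 4*p**5 - 6*p**6 + 4*p**7 - p**8)
x0 = num / den
print(x0)  # 31
```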
Edit #2: Following @whuber's advice, I solved the recurrence relation for general $n$. The solution is:
\begin{equation}
\begin{bmatrix}
x_0 \\ y_0
\end{bmatrix}
=
(A + B)[(I-A^{n-1})^{-1}(I-A)-B]^{-1}c + c
\end{equation}
where
\begin{equation}
A =
\begin{bmatrix}
p & 0 \\
0 & 1-p
\end{bmatrix}
\end{equation}
and
\begin{equation}
B =
\begin{bmatrix}
0 & 1-p \\
p & 0
\end{bmatrix}
\end{equation}
and
\begin{equation}
c =
\begin{bmatrix}
1 \\ 1
\end{bmatrix}
\end{equation}
and $I$ is the identity matrix.
For $n=5$, this gives the same result as above.
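The matrix expression is easy to evaluate numerically; here is a hedged NumPy sketch (the function name `x0_matrix` is mine, and NumPy stands in for the original Mathematica):

```python
import numpy as np

def x0_matrix(n, p):
    """Evaluate (A + B) [ (I - A^(n-1))^(-1) (I - A) - B ]^(-1) c + c."""
    A = np.array([[p, 0.0], [0.0, 1.0 - p]])
    B = np.array([[0.0, 1.0 - p], [p, 0.0]])
    c = np.array([1.0, 1.0])
    I = np.eye(2)
    M = np.linalg.inv(I - np.linalg.matrix_power(A, n - 1)) @ (I - A) - B
    v0 = (A + B) @ np.linalg.solve(M, c) + c
    return v0[0]                     # x_0 (v0[1] is y_0, and x_0 = y_0)
```

Note that the expression is singular at $p=0$ and $p=1$ (where $I-A^{n-1}$ is not invertible), so those limits have to be taken analytically; for a fair coin and $n=5$ it returns 31, matching the formula above.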
In your solution, you calculate the probability of getting tails twice as the square of the probability of getting tails on the first flip. This assumes that consecutive flips are independent. In fact, they are not: the same coin is used throughout, so getting tails on the first throw makes tails more likely on subsequent throws, because a first tail is evidence that you are holding the coin that is more likely to land tails. Your result is the correct solution for a case where the coin is re-selected after the first flip. Instead, for the case where the same coin is used repeatedly, define:
$$H_2: \text{event that the coin comes up heads two times in a row.}$$
You can calculate the probability of this event as follows, where $C_1$ is the event that the first (fair) coin was selected and $C_2$ the event that the second coin was:
$$
\begin{align}
P(H_2) &= P(H_2 \cap C_1) + P(H_2 \cap C_2) \\
&= P(H_2 \mid C_1)P(C_1) + P(H_2 \mid C_2)P(C_2) \\
&= 0.5^2 \times 0.5 + 0.1^2 \times 0.5 \\
&= 0.13
\end{align}
$$
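The arithmetic checks out, and the same marginalization shows the dependence directly: two tails in a row is strictly more likely than the square of the single-tail probability. A small Python sketch (the tails probabilities $0.5$ and $0.9$ follow from the heads probabilities used above):

```python
# Two coins, each selected with probability 1/2, then flipped repeatedly:
# coin 1 with P(heads) = 0.5 and coin 2 with P(heads) = 0.1.
p_c1 = p_c2 = 0.5
h1, h2 = 0.5, 0.1

# P(H_2): heads twice in a row, marginalizing over the selected coin.
p_h2 = h1**2 * p_c1 + h2**2 * p_c2   # 0.13, as computed above

# Dependence: P(two tails) > P(one tail)^2, because a first tail is
# evidence that we hold the tails-heavy coin.
t1, t2 = 1 - h1, 1 - h2
p_t1 = t1 * p_c1 + t2 * p_c2         # 0.7
p_t2 = t1**2 * p_c1 + t2**2 * p_c2   # 0.53 > 0.49 = 0.7**2
```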
Best Answer
For a fixed value of $n$, I'm not sure that a simpler formula exists. However, for $n=\infty$, the probability that there will be more heads than tails permanently is simply equal to $2p-1$. We can show this as follows.
Let $x_k$ denote the probability of success conditional on the current state being $k$ heads. We have the simple recurrence $x_k=p x_{k+1}+(1-p)x_{k-1}$, which we can rewrite as $$x_k=\frac{1}{p}x_{k-1} - \frac{1-p}{p}x_{k-2}.$$ Solutions to this recurrence have the form $x_k = a_1\lambda_1^k+a_2\lambda_2^k$, where the $\lambda_i$ are roots of the characteristic polynomial $z^2-\frac{1}{p}z+\frac{1-p}{p}$: \begin{align} \lambda_1=1, \qquad \lambda_2=\frac{1-p}{p}. \end{align} We also have the constraints that $x_0=0$ and, for $p>1/2$, $\lim_{k\rightarrow \infty}x_k=1$ (see below). It follows that the only possible solution is $$x_k=1-\left(\frac{1-p}{p}\right)^k.$$ Since we start with $0$ heads, in order to survive forever we must toss a head on the first toss and then survive from the state of one head. The probability is therefore $$px_1=p\left(1-\frac{1-p}{p}\right)=p-(1-p)=2p-1.$$
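The closed form is easy to sanity-check numerically: it satisfies the recurrence, the boundary condition $x_0=0$, and gives $p\,x_1 = 2p-1$. A minimal sketch (the choice $p=0.7$ is arbitrary; any $p>1/2$ works):

```python
p = 0.7                          # arbitrary p > 1/2
r = (1 - p) / p

def x(k):
    """Candidate solution x_k = 1 - ((1-p)/p)^k."""
    return 1 - r**k

assert x(0) == 0                 # boundary condition
for k in range(1, 20):           # x_k = p*x_{k+1} + (1-p)*x_{k-1}
    assert abs(x(k) - (p * x(k + 1) + (1 - p) * x(k - 1))) < 1e-12
print(p * x(1))                  # 2p - 1 = 0.4, up to rounding
```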
Now the answer to the question "how do we know that $\lim_{k\rightarrow \infty}x_k=1$?" There are two ways to go about this. The first is to start from the Borel-Cantelli lemma as in this answer. But there is also a direct proof.
Let $x_{k,n}$ denote the probability that, starting from a state of $k$ heads, we survive at least $n$ more rounds. Note that $x_{k,0}=1$ for $k>0$, and $x_{0,0}=0$. Note also that the $x_{k,n}$ must satisfy $$x_{k,n}=p x_{k+1,n-1}+(1-p)x_{k-1,n-1}.$$ Since our values of $x_k$ computed above satisfy this same recurrence, by monotonicity we may deduce that $$\left(x_{k,n-1}\geq x_k ~\forall k\right)\implies \left(x_{k,n}\geq x_k ~\forall k\right),$$ and since this inequality holds for $n=0$, by induction it must hold for all $n$. From this it follows that $\lim_{n\rightarrow\infty}x_{k,n}\geq x_k$ for all $k$, which in turn implies that $\lim_{k\rightarrow \infty}\lim_{n\rightarrow\infty}x_{k,n}=1$. This in turn implies that $\lim_{n\rightarrow\infty}x_{k,n}=x_k$ for all $k$, as deduced above.
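The induction above can also be watched numerically: iterating the recurrence for $x_{k,n}$ from the base case converges to $1-\left(\frac{1-p}{p}\right)^k$. A sketch (truncating the state space at $K=200$, where survival is essentially certain, is my own device, not part of the proof):

```python
p = 0.6
r = (1 - p) / p
K = 200                                   # truncation: treat x_{K,n} as 1

xn = [0.0] + [1.0] * K                    # base case x_{k,0}
for _ in range(2000):                     # x_{k,n} = p*x_{k+1,n-1} + (1-p)*x_{k-1,n-1}
    xn = ([0.0]
          + [p * xn[k + 1] + (1 - p) * xn[k - 1] for k in range(1, K)]
          + [1.0])

for k in (1, 2, 5, 10):
    print(k, xn[k], 1 - r**k)             # the two columns agree closely
```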