This actually has nothing to do with Bayes' theorem. Let the probability that the $n$th traffic light is red be $a_n$. Then we have $a_1 = a$. To obtain a recursive relation for $a_n$, we have two cases: either the previous light is red (probability $a_{n-1}$, and then probability $p$ that the $n$th light is also red), or the previous light is green (probability $1-a_{n-1}$, and then probability $1-p$ that the $n$th light changes to red). This yields $$a_n = pa_{n-1} + (1-p)(1-a_{n-1}).$$
We now have a recursive relation which we can use to solve for $a_3$ by working up from $a_1$.
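As a quick numerical sanity check, the recursion can be iterated directly. This is only a minimal sketch; the parameter values $a = 0.9$ and $p = 0.7$ used below are hypothetical placeholders, not part of the problem.

```python
def red_prob(n, a, p):
    """Probability a_n that the n-th light is red, via
    a_n = p*a_{n-1} + (1-p)*(1-a_{n-1}) with a_1 = a."""
    an = a
    for _ in range(n - 1):
        an = p * an + (1 - p) * (1 - an)
    return an

# Hypothetical parameter values, purely for illustration.
print(red_prob(3, a=0.9, p=0.7))  # a_3 for these parameters
```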
Note: your problem is an example of what's called a Markov chain; searching for that term will turn up much more on the topic.
The flaw in your second answer is that it is not necessarily the same die that is thrown after each coin toss. The way you have written the law of total probability is acceptable in your first answer, but not in the second, because conditioning on the events $A$ and $B$ (the outcome of a single coin toss) fixes one die for the entire experiment. Consequently, the result is incorrect because it corresponds to a model in which the coin is tossed once, and then the corresponding die is rolled three times.
First, let us do the calculation the proper way. We want $$\Pr[R_3 \mid R_1, R_2] = \frac{\Pr[R_1, R_2, R_3]}{\Pr[R_1, R_2]}$$ as you wrote above. Now we must condition on all possible outcomes of the coin tosses, of which there are eight:
$$\begin{align}
\Pr[R_1, R_2, R_3] &= \Pr[R_1, R_2, R_3 \mid A_1, A_2, A_3]\Pr[A_1, A_2, A_3] \\ &+ \Pr[R_1, R_2, R_3 \mid A_1, A_2, B_3]\Pr[A_1, A_2, B_3] \\
&+ \Pr[R_1, R_2, R_3 \mid A_1, B_2, A_3]\Pr[A_1, B_2, A_3] \\ &+ \Pr[R_1, R_2, R_3 \mid A_1, B_2, B_3]\Pr[A_1, B_2, B_3] \\
&+\Pr[R_1, R_2, R_3 \mid B_1, A_2, A_3]\Pr[B_1, A_2, A_3] \\ &+ \Pr[R_1, R_2, R_3 \mid B_1, A_2, B_3]\Pr[B_1, A_2, B_3] \\
&+\Pr[R_1, R_2, R_3 \mid B_1, B_2, A_3]\Pr[B_1, B_2, A_3] \\ &+ \Pr[R_1, R_2, R_3 \mid B_1, B_2, B_3]\Pr[B_1, B_2, B_3] \\
\end{align}$$
and since each of the $2^3 = 8$ triplets of ordered coin tosses has equal probability of $1/8$ of occurring,
$$\Pr[R_1, R_2, R_3] = \tfrac{1}{8}\left((\tfrac{2}{3})^3 + 3(\tfrac{2}{3})^2(\tfrac{1}{3}) + 3(\tfrac{2}{3})(\tfrac{1}{3})^2 + (\tfrac{1}{3})^3\right)
= \tfrac{1}{8}.$$ A similar (but simpler) calculation for the denominator yields $\Pr[R_1, R_2] = 1/4$, so $\Pr[R_3 \mid R_1, R_2] = \frac{1/8}{1/4} = \frac{1}{2}$.
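The eight-term sum also lends itself to a brute-force check. The sketch below enumerates all coin-toss sequences, using $\Pr[\text{red} \mid A] = 2/3$ and $\Pr[\text{red} \mid B] = 1/3$ as in the calculation above (the function name is mine).

```python
from itertools import product

# Probability of rolling red with each die, as in the calculation above.
p_red = {"A": 2 / 3, "B": 1 / 3}

def prob_all_red(k):
    """Pr[R_1, ..., R_k], conditioning on all 2^k coin-toss sequences."""
    total = 0.0
    for dice in product("AB", repeat=k):
        p_coins = 0.5 ** k            # each toss sequence has probability 1/2^k
        p_reds = 1.0
        for d in dice:
            p_reds *= p_red[d]        # rolls are independent given the dice
        total += p_coins * p_reds
    return total

num = prob_all_red(3)      # 1/8
den = prob_all_red(2)      # 1/4
print(num / den)           # Pr[R_3 | R_1, R_2] = 1/2
```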
Of course, none of this is necessary; it is only shown here to illustrate how the calculation would be done if it were to be done along the lines of your second answer.
Perhaps you are familiar with the notion of a probability tree, but just in case you aren't (or for anyone else reading this), I will include a brief description. If we have several random variables and know something about their conditional probabilities, we can compute probabilities of events with a tree. In this case, we have three random variables: $C_1$, the color of the first light; $C_2$, the color of the second; and $C_3$, the color of the third. We make a probability tree with four levels (where the initial level is just a trivial node, the root node). Since $C_1$ can be either red or green, there are two branches from the root node to the nodes $R_1$ and $G_1$ on the second level. Traveling down a branch means we are realizing a particular color of the first light.
From each of those nodes, we draw two further branches to two successor nodes ($R_2$ and $G_2$). Traveling down one of those branches means realizing a color of the second light. Similarly, each node on the third level has two successors ($R_3$ and $G_3$) on the fourth level, corresponding to the two possible colors of the third light.
Each possible outcome corresponds to a path from the root node to one of the terminal nodes. The path from the top to the bottom through $R_1, R_2, R_3$ corresponds to the outcome $R_1\cap R_2\cap R_3$. Between the first and second levels of the tree, we put the probabilities of the two events $R_1$ and $G_1$ on the branches. The branch to $R_1$ gets probability $a$ (this is given), and the complement $G_1$ gets $1-a$.
For the other levels of the tree, we label the branch with the conditional probability of the lower node given the upper node. For example, the branch from $R_1$ to $R_2$ is labeled with $P(R_2|R_1)=p$.
For a particular outcome (say $R_1\cap R_2\cap R_3$), we take the path from the top of the tree to the bottom through the nodes $R_1, R_2, R_3$ and multiply the numbers on the branches of that path (which are $a, p, p$). Thus $P(R_1\cap R_2\cap R_3)=ap^2$.
Now in our tree, four of the nodes at the bottom are $G_3$, so we need to find the probability of each and add. These four nodes at the bottom labeled $G_3$ correspond to the four mutually exclusive events $R_1\cap R_2\cap G_3$, $R_1\cap G_2\cap G_3$, $G_1\cap R_2\cap G_3$, $G_1\cap G_2\cap G_3$ (and each event corresponds to the nodes used in a path from the top to the bottom for that event). That is, the first of the four nodes labeled $G_3$ is the end result of the path through the nodes $R_1, R_2$, and $G_3$, and so it corresponds to the outcome $R_1\cap R_2\cap G_3$.
So what we should do is for each of the four nodes labeled $G_3$, take the path from the top to that node and multiply the weights on those branches in the path. For the first node going through the nodes $R_1, R_2, G_3$, the weights on the edges down that path are $a, p, 1-p$, and so $P(R_1\cap R_2\cap G_3)=ap(1-p)$. Do the same thing for the other three nodes labeled $G_3$ and add all the results to get $P(G_3)$.
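The path-multiplication rule is easy to mechanize: enumerate the eight root-to-leaf paths, weight each branch as described, and sum the weights of the paths ending in $G_3$. A minimal sketch (the function name is mine; the model, as in the first answer, is that a light keeps the previous light's color with probability $p$):

```python
from itertools import product

def prob_third_green(a, p):
    """P(G_3): sum the weights of the four root-to-leaf paths ending in green."""
    total = 0.0
    for colors in product("RG", repeat=3):
        # First branch: a for red, 1 - a for green.
        w = a if colors[0] == "R" else 1 - a
        # Later branches: p if the color repeats, 1 - p if it changes.
        for prev, cur in zip(colors, colors[1:]):
            w *= p if cur == prev else 1 - p
        if colors[2] == "G":
            total += w
    return total
```

As a consistency check, this agrees with $1 - a_3$, where $a_3$ comes from the recursion in the first answer.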