What you are suggesting is an iterative refinement algorithm for finding the root of a polynomial. The concept is that we find the remainder, or simply the value of $p(a)$, and then wish to adjust $a$ so that the result is closer to the true root.
First, I will point out that knowing the value of $p(a)$ alone is insufficient, since it does not give any information about which direction, or by how much, you should adjust your estimate of the root. There are several techniques for deducing this; the two simplest are:
Newton's method (requires calculus): Compute the derivative (slope) of $p$ at $a$, notated $p'(a)$. The resulting formula for the next estimate is given by $a-[p(a)/p'(a)]$.
Secant method (no calculus): Add another point $(b,p(b))$ and estimate the derivative using the slope between this point and $(a,p(a))$. The formula is the same as Newton's method except $[p(b)-p(a)]/(b-a)$ is used instead of $p'(a)$.
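For concreteness, here is a quick Python sketch of both updates (my own illustration, using $p(x)=x^2-2$ as an example polynomial, not anything from the question):

```python
# Illustration: Newton and secant iterations for the example polynomial
# p(x) = x^2 - 2, whose positive root is sqrt(2).
def p(x):
    return x * x - 2.0

def dp(x):
    return 2.0 * x  # the derivative p'(x)

def newton(a, steps=20):
    # a <- a - p(a)/p'(a)
    for _ in range(steps):
        a -= p(a) / dp(a)
    return a

def secant(a, b, steps=30):
    # replace p'(a) with the slope [p(b) - p(a)] / (b - a)
    for _ in range(steps):
        if b == a or p(b) == p(a):
            break  # iterates have merged; nothing left to do
        slope = (p(b) - p(a)) / (b - a)
        a, b = a - p(a) / slope, a
    return a
```

Both converge to $\sqrt2\approx 1.41421$ from reasonable starting points, Newton's method quadratically and the secant method slightly more slowly.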
Intuitively, you can understand your method as approximating $p'(a)$ by $1$, so how well it works near the root depends on how close the real derivative is to $1$. For example, if the slope at the root is negative, your method actually steps away from the root. In fact, a sufficient condition for convergence of your method is $0<p'(a)<2$ at the root.
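To see the $0<p'<2$ condition in action, here is a small sketch (my example, not from the question) of the update $a \leftarrow a - p(a)$: for $p(x)=x^2-2$ the derivative at the root is $2\sqrt2>2$ and the iteration fails, but rescaling to $q(x)=(x^2-2)/2$, where $q'(\sqrt2)=\sqrt2\in(0,2)$, makes it converge.

```python
def naive(p, a, steps=60):
    # the questioner's update: a <- a - p(a), i.e. p'(a) approximated by 1
    for _ in range(steps):
        a -= p(a)
    return a

ROOT = 2 ** 0.5

# q'(x) = x, so q'(ROOT) = sqrt(2) lies in (0, 2): the iteration converges.
q = lambda x: (x * x - 2.0) / 2.0
assert abs(naive(q, 1.0) - ROOT) < 1e-9

# For p(x) = x^2 - 2 itself, p'(ROOT) = 2*sqrt(2) > 2, and the same
# update oscillates away from the root instead of settling on it.
```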
Additional remark: none of the methods above should be expected to converge from every starting point, even when a real root exists. To guarantee convergence you either have to start close to the root (sometimes very close) or use a bracketing method, such as bisection.
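A minimal bisection sketch (again my own illustration), which converges whenever the initial interval brackets a sign change:

```python
def bisect(p, lo, hi, steps=60):
    # requires p(lo) and p(hi) to have opposite signs
    assert p(lo) * p(hi) < 0
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        if p(lo) * p(mid) <= 0:
            hi = mid  # sign change is in [lo, mid]
        else:
            lo = mid  # sign change is in [mid, hi]
    return (lo + hi) / 2.0

# for example, the real root of x^3 - x - 2 lies between 1 and 2
r = bisect(lambda x: x**3 - x - 2.0, 1.0, 2.0)
```

Each step halves the bracket, so 60 steps pin the root down to about $2^{-60}$ of the original interval.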
Consider, for instance, that you have a degree-7 polynomial $f$, and you want to evaluate it at every 8th root of unity (primitive and non-primitive).
Naively, this takes 28 multiplications and 7 additions per evaluation, for a total of 280 operations.
However, we can do much better. Split $f$ into a sum
$$
f(x)=g_e(x^2)+xg_o(x^2)
$$
for two degree-3 polynomials $g_e$ and $g_o$. Now if we can evaluate $g_e(x^2)$ and $xg_o(x^2)$ for just the first four $8$th roots ($1$, together with the three that have positive imaginary part), then because $g_e(x^2)$ is even and $xg_o(x^2)$ is odd, evaluating $f$ at the remaining four roots comes basically for free.
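As a sanity check, here is a short Python sketch (with arbitrary example coefficients of my own choosing) verifying that one evaluation of $g_e(x^2)$ and $xg_o(x^2)$ yields both $f(x)$ and $f(-x)$:

```python
import cmath

coeffs = [3, 1, 4, 1, 5, 9, 2, 6]   # a_0 .. a_7, an arbitrary example
g_e, g_o = coeffs[0::2], coeffs[1::2]

def horner(c, x):
    # evaluate c[0] + c[1]*x + c[2]*x^2 + ... by Horner's method
    acc = 0
    for a in reversed(c):
        acc = acc * x + a
    return acc

for k in range(4):                  # the first four 8th roots of unity
    x = cmath.exp(2j * cmath.pi * k / 8)
    even = horner(g_e, x * x)       # g_e(x^2): even in x
    odd = x * horner(g_o, x * x)    # x*g_o(x^2): odd in x
    assert cmath.isclose(even + odd, horner(coeffs, x), abs_tol=1e-9)
    # f at the opposite root -x, for the cost of one subtraction:
    assert cmath.isclose(even - odd, horner(coeffs, -x), abs_tol=1e-9)
```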
In total (including the squaring $x^2$), with naive polynomial evaluation, this gives 15 multiplications and 7 additions for each of the first four roots, and just a single subtraction for each of the other four: a total of 92 operations rather than 280. However, we aren't done with the optimisations.
Now for the really cool part: evaluating $g_e(x^2)$ and $g_o(x^2)$ at the first four eighth roots amounts to evaluating $g_e$ and $g_o$ at all the fourth roots of unity. So exactly the same trick can be applied again! It turns out that the paragraph above is very pessimistic; it takes far fewer than 15 multiplications and 7 additions for each of the first four eighth roots.
Even if you are clever from the start about how you evaluate $f$ directly, say by reusing your calculated $x^4$ when you find $x^5$ (the reduction from 28+7 to 15+7 operations is of this nature), the savings are considerable, since each level of recursion essentially halves the number of points at which you need to do more than a subtraction.
Consider, for instance, how this method would calculate its last point, $f(e^{7\pi i/4})$. It would be found as $g_e(e^{14\pi i/4})+e^{7\pi i/4}g_o(e^{14\pi i/4})$. However, we have $e^{14\pi i/4}=e^{6\pi i/4}$, which means that because we previously calculated $$f(e^{3\pi i/4}) = g_e(e^{6\pi i/4}) + e^{3\pi i/4}g_o(e^{6 \pi i/4})$$ we already have almost everything we need, and get $$f(e^{7\pi i/4}) = g_e(e^{6\pi i/4}) - e^{3\pi i/4}g_o(e^{6 \pi i/4})$$This is what I mean by $f(e^{7\pi i/4})$ costing only a single subtraction.
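Applying the halving recursively all the way down gives the familiar fast Fourier transform. Here is a compact Python sketch (my own code, not part of the argument above), for any power-of-two number of coefficients:

```python
import cmath

def eval_at_roots(coeffs):
    """Evaluate the polynomial with coefficients a_0..a_{n-1} at all n-th
    roots of unity e^{2*pi*i*k/n}, where n = len(coeffs) is a power of two."""
    n = len(coeffs)
    if n == 1:
        return coeffs[:]
    even = eval_at_roots(coeffs[0::2])  # g_e at the (n/2)-th roots
    odd = eval_at_roots(coeffs[1::2])   # g_o at the (n/2)-th roots
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]  # the "free" subtraction
    return out
```

For $n=8$ this reproduces the eight values discussed above, with $O(n\log n)$ operations overall rather than the $O(n^2)$ of repeated direct evaluation.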
Best Answer
Split your polynomial into odd and even parts.
The even part is $$f_E(x) = a_0 + a_2x^2 + a_4x^4 + \cdots$$ which you can consider a polynomial in $x^2$ and evaluate in $n/2$ steps of Horner's method.
The odd part is $$ f_O(x) = a_1x + a_3x^3+a_5x^5 + \cdots = x(a_1+a_3x^2 + a_5x^4+\cdots) $$ where the bracket on the right is again a polynomial in $x^2$ that you can evaluate in $n/2$ steps. Thereafter it takes only a single multiplication to multiply by $x$.
Now $f(x)=f_E(x)+f_O(x)$ and $f(-x)=f_E(x)-f_O(x)$.
The total additional work, compared with evaluating just $f(x)$ directly, is one multiplication to form $x^2$, one multiplication for the single $x$ factor in $f_O$, and one addition and one subtraction to combine everything at the end, no matter what the degree.
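In code, the whole scheme is only a few lines; here is a sketch (my own illustration of the method, with example coefficients):

```python
def horner(c, x):
    # evaluate c[0] + c[1]*x + c[2]*x^2 + ... by Horner's method
    acc = 0
    for a in reversed(c):
        acc = acc * x + a
    return acc

def f_pair(coeffs, x):
    """Return (f(x), f(-x)) using two half-length Horner evaluations."""
    x2 = x * x                          # one extra multiplication
    f_e = horner(coeffs[0::2], x2)      # even part, n/2 Horner steps
    f_o = x * horner(coeffs[1::2], x2)  # odd part, n/2 steps plus one mult
    return f_e + f_o, f_e - f_o         # one addition, one subtraction

coeffs = [2, -7, 1, 8, -3, 5]           # arbitrary example
y_pos, y_neg = f_pair(coeffs, 1.5)
assert abs(y_pos - horner(coeffs, 1.5)) < 1e-9
assert abs(y_neg - horner(coeffs, -1.5)) < 1e-9
```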