I presume the restriction on division applies to choosing a starting point as well?
Take $f(x) = \frac1x - R$, whose root is the reciprocal we want. The function $f$ is strictly decreasing on the domain $(0,\infty)$, with $\lim_{x \downarrow 0} f(x) = \infty$ and $\lim_{x\to \infty} f(x) = -R$; hence there exists a unique $x^*$ such that $f(x^*) = 0$, namely $x^* = \frac1R$. A plot of $f$ for, say, $R=10$ makes this behavior plain.
Moreover, $f$ is strictly convex on its domain, which means, in this case, that if an iterate $x_n$ satisfies $x_n < x^*$, then it is easy to show that $x_n < x_{n+1} \leq x^*$. Conversely, if $x_n > x^*$ and $x_{n+1}$ lies in the domain of $f$, then $x_{n+1} \leq x^*$.
So the only real restriction on the starting point is to ensure that $x_1 \in (0, \infty)$, so that subsequent iterates are well defined. In this case $x_1 = x_0(2-R x_0)$, so for $x_0 > 0$ we get $x_1>0$ iff $x_0 < \frac{2}{R}$.
The answer, then, is that Newton's method converges iff you start with $0 < x_0 < \frac{2}{R}$; equivalently, since division is not allowed, choose $x_0>0$ such that $R x_0 < 2$.
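For concreteness, here is a minimal Python sketch of the division-free scheme (the value of $R$ and the starting points are arbitrary illustrative choices):

```python
# Division-free Newton iteration for 1/R:  x_{n+1} = x_n (2 - R x_n)
R = 10.0

def reciprocal_newton(x0, steps=10):
    x = x0
    for _ in range(steps):
        x = x * (2.0 - R * x)   # no division anywhere
    return x

print(reciprocal_newton(0.01))  # 0 < x0 < 2/R: converges to 1/R = 0.1
print(reciprocal_newton(0.25))  # x0 > 2/R: x_1 < 0 and the iterates blow up
```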
It is instructive to look at the Newton iteration itself. In this case, $\phi(x) = x(2-Rx)$ defines the iteration scheme (i.e., $x_{n+1} = \phi(x_n)$). We know the solution is a fixed point of $\phi$, that is, $\phi(\frac{1}{R}) = \frac{1}{R}$, which gives $\phi(x) - \phi(\frac{1}{R}) = - R (x-\frac{1}{R})^2$. So we have $|\phi(x) - \phi(\frac{1}{R})| = (R |x-\frac{1}{R}|)\, |x-\frac{1}{R}|$.
This is a contraction whenever $x \in (0, \frac{2}{R})$. Moreover, as $x$ gets closer to $\frac{1}{R}$, the error (i.e., the distance between $x_n$ and the solution $\frac{1}{R}$) drops with the square of the previous error (ignoring the factor $R$ for simplicity), which gives Newton's method its so-called quadratic convergence rate.
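You can watch this happen numerically: with $e_n = |x_n - \frac1R|$, the identity above says $e_{n+1} = R\,e_n^2$, so the number of correct digits roughly doubles each step. A short sketch (division is used only to measure the error, never inside the iteration):

```python
R = 10.0
x = 0.08                        # any start in (0, 2/R)
for n in range(6):
    print(n, abs(x - 1.0 / R))  # e_n; satisfies e_{n+1} = R * e_n^2
    x = x * (2.0 - R * x)
```

The printed errors run $2\cdot10^{-2},\ 4\cdot10^{-3},\ 1.6\cdot10^{-4},\ 2.56\cdot10^{-7},\dots$ until machine precision is reached.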
In practice, you may draw a rough graph of the function to see approximately where the roots are. This gives you, for each root, an approximate value from which to start the iteration.
The first root obviously is $1$. The second is around $3.25$ and the third around $6.75$. The convergence will be fast starting from these values.
In fact, solving analytically gives the roots exactly: the first is $1$, the second $5-\sqrt3 = 3.267949\ldots$, and the third $5+\sqrt3 = 6.732051\ldots$
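To make this concrete, note the answer does not restate the polynomial; the monic cubic with exactly these roots is $(x-1)\bigl((x-5)^2-3\bigr) = x^3-11x^2+32x-22$, reconstructed here purely for illustration (helper names hypothetical):

```python
def newton(f, df, x, steps=8):
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

f  = lambda x: x**3 - 11*x**2 + 32*x - 22   # roots 1, 5 - sqrt(3), 5 + sqrt(3)
df = lambda x: 3*x**2 - 22*x + 32

for guess in (1.1, 3.25, 6.75):             # rough values read off the graph
    print(guess, "->", newton(f, df, guess))
```

Each rough guess lands on the nearby root after only a handful of steps.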
Best Answer
For a polynomial with real coefficients, the Newton iteration starting from real points will always stay on the real axis, since all the arithmetic operations stay in the reals.
For polynomial functions the Newton iteration contracts toward the region containing the roots. There are counterexamples with oscillating iterations, but in general the iteration will converge from a majority of complex initial points. It helps to start close to a root, since far away from the roots the convergence is only linear, with contraction factor $\approx 1-\frac1{\deg(f)}$.
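To see that linear rate: for $|z|$ large, $f(z)\approx a_n z^n$, so the Newton step is $z_+ \approx z - \frac{z}{n} = \left(1-\frac1n\right)z$. A quick sketch with an arbitrary cubic (the factor should come out $\approx\frac23$):

```python
f  = lambda x: x**3 - 2*x - 5   # arbitrary cubic, deg(f) = 3
df = lambda x: 3*x**2 - 2

x = 1.0e6                       # start far from all roots
for _ in range(5):
    x_new = x - f(x) / df(x)
    print(x_new / x)            # ≈ 2/3 = 1 - 1/deg(f)
    x = x_new
```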
See also Newton fractals, which illustrate the chaotic relation between starting points and the roots the iteration converges to.
If you instead try to solve the equivalent equation $0=g(z)=e^{iεz}f(z)$, then since $g'(z) = e^{iεz}\bigl(f'(z)+iεf(z)\bigr)$, the Newton step becomes $$ z_+=z-\frac{g(z)}{g'(z)}=z-\frac{f(z)}{f'(z)+iεf(z)} $$ for some small real $ε$. Iterations starting close to a real solution will still converge to that real solution; however, real starting points far from a solution may also converge towards complex roots.
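Here is a minimal sketch of that modified step for $f(z)=z^3-1$ (roots $1$ and $e^{\pm 2\pi i/3}$); the function, $ε$, and the starting points are illustrative choices, not from the original:

```python
f  = lambda z: z**3 - 1
df = lambda z: 3*z**2
eps = 0.1                        # small real perturbation

def modified_newton(z, steps=30):
    for _ in range(steps):
        z = z - f(z) / (df(z) + 1j * eps * f(z))
    return z

print(modified_newton(1.2 + 0.0j))   # near the real root: converges to 1
print(modified_newton(-2.0 + 0.0j))  # far-away real start: leaves the real
                                     # axis and may be captured by a complex root
```

Note that once $f(z)$ is small the perturbation term $iεf(z)$ vanishes too, so near a root the scheme behaves like plain Newton.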