This question centers on the following theorem.
Let $I \subseteq \mathbb{R}$ be an open interval and let $g : I \rightarrow \mathbb{R}$ be differentiable with a continuous derivative $g' : I \rightarrow \mathbb{R}$. Suppose $g$ has a fixed point $\xi \in I$ and $|g'(\xi)|<1$. Then there exists a closed bounded interval $J \subset I$, such that $\xi \in J$, $g(J) \subseteq J$ and $g_{|J} : J \rightarrow J$ is a contraction, i.e., there exists $L \in [0,1)$ such that
$$\forall x, y \in J \: : \: |g(x) - g(y)| \leq L |x-y|.$$
Moreover, the fixed point iteration
$$ x_{n+1} = g(x_n)$$
converges to $\xi$ regardless of the choice of starting point $x_0 \in J$.
The central point is the existence of the interval $J$ and of the (Lipschitz) constant $L < 1$. We will delay the proof for a moment and instead focus on the convergence of the fixed point iteration. The delightful property that $g$ maps $J$ into itself ensures that the iteration is well defined for every $x_0 \in J$. Moreover, using $g(\xi) = \xi$, we have
$$ |\xi - x_{n+1}| = |g(\xi) - g(x_n)| \leq L |\xi - x_n|.$$
By induction on $n$ we have
$$ |\xi - x_n| \leq L^n |\xi - x_0|.$$
Since $L < 1$, convergence follows immediately.
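Both the iteration and the a priori bound $|\xi - x_n| \leq L^n |\xi - x_0|$ are easy to check numerically. Here is a minimal sketch, using an example of my own choosing (not from the question): $g(x) = \cos x$, whose fixed point is the Dottie number $\xi \approx 0.739085$ with $|g'(\xi)| = \sin\xi \approx 0.674 < 1$. On $J = [0.5,\, 0.88]$ one can check $g(J) \subseteq J$ and $L = \sup_J |g'| = \sin(0.88) \approx 0.771 < 1$:

```python
import math

# Assumed example: g(x) = cos(x).  Its fixed point is the Dottie number
# xi ~ 0.739085, and |g'(xi)| = sin(xi) ~ 0.674 < 1.  On J = [0.5, 0.88]
# we have g(J) = [cos(0.88), cos(0.5)] subset of J, and
# L = sup |g'(z)| over J = sin(0.88) ~ 0.771 < 1, so g|_J is a contraction.
g = math.cos

xi = 0.7390851332151607      # fixed point of cos, to double precision
L = math.sin(0.88)           # sup of |g'(z)| = sin(z) over J = [0.5, 0.88]

x = 0.5                      # any starting point x0 in J works
e0 = abs(xi - x)
for n in range(1, 21):
    x = g(x)
    # the a priori error bound |xi - x_n| <= L^n |xi - x_0| from the proof
    assert abs(xi - x) <= L ** n * e0 + 1e-15

print(abs(xi - x))           # geometric decay: roughly 1e-4 after 20 steps
```

The assertion inside the loop is exactly the induction step of the proof, checked at every $n$.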
Now for the construction of $J$ and the selection of $L$. By assumption, we have $|g'(\xi)| < 1$. Let $\epsilon>0$ be given by
$$ \epsilon = \frac{1 - |g'(\xi)|}{2}.$$
By the continuity of $g'$ there exists a $\delta > 0$ such that
$$ \forall z \in I \: : \: |\xi - z| < \delta \: \Rightarrow \: |g'(\xi) - g'(z)| < \epsilon.$$
This property implies $|g'(z)| < 1$ for every such $z$. Why? By the triangle inequality we have
$$ |g'(z)| \leq |g'(\xi)| + |g'(\xi) - g'(z)| < |g'(\xi)| + \frac{1-|g'(\xi)|}{2} = \frac{1+|g'(\xi)|}{2} < 1.$$
Since $I$ is an open interval we can without loss of generality assume that $(\xi - \delta, \xi + \delta) \subseteq I$. If necessary, we simply reduce $\delta$. We now define $$J = \left[\xi - \frac{\delta}{2}, \xi + \frac{\delta}{2}\right].$$
By design $J$ is a closed and bounded interval containing $\xi$, and the estimate above gives the uniform bound $|g'(z)| \leq \frac{1+|g'(\xi)|}{2} < 1$ for all $z \in J$. This uniform bound implies that
$$ L = \sup \{ |g'(z)| \: : \: z \in J \} \leq \frac{1+|g'(\xi)|}{2} < 1.$$
At this juncture the mean value theorem finally makes its appearance. Given $x, y \in J$, there exists at least one $z$ between $x$ and $y$, such that
$$ g(x) - g(y) = g'(z)(x-y).$$
Since $z \in J$, we have $|g'(z)| \leq L$, and hence $|g(x) - g(y)| \leq L|x-y|$. Moreover, $g$ maps $J$ into itself: for any $x \in J$,
$$ |g(x) - \xi| = |g(x) - g(\xi)| \leq L |x - \xi| \leq L \frac{\delta}{2} < \frac{\delta}{2},$$
so $g(x) \in J$. This completes the analysis.
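The contraction estimate can also be verified directly on a concrete example. The following sketch uses an assumed example of mine, $g(x) = (x^2+2)/3$, which has the fixed point $\xi = 1$ with $g'(\xi) = 2/3 < 1$; on $J = [0.5,\, 1.4]$ we have $g(J) = [0.75,\, 1.32] \subseteq J$ and $L = \sup_J |g'| = 2 \cdot 1.4 / 3 < 1$:

```python
# Assumed example: g(x) = (x^2 + 2)/3 has the fixed point xi = 1 with
# g'(xi) = 2/3 < 1.  On J = [0.5, 1.4], g is increasing with
# g(J) = [0.75, 1.32] subset of J, and |g'(z)| = 2z/3 attains its
# supremum L = 2 * 1.4 / 3 at the right endpoint.
def g(x):
    return (x * x + 2.0) / 3.0

def dg(x):
    return 2.0 * x / 3.0

a, b = 0.5, 1.4
pts = [a + (b - a) * k / 200 for k in range(201)]

L = max(abs(dg(z)) for z in pts)   # attained at z = b since |g'| is increasing
assert L < 1

# the mean value theorem estimate: |g(x) - g(y)| <= L |x - y| on J
for x in pts[::10]:
    for y in pts[::10]:
        assert abs(g(x) - g(y)) <= L * abs(x - y) + 1e-12

print(L)                           # just under 1
```

Here the supremum is attained at an endpoint because $|g'|$ is monotone on $J$; in general the grid maximum is only an approximation of $L$ from below.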
Higher order Taylor expansions enter into the picture when higher order derivatives of
$g$ exist and vanish at the fixed point, but this is perhaps a subject for another question.
Best Answer
Start with the original Newton's method recurrence:
$$x_{n+1}=x_n - \frac{f(x_n)}{f'(x_n)} $$
and write $x_n=r+e_n$, where $r$ is the root, i.e., $f(r)=0$, and we assume $e_n$ is small in the sense that we may perform Taylor expansions, viz.
$$\begin{align}e_{n+1} &= e_n - \frac{f(r)+e_n f'(r) + \frac12 e_n^2 f''(r)+\cdots}{f'(r)+e_n f''(r)+\frac12 e_n^2 f'''(r)+\cdots}\\ &= e_n \left (1-\frac{f'(r)+\frac12 e_n f''(r)+\cdots}{f'(r) + e_nf''(r)+\cdots} \right ) \\ &= \frac{f''(r)}{2 f'(r)} e_n^2 + O \left (e_n^3 \right )\end{align}$$
Of course we assumed that $f'(r) \ne 0$ in the above. I leave it to the reader to derive expressions for such a case and other odd cases.
This is of course the well-known result of quadratic convergence of Newton's method (so long as you made a good initial guess).
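The quadratic rate is easy to observe numerically. A quick sketch with an example of my own (not from the question): $f(x) = x^2 - 2$ with root $r = \sqrt 2$, where the predicted constant is $f''(r)/(2f'(r)) = 2/(4\sqrt 2) = 1/(2\sqrt 2) \approx 0.3536$:

```python
import math

# Assumed example: f(x) = x^2 - 2 with root r = sqrt(2).  The analysis
# above predicts e_{n+1} ~ C e_n^2 with C = f''(r)/(2 f'(r)) = 1/(2 sqrt(2)).
def f(x):
    return x * x - 2.0

def df(x):
    return 2.0 * x

r = math.sqrt(2.0)
C = 2.0 / (2.0 * 2.0 * r)            # f''(r) / (2 f'(r)) ~ 0.3536

x = 2.0                              # a reasonably good initial guess
errors = [abs(x - r)]
for _ in range(4):
    x = x - f(x) / df(x)             # Newton step
    errors.append(abs(x - r))

# the ratios e_{n+1} / e_n^2 should approach C
ratios = [errors[n + 1] / errors[n] ** 2 for n in range(4)]
print(ratios)
```

After only four steps the last ratio already agrees with $C$ to several digits, and the error is near machine precision; one more step and the quotient $e_{n+1}/e_n^2$ drowns in rounding.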
The question is what happens if the recurrence becomes
$$x_{n+1}=x_n - \frac{f(x_n)}{f'(x_0)} $$
Define $x_0 = r+e_0$. That is, $e_0$ is the error in the first guess. Using the same technique as above, I find that
$$e_{n+1}=\frac{f''(r)}{f'(r)}\,e_n \left(e_0-\frac{e_n}{2}\right) +O \left (e_n \left (e_0^2 + e_n^2 \right ) \right )$$
Now, $e_0$ is fixed. Thus we may write
$$e_{n+1}=\frac{f''(r)\, e_0}{f'(r)}\,e_n+O \left (e_n^2 \right )$$
Thus the order of convergence is $s=1$ (linear convergence) with rate constant $C=\frac{f''(r)\, e_0}{f'(r)}$, to leading order in $e_0$.
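This linear rate can be checked numerically as well. A sketch with the same assumed example, $f(x) = x^2 - 2$, $r = \sqrt 2$, freezing the derivative at $x_0 = 1.5$:

```python
import math

# Assumed example: f(x) = x^2 - 2, r = sqrt(2), with the frozen-derivative
# iteration x_{n+1} = x_n - f(x_n) / f'(x_0).  The prediction is
# e_{n+1} ~ C e_n with C = f''(r) e_0 / f'(r), to leading order in e_0.
def f(x):
    return x * x - 2.0

def df(x):
    return 2.0 * x

r = math.sqrt(2.0)
x0 = 1.5
e0 = x0 - r
C = 2.0 * e0 / (2.0 * r)             # f''(r) e_0 / f'(r) ~ 0.0607

slope = df(x0)                       # f'(x_0), frozen for the whole iteration
x = x0
errors = [abs(x - r)]
for _ in range(8):
    x = x - f(x) / slope
    errors.append(abs(x - r))

ratio = errors[-1] / errors[-2]      # observed linear rate e_{n+1} / e_n
print(ratio, C)
```

As $e_n \to 0$ the observed ratio settles at $1 - f'(r)/f'(x_0)$, the derivative of the iteration map at $r$; expanding that quantity in $e_0$ recovers $C = f''(r)\,e_0/f'(r)$ as its leading term, so the printed values agree up to $O(e_0^2)$.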