This is an answer based on my comments above. There is indeed an integral version of Newton's method for algebraic equations. Say you have an equation:
$$
f(x) = 0,
$$
then we can set up an initial value problem:
$$
\begin{cases}
x'(t) = \frac{\alpha}{1+t^{\beta}}f(x),
\\[3pt]
x(0) = x_0.
\end{cases}
$$
As you can see, the equilibrium solutions of the above ODE are exactly the points where $f(x) = 0$, i.e.,
$$
\lim_{t\to \infty} x(t) = r,
$$
where $r$ is a real root of $f(x)$.
In that ODE, $\alpha$ can be positive or negative depending on the sign of $f(x_0)$, and we would like the solution to decay to the equilibrium quickly, e.g. by choosing $\beta = 1$.
This method has a notable advantage: it is far less sensitive to the choice of initial guess.
Let's use the infamous $x^3 - 2x + 2 = 0$ as an example: if your initial guess is $1$ or $0$, Newton's method ends up oscillating forever between $1$ and $0$ (see the Wikipedia entry for Newton's method).
Using the ODE approach, set up the following initial value problem with initial guess $0$, and $\alpha = -1$, $\beta = -1$ (so the coefficient is $\frac{\alpha}{1+t^{\beta}} = -\frac{t}{1+t}$):
$$
\begin{cases}
x'(t) = -\frac{t}{1+t }(x^3 -2x +2),
\\[3pt]
x(0) = 0.
\end{cases}
$$
Choosing time step $h = 0.05$ (with, say, the forward Euler method), we can see that the solution $x(t)$ converges quite fast to the equilibrium $x_e \approx -1.7692923542386314152$, which is the real root of $x^3 - 2x + 2 = 0$.
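For concreteness, here is a minimal numerical sketch of that run in Python; the integrator (forward Euler) and the integration horizon are my assumptions, since only $h = 0.05$ is specified above:

```python
def f(x):
    return x**3 - 2*x + 2

# Forward Euler on x'(t) = -t/(1+t) * f(x(t)), x(0) = 0
h = 0.05                  # time step from the text
x, t = 0.0, 0.0
for _ in range(2000):     # integrate up to t = 100 (my choice)
    x += h * (-t / (1 + t)) * f(x)
    t += h

print(x)  # ~ -1.7692923542386314, the real root of x^3 - 2x + 2
```

Note that, in contrast to Newton's iteration $x_{k+1} = x_k - f(x_k)/f'(x_k)$, no derivative of $f$ is needed here.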
The philosophy behind this is: whether we work with integration or differentiation, we just need a contraction on the space where the solution lives, and this contraction must be "good", so that we can get a good approximation after a few iterations.
Consider a first order linear differential equation $y'+Py=Q,\,y(a)=y_0$. Here continuity of $P$ and $Q$ ensures that the ODE has a unique solution. But in the case of a non-linear initial value problem, i.e. $y'=f(x,y),\,y(x_0)=y_0$, continuity of $f$ does not ensure a unique solution. So we need to address the following questions:
(1) Under what conditions on $f$ does the problem $y'=f(x,y),\,y(x_0)=y_0$ have a solution?
(2) If a solution exists, is it unique?
$\to$ The first question is answered by the Peano existence theorem, which states: "if $f$ is continuous in a region containing the point $(x_0,y_0)$, then the problem $y'=f(x,y),\,y(x_0)=y_0$ has a solution".
$\to$ The second question is answered by Picard's uniqueness theorem, which states: "if $f$ and $\frac{\partial f}{\partial y}$ are continuous in a region $R$ containing the initial point $(x_0,y_0)$, then the problem $y'=f(x,y),\,y(x_0)=y_0$ has a unique solution".
Picard's method and the interval of definition: let $f$ and $\frac{\partial f}{\partial y}$ be continuous in the closed rectangle $$R=\{(x,y):|x-x_0|\leq a,\,|y-y_0|\leq b\}.$$ Then the IVP $y'=f(x,y),\,y(x_0)=y_0$ has a unique solution on the interval $|x-x_0|\leq h=\min\left(a,\frac{b}{l}\right)$, where $l=\max_{(x,y)\in R}|f(x,y)|$.
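To make the successive approximations concrete, here is a small symbolic sketch of the Picard iterates $y_{n+1}(x)=y_0+\int_{x_0}^{x} f(s,y_n(s))\,ds$; the particular IVP $y'=y,\,y(0)=1$ is my own illustrative choice:

```python
import sympy as sp

x, s = sp.symbols('x s')
x0, y0 = 0, 1
f = lambda t, y: y        # illustrative IVP: y' = y, y(0) = 1

# Picard iterates: y_{n+1}(x) = y0 + integral from x0 to x of f(s, y_n(s)) ds
y = sp.Integer(y0)
for n in range(5):
    y = y0 + sp.integrate(f(s, y.subs(x, s)), (s, x0, x))
    print(y)
# Prints 1 + x, 1 + x + x**2/2, ...: the Taylor partial sums of exp(x),
# which is the unique solution guaranteed by Picard's theorem.
```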
Hope this will help you!!!
Best Answer
Let's take a simple analogy: the equation $ax=\tan x$ can be seen as a fixed-point equation with iteration $x_{k+1}=\frac1{a}\tan x_k$. However, in general this iteration does not converge or even stay bounded. Now, with a simple transformation using the inverse tangent, one gets a family of fixed-point iterations $x_{k+1}=n\pi+\arctan(a x_k)$, each of which converges very nicely for every $n\ne 0$.
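A quick numerical sketch of the contrast; the value $a = 2$ and the starting point are my own illustrative choices:

```python
import math

a = 2.0                  # illustrative parameter
x = 1.2                  # arbitrary starting point

# Naive form x_{k+1} = tan(x_k)/a: the iterates wander and never settle
for _ in range(20):
    x = math.tan(x) / a
print("naive:      ", x)

# Transformed form x_{k+1} = n*pi + arctan(a*x_k): a strong contraction
n, x = 1, 1.2
for _ in range(20):
    x = n * math.pi + math.atan(a * x)
print("transformed:", x, "residual:", a * x - math.tan(x))  # residual ~ 0
```

The transformed map has derivative $\frac{a}{1+a^2x^2}$, which is small near the fixed point, so each step shrinks the error by a large factor.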
With differential equations it is the same: any fixed-point form of the equation $x'=v(t,x)$ that contains the derivative carries with it an operation that is unbounded; even worse, you would have difficulty finding a space on which it could be a fixed-point iteration. Inverting the differentiation by integration, $x(t)=x(0)+\int_0^t v(s,x(s))\,ds$, results in a fixed-point operator that maps continuous functions to continuous functions, so the space question is trivially solved. The "smoothing" properties of the integration also provide the contraction property that one needs for convergence.
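Here is a minimal sketch of that integral fixed-point operator, discretized on a grid with the trapezoidal rule; the test problem $x'=-x,\,x(0)=1$, with known solution $x(t)=e^{-t}$, is my own choice:

```python
import numpy as np

# Discretized Picard iteration for x' = v(t, x), x(0) = x0.
v = lambda t, x: -x              # test problem: fixed point is exp(-t)
t = np.linspace(0.0, 1.0, 201)
x0 = 1.0

x = np.full_like(t, x0)          # initial guess: the constant function x0
for k in range(8):
    g = v(t, x)
    # x_new(t) = x0 + integral from 0 to t of v(s, x(s)) ds (trapezoidal rule)
    x = x0 + np.concatenate(([0.0],
            np.cumsum(0.5 * (g[1:] + g[:-1]) * np.diff(t))))
    print(k, np.max(np.abs(x - np.exp(-t))))  # sup-norm error shrinks each pass
```

Each pass maps one continuous (here: grid) function to the next, and the printed sup-norm errors shrink roughly like $1/(k+1)!$, exactly the contraction behavior the Picard iteration promises on a short enough interval.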