Most numerical packages give you the option of either computing the Jacobian yourself and passing it to the solver, or of approximating it numerically with a finite difference scheme. I imagine that in general, when performing Newton's method or other methods expressed in terms of an inverse Jacobian, these packages do not actually compute the inverse, for reasons of stability and cost. Instead, they solve the linear system $J(x_n)\, x_{n+1} = J(x_n)\, x_n - f(x_n)$ for $x_{n+1}$ at each iteration.
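A minimal sketch of one such step (not any particular package's implementation; the example system and starting point are my own choices), using NumPy's `linalg.solve` rather than forming $J^{-1}$:

```python
import numpy as np

def newton_step(f, J, x):
    """One Newton step: solve J(x) @ delta = -f(x) for the update,
    which is equivalent to J(x) @ x_next = J(x) @ x - f(x)."""
    delta = np.linalg.solve(J(x), -f(x))
    return x + delta

# Example system f(x, y) = (x^2 + y^2 - 2, x - y), with a root at (1, 1).
f = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 2.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]],
                        [1.0, -1.0]])

x = np.array([2.0, 0.5])
for _ in range(20):
    x = newton_step(f, J, x)
```

Solving the linear system costs the same $O(n^3)$ factorization as inversion would, but avoids the extra rounding error of forming and multiplying by $J^{-1}$ explicitly.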
The Jacobian is not always invertible, and Newton's method requires it to be invertible at every iterate. You can see this already in one dimension, with a function $f:\mathbb{R} \to \mathbb{R}$ that has a critical point which is not an extremum: at such a point $f'(x) = 0$, so the Newton step $-f(x)/f'(x)$ is undefined.
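A tiny illustration of the one-dimensional failure (the function here is my own choice): for $f(x) = x^3 + 1$, the point $x = 0$ is a critical point that is not an extremum of $f'$-vanishing type we can step through, and the Newton update breaks down there.

```python
# Newton on f(x) = x^3 + 1 (root at x = -1) fails at x = 0, where
# f'(0) = 0: the "1-by-1 Jacobian" is not invertible at that point.
f = lambda x: x ** 3 + 1.0
df = lambda x: 3.0 * x ** 2

def newton_step(x):
    return x - f(x) / df(x)

try:
    newton_step(0.0)  # f(0) = 1, f'(0) = 0: division by zero
    step_failed = False
except ZeroDivisionError:
    step_failed = True
```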
Partial answer: neither condition is sufficient. Let $f:\mathbb{R} \to \mathbb{R}$ be twice differentiable. Let $a\in (0,\pi /6)$ and $b=\pi /4.$ Let $f(x)=-\sqrt 3 /2+\cos x$ for $x\in [a,b].$ If $a$ is close enough to $0$ and the starting point $x_1$ is close enough to $a$, then the slope $f'(x_1)=-\sin x_1$ will be so close to $0$ that $x_2=x_1-f(x_1)/f'(x_1) >b.$ The behavior of $f$ on $(b,\infty)$ is unconstrained (except that $f$ is twice differentiable on $\mathbb{R}$). For example, we could have $f'(x_2)=0\ne f(x_2),$ and then the next iterate $x_3$ doesn't exist. So if some iterate $x_n$ can fail to lie in $[a,b],$ then conditions on $f(x),f'(x),$ and $f''(x)$ for $x\in [a,b]$ alone will not suffice.
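A quick numerical check of the escape from $[a,b]$ described above (the particular value of $x_1$ is my own choice for illustration): with $x_1$ close to $0$, the slope $-\sin x_1$ is tiny, so the first Newton step overshoots far past $b = \pi/4$.

```python
import math

# f(x) = -sqrt(3)/2 + cos(x), as in the counterexample.
f = lambda x: -math.sqrt(3) / 2 + math.cos(x)
df = lambda x: -math.sin(x)

b = math.pi / 4
x1 = 1e-3                    # starting point close to a, hence close to 0
x2 = x1 - f(x1) / df(x1)     # f(x1) ~ 0.134, f'(x1) ~ -0.001 => x2 ~ 134
```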
For differentiable $f$:
One well-known sufficient set of conditions is: (i) $f(a)<0<f(b);$ (ii) $f'(a)>0$ and $f'$ is increasing (not necessarily strictly increasing) on $[a,b];$ (iii) $a-f(a)/f'(a)\leq b.$ Then there is a unique $x_0\in (a,b)$ such that $f(x_0)=0.$ Let $x^*=x-f(x)/f'(x).$ If $x\in [a,x_0)$ then $x^*\in [x_0,b].$ If $x\in (x_0,b]$ then $x^*\in [x_0,x).$
Another well-known sufficient set of conditions is: (i) $f(a)<0<f(b);$ (ii) $f$ is monotonic on $[a,b];$ (iii) $f$ is concave on $[a,x_0)$ and convex on $(x'_0,b],$ where $x_0=\min \{x\in [a,b]: f(x)=0\}$ and $x'_0=\max \{x\in [a,b]:f(x)=0\}.$ If $x\in [a,x_0)$ then $x^*\in (x,x_0].$ If $x\in (x'_0,b]$ then $x^*\in [x'_0,x)$ (with $x^*$ as in the previous paragraph).
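A sanity check of the second set of conditions on an example of my own choosing: $f(x)=x^3$ on $[a,b]=[-1,1]$ satisfies $f(-1)=-1<0<1=f(1)$, $f$ is monotonic, concave on $[-1,0)$ and convex on $(0,1]$, with $x_0=x'_0=0$. Starting in $(x'_0,b]$, the iterates should decrease monotonically toward $0$ (only linearly here, since $f'(0)=0$):

```python
# Newton on f(x) = x^3 from x = 1: each step is x -> x - x/3 = 2x/3,
# so the iterates decrease monotonically toward the root at 0.
f = lambda x: x ** 3
df = lambda x: 3.0 * x ** 2

x = 1.0
iterates = []
for _ in range(20):
    x = x - f(x) / df(x)
    iterates.append(x)
```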
Proof for the 1st set of conditions:
- $f'>0$ on $[a,b]$ so $f$ is strictly increasing and continuous on $[a,b]$ with $f(a)<0<f(b).$ This implies the existence and uniqueness of $x_0.$
- If $x\in (a,x_0)$ then $f(a)<f(x)<0$ and $f'(x)\geq f'(a)>0.$ So the line through the point $(x,f(x))$ with slope $f'(x)$ will intersect the $x$-axis at $(x^*,0),$ while the line through $(a,f(a))$ with slope $f'(a)$ will intersect the $x$-axis at $(a^*,0),$ where $x<x^*<a^*\leq b.$
- To show that $x\in [a,x_0)\implies f(x^*)\geq 0$: suppose instead that $f(x^*)<0.$ By the mean value theorem, $(f(x^*)-f(x))/(x^*-x)=f'(y)$ for some $y\in (x,x^*).$ Since $x<x^*$ and $f$ is increasing, $f(x)<f(x^*),$ so $$f(x)<f(x^*)<0\implies 0<\frac {f(x^*)-f(x)}{-f(x)}<1 . $$ But then, since $x^*-x=-f(x)/f'(x),$ we have $y>x$ and $$f'(y)=\frac {f(x^*)-f(x)}{x^*-x}=\frac {f(x^*)-f(x)}{-f(x)/f'(x)}= \frac {f(x^*)-f(x)}{-f(x)} \cdot f'(x)<f'(x),$$ contradicting the fact that $f'$ is increasing on $[a,b].$
- To show that $x\in (x_0,b]\implies x^*\in [x_0,x)$: The slope $S$ of the line joining $(x_0,f(x_0))=(x_0,0)$ to $(x,f(x))$ is equal to $f'(y)$ for some $y\in (x_0,x),$ so $0<S\leq f'(x).$ So the line through $(x,f(x))$ with slope $f'(x)$ will intersect the $x$-axis at $(x^*,0),$ with $x_0\leq x^*<x.$
- Finally, if $x_1\in [a,b]$ and $x_{n+1}=x_n^*,$ then $x_0\leq x_{n+1}\leq x_n\leq b$ for $n\geq 2.$ So the decreasing, bounded sequence $(x_n)_{n\in \mathbb{N}}$ converges to a limit $y\in [a,b].$ And since $0<f'(x)\leq f'(b)$ for all $x\in [a,b],$ we have $$0=\lim_{n\to \infty} |x_{n+1}-x_n|= \lim_{n\to \infty}|f(x_n)/f'(x_n)|\geq \lim_{n\to \infty}|f(x_n)|/f'(b)=|f(y)|/f'(b).$$ So $f(y)=0.$
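The first set of conditions can be checked numerically on an example of my own choosing: $f(x)=x^2-2$ on $[a,b]=[1,2]$ satisfies $f(1)=-1<0<2=f(2)$, $f'(x)=2x$ is positive and increasing, and $a-f(a)/f'(a)=1.5\leq b$. From $x_1=a$, the iterates should jump to the right of $x_0=\sqrt 2$ and then decrease monotonically to it.

```python
import math

# Newton on f(x) = x^2 - 2 starting at the left endpoint a = 1.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x

x = 1.0
iterates = []
for _ in range(5):
    x = x - f(x) / df(x)
    iterates.append(x)
# iterates: 1.5, then monotonically decreasing toward sqrt(2)
```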
If you don't know $m$, you could always use what Burden and Faires call the "modified Newton's Method," which is to apply Newton's method to $\mu(x) = f(x)/f'(x)$.
But both the modified Newton's method and the method you propose will suffer from the same practical problem. As $x_n$ gets close to the root, $f'(x_n)$ gets very close to zero, so $f(x_n)/f'(x_n)$ is very close to $0/0$. While the quotient isn't theoretically undefined, the numerical errors will be large, and eventually your iterates will be NaN ("not a number").
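A sketch of the modified method (my own illustration, not the Burden and Faires code): applying Newton to $\mu(x)=f(x)/f'(x)$ gives the update $x_{n+1}=x_n-\dfrac{f(x_n)f'(x_n)}{f'(x_n)^2-f(x_n)f''(x_n)}$, which restores fast convergence at a double root where plain Newton only halves the error each step. The guard against evaluating the step at the root itself reflects the $0/0$ problem described above.

```python
# Compare plain Newton vs. modified Newton on f(x) = (x - 1)^2,
# which has a double root at x = 1.
f   = lambda x: (x - 1.0) ** 2
df  = lambda x: 2.0 * (x - 1.0)
d2f = lambda x: 2.0

def plain_step(x):
    return x - f(x) / df(x)

def modified_step(x):
    # Newton applied to mu(x) = f(x)/f'(x)
    return x - f(x) * df(x) / (df(x) ** 2 - f(x) * d2f(x))

x_plain = 2.0
for _ in range(5):
    x_plain = plain_step(x_plain)       # error halves each step

x_mod = 2.0
for _ in range(5):
    if f(x_mod) == 0.0:                 # at the root the update is 0/0
        break
    x_mod = modified_step(x_mod)
```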
Anyway, if you do want to use the method you propose, then I think the best way to find $m$ is to try the accelerated Newton's method $x_{n+1}=x_n-m\,f(x_n)/f'(x_n)$ for different values of $m$, and see which one converges fastest.
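A sketch of that trial-and-error approach (the test function and tolerance are my own choices): run the accelerated iteration for several candidate values of $m$ on a root of known multiplicity and count iterations until $|f|$ falls below a tolerance.

```python
# f(x) = (x - 2)^3 has a root of multiplicity 3 at x = 2.
f  = lambda x: (x - 2.0) ** 3
df = lambda x: 3.0 * (x - 2.0) ** 2

def iterations_needed(m, x=3.0, tol=1e-12, max_iter=200):
    """Count accelerated-Newton iterations until |f(x)| < tol."""
    for n in range(max_iter):
        if abs(f(x)) < tol:
            return n
        x = x - m * f(x) / df(x)
    return max_iter

counts = {m: iterations_needed(m) for m in (1, 2, 3)}
# The correct multiplicity m = 3 needs by far the fewest iterations.
```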
This is essentially equivalent to the method you propose, that is, finding out how $f(x)/f'(x)$ compares to $(x-\alpha)/m$ for $x$ close to $\alpha$. This is because your method requires knowing what $\alpha$ is, and the only approximations to $\alpha$ you have are the iterates of your algorithm.