First question: We look at the equation $x=g(x)=x^2+x-4$. The solutions are $x=\pm 2$. If we are phenomenally lucky, and, say, make the initial estimate $x=2$, everything is fine. Now let's see what happens if we make an initial estimate which is good but not dead on. Say we are using fixed point iteration to estimate the positive root $x=2$.
The problem is that $x=2$ is a repelling fixed point. The reason it is repelling is that $g'(x)=2x+1$, so near $x=2$, $|g'(x)|$ is near $5$, quite a bit greater than $1$.
Roughly speaking, if we are dealing with a "nice" function, and the derivative at the root has absolute value substantially less than $1$, the root will be an attracting fixed point. In that case, if we start close enough to the root, fixed point iteration sucks us into the root. But if the absolute value of the derivative is greater than $1$, we are driven away from the root.
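To see the repulsion concretely, here is a minimal Python sketch of the iteration (the starting value $2.001$ and the number of steps are arbitrary illustrative choices):

```python
def g(x):
    return x**2 + x - 4

# Fixed point iteration: repeatedly apply g, starting very close to the root x = 2.
x = 2.001
for n in range(6):
    print(n, x)
    x = g(x)
```

Even though the initial error is only $0.001$, the iterates flee from $2$, with the error multiplying by roughly $5$ at each step.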
One can draw a very nice picture of the process, and see the repulsion at work. In the absence of a picture, we will use a formula.
Note that $g(2)=2$. Let $x$ be near $2$ but not equal to $2$.
We have
$$\frac{g(x)-2}{x-2}=\frac{g(x)-g(2)}{x-2}\approx g'(2)=5.$$
Thus if $x$ is "near" $2$, then
$$g(x)-2\approx 5(x-2).$$
This means that $g(x)$ is about $5$ times as far from $2$ as $x$ is from $2$.
If we have a good estimate $x_n$ for the positive root, then $x_{n+1}$, the next estimate, is $g(x_n)$, and thus $x_{n+1}$ is about $5$ times further away from the root than $x_n$ was.
The same issue arises at the negative root $x=-2$. We have $g'(-2)=-3$, so the derivative has absolute value $3$. Again, $x=-2$ is a repelling fixed point.
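As a quick numerical sanity check (the step sizes below are arbitrary), we can print the ratio $\frac{g(x)-r}{x-r}$ for $x$ approaching each root $r$ and watch it tend to $g'(2)=5$ and $g'(-2)=-3$:

```python
def g(x):
    return x**2 + x - 4

# For each root r, the difference quotient should approach g'(r).
for r in (2, -2):
    for h in (0.1, 0.01, 0.001):
        x = r + h
        print(r, h, (g(x) - r) / (x - r))
```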
Second question: For the Newton-Raphson process question, the root is of course $0$. The problem is that the derivative of $x^{1/3}$ blows up as we approach the root. In a sense, the issue is somewhat similar to the one in the first question, since Newton-Raphson can be thought of as a sophisticated form of fixed point iteration. We are doing fixed point iteration to solve $g(x)=x$, where $g(x)=x-\frac{f(x)}{f'(x)}$.
For Newton-Raphson applied to $f(x)=0$, here is a rough guideline. The process should behave fairly nicely if (i) we start close enough to the root, (ii) $f(x)$ is twice differentiable at and near the root, and (iii) $f'(x)$ is not $0$ at the root. There is another possible issue: if the root is a multiple root (silly example: $f(x)=x^4$), then the convergence will be slow.
For the particular example in the question, we can compute explicitly and see very clearly what happens.
Recall that the Newton-Raphson iteration for approximating solutions of $f(x)=0$ is given by
$$x_{n+1}=x_n -\frac{f(x_n)}{f'(x_n)}.$$
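In code this is a one-line update inside a loop. Here is a minimal generic sketch (the function name `newton`, the tolerance, and the iteration cap are illustrative choices, not anything fixed by the method):

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson iteration: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)
    return x  # may not have converged; a real implementation would signal this

# Example: approximate sqrt(2) as the positive root of f(x) = x^2 - 2.
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0))
```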
Let $f(x)=x^{1/3}$. Then, at least for positive $x$, we have $f'(x)=\frac{1}{3}x^{-2/3}$.
Substitute into the Newton-Raphson iteration. After some simplification, we obtain
$$x_{n+1}=x_n-\frac{x_n^{1/3}}{\frac{1}{3}x_n^{-2/3}}=x_n-3x_n=-2x_n.$$
Very bad news! Even if we are "lucky" enough to start with a good approximation $x_0$ to the root, say $x_0=1/1000$, after one iteration we will be $2$ times as far away from the truth, after two iterations we will be $4$ times as far away, and so on, with the iterates flipping sign at every step.
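We can watch this happen numerically. One small caveat: in Python, `x ** (1/3)` is not the real cube root for negative `x`, and the iterates alternate in sign, so the sketch below uses a hand-rolled `cbrt` helper (the helper and the step count are my own illustrative choices):

```python
import math

def cbrt(x):
    # Real cube root, valid for negative x as well.
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def f(x):
    return cbrt(x)

def fprime(x):
    # f'(x) = (1/3) x^{-2/3}; note x^{-2/3} = (x^2)^{-1/3} is positive for x != 0.
    return (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)

x = 1 / 1000  # a very good initial approximation to the root 0
for n in range(6):
    print(n, x)
    x = x - f(x) / fprime(x)  # simplifies to x = -2 * x
```

The printed iterates double in magnitude and alternate in sign, exactly as the formula $x_{n+1}=-2x_n$ predicts.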
Best Answer
False-position method is another name for regula falsi.
The difference from the secant method is that regula falsi maintains a bracketing interval.
Yes, bracketing interval methods ensure convergence: they shrink an interval that is known, via the intermediate value theorem, to contain a root.
The secant method, if it converges to a simple root, has superlinear order of convergence equal to the golden ratio $\frac{1+\sqrt5}{2}=1.6180...$.
Bisection, if one measures progress by the length of the bracketing interval, has convergence order $1$, that is, linear convergence, with convergence rate $0.5$, since the interval is halved at every step.
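For concreteness, here is a minimal bisection sketch (the names and the tolerance are illustrative); each pass through the loop halves the bracket, which is exactly the rate-$0.5$ linear convergence just described:

```python
def bisect(f, a, b, tol=1e-10):
    """Bisection on [a, b], assuming a < b and a sign change of f."""
    fa = f(a)
    if fa * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:
            b = m            # sign change in [a, m]: keep the left half
        else:
            a, fa = m, fm    # sign change in [m, b]: keep the right half
    return (a + b) / 2

print(bisect(lambda x: x**2 - 2, 0.0, 2.0))  # ~1.41421356
```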
Regula falsi, in most cases, does not converge in the sense of the interval length shrinking to zero. In such a situation one endpoint stays fixed, while the other converges to the root linearly, with order $1$, and most likely with a very unfortunate convergence rate close to $1$.
That is why anti-stalling versions of the method exist. For instance, the Illinois modification shrinks the interval systematically towards zero length and cubes the error every $3$ steps in a quite periodic pattern, so it has a superlinear order of convergence of $\sqrt[3]3=1.4422...$ per step. One might claim that this is still faster than Newton's method, whose order $2$ per step corresponds to $\sqrt2=1.4142...$ per function evaluation, since each Newton step evaluates both $f$ and $f'$.
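Here is a minimal sketch of regula falsi with the Illinois modification (the variable names, tolerance, and iteration cap are my own; the halving factor $0.5$ follows the classical description of the trick):

```python
def illinois(f, a, b, tol=1e-12, max_iter=100):
    """Regula falsi with the Illinois anti-stalling modification."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    side = 0  # endpoint kept in the previous step: -1 left, +1 right
    c = a
    for _ in range(max_iter):
        # Regula falsi point: root of the secant through (a, fa) and (b, fb).
        c = (a * fb - b * fa) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            break
        if fa * fc < 0:
            b, fb = c, fc    # root in [a, c]: replace the right endpoint
            if side == -1:
                fa *= 0.5    # left endpoint kept twice in a row: dampen it
            side = -1
        else:
            a, fa = c, fc    # root in [c, b]: replace the left endpoint
            if side == +1:
                fb *= 0.5    # right endpoint kept twice in a row: dampen it
            side = +1
    return c

print(illinois(lambda x: x**2 - 2, 0.0, 2.0))  # ~1.41421356
```

The `side` variable records which endpoint survived the last step; halving the stored function value at a stalled endpoint pulls the next secant point towards it, which is all the anti-stalling trick amounts to.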
See Accuracy of approximation using linear interpolation