Is the Newton-Raphson method applicable to solving rational equations, or any equation other than polynomial equations?
[Math] Application of Newton-Raphson method
numerical methods
Related Solutions
Newton's method is: given an initial guess $x_0$ for a root of $f(x)=0$, iterate $x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$. In higher dimensions there is a straightforward analogue. So in your case, define $$f\left(\left[ \begin{array}{c} x \\ y \end{array}\right]\right)=\left[\begin{array}{c} f_1(x,y) \\ f_2(x,y) \end{array}\right]=\left[\begin{array}{c} \sin(3x)+\sin(3y) \\ \sin(5x)+\sin(5y) \end{array}\right]$$
so you feed in a vector of size two and your $f$ returns a vector of size two. The derivative is now the $2\times 2$ Jacobian matrix $$J=\left[\begin{array}{cc} \frac{\partial f_1}{\partial x} & \frac{\partial f_1}{\partial y} \\ \frac{\partial f_2}{\partial x} & \frac{\partial f_2}{\partial y} \end{array}\right].$$
The only thing to be careful about is that the operations are now vector operations: dividing by $f'(x_n)$ becomes multiplying by the inverse of the Jacobian, followed by a matrix-vector multiply and a vector subtraction. So the full iteration is $$\left[ \begin{array}{c} x_{n+1} \\ y_{n+1} \end{array}\right]=\left[ \begin{array}{c} x_n \\ y_n \end{array}\right]-\left[\begin{array}{cc} \frac{\partial f_1}{\partial x} & \frac{\partial f_1}{\partial y} \\ \frac{\partial f_2}{\partial x} & \frac{\partial f_2}{\partial y} \end{array}\right]^{-1}_{(x_n,y_n)} f \left(\left[ \begin{array}{c} x_n \\ y_n \end{array}\right]\right)$$
So the Jacobian is evaluated at the point $(x_n,y_n)$, inverted, and multiplied by $f$; note that the matrix multiplies on the left. This generalizes to any dimension in exactly the same manner.
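As a sketch of this in code, using NumPy: the starting point and the particular root it converges to are my own choices for illustration, and solving the linear system $J\,s = f$ replaces forming $J^{-1}$ explicitly, which is numerically preferable.

```python
# Two-dimensional Newton iteration for the system
#   f1(x,y) = sin(3x) + sin(3y) = 0
#   f2(x,y) = sin(5x) + sin(5y) = 0
import numpy as np

def f(v):
    x, y = v
    return np.array([np.sin(3*x) + np.sin(3*y),
                     np.sin(5*x) + np.sin(5*y)])

def jacobian(v):
    x, y = v
    return np.array([[3*np.cos(3*x), 3*np.cos(3*y)],
                     [5*np.cos(5*x), 5*np.cos(5*y)]])

def newton(v0, tol=1e-12, max_iter=50):
    v = np.asarray(v0, dtype=float)
    for _ in range(max_iter):
        # Solve J(v) * step = f(v) instead of inverting J explicitly.
        step = np.linalg.solve(jacobian(v), f(v))
        v = v - step
        if np.linalg.norm(step) < tol:
            break
    return v
```

Started near $(1.1,\,0.1)$, this converges to the isolated root $(11\pi/30,\,\pi/30)$; note that the system also has whole lines of solutions (e.g. $x - y = \pi$), along which the Jacobian is singular and the iteration cannot be used.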
Let's look at the iterations you get from each.
With $f(x) = x - e^{-x}$, you get
$$F(x) = x - \frac{f(x)}{f'(x)} = x - \frac{x-e^{-x}}{1+e^{-x}} = \frac{x + xe^{-x} - x + e^{-x}}{1+e^{-x}} = \frac{x+1}{e^x+1}.$$
That's a nice function, defined on all of $\mathbb{R}$, and it gets you close to the solution very fast when you start at a non-negative $x_0$; for negative $x$ it is not as good, as there it approaches the fixed point slowly at first.
With $g(x) = xe^x - 1$, you get
$$G(x) = x - \frac{g(x)}{g'(x)} = x - \frac{xe^x-1}{(x+1)e^x} = \frac{x^2e^x+1}{(x+1)e^x} = \frac{x^2+e^{-x}}{x+1}.$$
That function has a pole at $-1$, and $G(x) < -1$ for $x < -1$, so the iteration cannot reach the fixed point from there. For large positive $x$, it doesn't approach the fixed point fast either (basically, it's $x \mapsto \frac{x}{x+1}\cdot x$ there).
So looking at the global behaviour, your choice behaves much better.
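A quick numerical sketch of this global contrast (the starting value $x_0 = 10$ is an arbitrary illustrative choice): iterating $F$ from $10$ lands essentially on $\alpha \approx 0.5671$ within a handful of steps, while $G$ creeps downward by less than one unit per step.

```python
# Compare the two Newton fixed-point maps for the root alpha of x = e^{-x}
# (equivalently x*e^x = 1).
import math

def F(x):  # Newton map for f(x) = x - e^{-x}
    return (x + 1) / (math.exp(x) + 1)

def G(x):  # Newton map for g(x) = x*e^x - 1
    return (x ** 2 + math.exp(-x)) / (x + 1)

def iterate(phi, x0, n=6):
    xs = [x0]
    for _ in range(n):
        xs.append(phi(xs[-1]))
    return xs
```

With `iterate(F, 10.0, 6)` the last iterate agrees with $\alpha = 0.56714329\ldots$ to machine precision, while `iterate(G, 10.0, 6)` is still above $4$.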
Now, for the local behaviour near the fixed point, the Newton iteration roughly behaves like
$$\alpha + \delta \mapsto \alpha + \frac{f''(\alpha)}{2f'(\alpha)}\delta^2$$
when $f'$ doesn't vanish in the zero of $f$, as is the case here, so let's look at the corresponding quotient for the two candidates.
$$\begin{align} \frac{g''(\alpha)}{2g'(\alpha)} &= \frac{(\alpha+2)e^\alpha}{2(\alpha+1)e^\alpha} = \frac{\alpha + 2}{2(\alpha+1)} = \frac12 + \frac{1}{2(\alpha+1)}\\ \frac{f''(\alpha)}{2f'(\alpha)} &= \frac{-e^{-\alpha}}{2(1+e^{-\alpha})} = \frac{-\alpha}{2(1+\alpha)} = \frac{1}{2(\alpha+1)} - \frac12 \end{align}$$
using $e^{-\alpha} = \alpha$ for the latter.
So $G$ approaches the fixed point from above, while $F$ approaches it from below (since $f''(\alpha) < 0 < f'(\alpha)$), and the coefficient of $\delta^2$ has smaller absolute value for $F$; that means $F$ converges faster near the fixed point $\alpha$ (though against the quadratic convergence, the difference doesn't matter much).
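These coefficients can be checked numerically; a small sketch, assuming only the closed forms of $F$ and $G$ derived above (here $\alpha$ is obtained by plain fixed-point iteration of $x \mapsto e^{-x}$, which converges since the derivative at the fixed point has magnitude $e^{-\alpha} = \alpha < 1$):

```python
# Verify that F(alpha+d) - alpha ~ cF*d^2 and G(alpha+d) - alpha ~ cG*d^2,
# with cF = 1/(2(alpha+1)) - 1/2 < 0 and cG = 1/2 + 1/(2(alpha+1)) > 0.
import math

def F(x):
    return (x + 1) / (math.exp(x) + 1)

def G(x):
    return (x ** 2 + math.exp(-x)) / (x + 1)

# Fixed-point iteration for alpha, the solution of x = e^{-x}.
alpha = 0.5
for _ in range(200):
    alpha = math.exp(-alpha)

cF = 1 / (2 * (alpha + 1)) - 0.5
cG = 0.5 + 1 / (2 * (alpha + 1))

d = 1e-4
ratio_F = (F(alpha + d) - alpha) / d ** 2
ratio_G = (G(alpha + d) - alpha) / d ** 2
```

The computed ratios match $c_F \approx -0.181$ and $c_G \approx 0.819$ up to an $O(\delta)$ Taylor remainder, confirming both the signs and the smaller magnitude for $F$.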
So, altogether, yours was the better choice globally (converges to the solution from all starting points), and locally (converges faster near the fixed point).
Best Answer
Newton's method is applicable to any differentiable real-valued function, not just polynomials, so rational equations are covered (as long as the iterates stay away from the poles). However, it does not work all the time: there are many possible reasons for Newton's method to fail, such as the derivative not existing at the root, or a starting point too far from the root.
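To make the failure mode concrete, here is a small sketch (the function $f(x) = x^{1/3}$ and the starting point are my own illustrative choices): its only root is $0$, but $f'$ blows up there, and the Newton step simplifies to $x_{n+1} = -2x_n$, so the iterates oscillate in sign and diverge.

```python
# Newton's method on f(x) = x**(1/3): the root is 0, but
# f'(x) = (1/3) * x**(-2/3) is undefined at 0, and f(x)/f'(x) = 3x,
# so each Newton step is x -> x - 3x = -2x.
def newton_step_cbrt(x):
    return x - 3 * x  # f(x)/f'(x) = 3x for f(x) = x**(1/3)

x = 0.1
history = [x]
for _ in range(5):
    x = newton_step_cbrt(x)
    history.append(x)
# The iterates alternate sign and double in magnitude, moving away from the root.
```

After five steps the iterate is roughly $0.1 \cdot (-2)^5 = -3.2$, thirty-two times farther from the root than the starting point.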