[Math] Understanding accuracy of Newton’s Method

logarithms, numerical-methods, roots

In a numerical analysis book I'm reading it says that using the Newton error formula we can find an expression for the number of correct digits in an approximation using Newton's Method.

Here's the derivation, starting with the Newton error formula:

$$|\alpha - x_{n+1}| = \frac{1}{2}(\alpha - x_n)^2\left|\frac{f''(\epsilon_n)}{f'(x_n)}\right|$$

taking $\log_{10}$ of both sides:

$$\log_{10}|\alpha - x_{n+1}| = 2\log_{10}|\alpha - x_n| + \log_{10}\left|\frac{f''(\epsilon_n)}{2f'(x_n)}\right| = 2\log_{10}|\alpha - x_n| + b_n$$

It then states that
$$d_n = \log_{10}|\alpha - x_n|$$
can be interpreted as the number of correct digits in the approximation (so long as the error is less than 1), and that the middle equation shows that (ignoring the $b_n$ term) this doubles on each iteration.
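For concreteness (an illustrative number of mine, not from the book): if $|\alpha - x_n| = 10^{-3}$, then $d_n = -3$, and doubling gives

$$d_{n+1} \approx 2d_n = -6, \qquad\text{i.e.}\qquad |\alpha - x_{n+1}| \approx 10^{-6},$$

so roughly 3 correct digits become roughly 6.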

Unfortunately, statements involving logarithms often don't feel intuitive to me, so can someone explain:

  1. Why $d_n = \log_{10}|\alpha - x_n|$ gives the number of correct decimal digits.
  2. How the middle equation shows that this doubles on each iteration.

Best Answer

To answer your first question, consider that $\text{ceil}(\log_{10} a)$ gives you an estimate of the number of digits of $a$. To wit,

$$\log_{10} 15 \approx 1.176$$ $$\log_{10} 156 \approx 2.193$$ $$\log_{10} 5632 \approx 3.751$$

In other words, 5632 is 10 raised to a number that is not quite 4 (since $10^4 = 10000$).
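As a quick sanity check (a throwaway Python sketch of mine, not part of the original answer), we can compare $\text{ceil}(\log_{10} a)$ with the actual digit count:

```python
import math

# ceil(log10(a)) as a digit-count estimate; the exact count for a
# positive integer is floor(log10(a)) + 1, which agrees with the
# ceiling except at exact powers of 10 (e.g. a = 100).
for a in (15, 156, 5632):
    print(a, math.ceil(math.log10(a)), len(str(a)))
```

For these values the estimate matches the true counts of 2, 3, and 4 digits; at $a = 100$ it would undercount by one.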

So, taking $\log_{10}$ of the error $|\alpha - x_n|$ gives an estimate of its order of magnitude. Since the error is less than 1, $d_n$ is negative, and its magnitude $|d_n|$ counts the number of correct decimal digits in the approximation.

To answer your second question: ignoring the $b_n$ term, the middle equation reads $d_{n+1} \approx 2d_n$. Since the $\log_{10}|\alpha - x_n|$ term has a coefficient of 2, the number of correct digits doubles on each iteration.
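To see both points numerically, here is a small Python sketch (my own example, using $f(x) = x^2 - 2$ with root $\sqrt{2}$; any smooth $f$ with a simple root behaves the same way) that prints $-d_n = -\log_{10}|\alpha - x_n|$ at each step:

```python
import math

# Newton's method on f(x) = x^2 - 2, whose positive root is sqrt(2).
f  = lambda x: x * x - 2
df = lambda x: 2 * x

alpha = math.sqrt(2)
x = 1.5  # initial guess
for n in range(5):
    err = abs(alpha - x)
    if err == 0:  # hit the root to machine precision
        break
    # -log10(err) ~ number of correct decimal digits (for err < 1)
    print(f"n={n}  correct digits ~ {-math.log10(err):.2f}")
    x -= f(x) / df(x)
```

The printed digit counts roughly double each step (about 1.1, 2.6, 5.7, 11.8 here) until machine precision is reached.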