[Math] Fastest Square Root Algorithm

algorithms · computational-mathematics · convergence-divergence · roots · sequences-and-series

(edit, 9 years later… hello smart contract developers, I know that's why you're here lol)

What is the fastest algorithm for finding the square root of a number?

I created one that can find the square root of $987654321$ to $16$ decimal places in just $20$ iterations.

I've now tried Newton's method as well as my own method (Newton's method code is shown below).

What is the fastest known algorithm for taking the second root of a number?

My code for Newton's method (*Edit: there was an error in my code; it is fixed in the comments below*):

    a = 2          //2nd root
    b = 987654321  //base (matches the number in the question)
    n = 1.0        //initial guess
    c = 0          //current iteration counter
    r = 500000     //total number of iterations to run
    while (c < r)
    {
        m = n - ((n^a - b) / (a * n^(a-1)))  //Newton's step: x - f(x)/f'(x), with f(x) = x^a - b
        n = m
        c++;
        trace(m + "  <--guess   ...   iteration-->  " + c)
    }
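For anyone who wants to run the loop directly, here is a sketch of the same iteration in Python (the function name `newton_root` is mine, not from the original post):

```python
# Newton's method for the a-th root of b, mirroring the pseudocode above:
# x_{n+1} = x_n - (x_n^a - b) / (a * x_n^(a-1))

def newton_root(b, a=2, x0=1.0, iterations=60):
    """Approximate the a-th root of b by iterating Newton's step."""
    x = float(x0)
    for _ in range(iterations):
        x = x - (x**a - b) / (a * x**(a - 1))
    return x

print(newton_root(987654321))  # approximately 31426.968
```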

Best Answer

If you use Halley's method, it exhibits cubic convergence. It is the second method in the class of Householder's methods.

Halley's method is: $$ x_{n+1} = x_n - \frac{2f(x_n)f'(x_n)}{2[f'(x_n)]^2-f(x_n)f''(x_n)} $$ If we let $f(x) = x^2 - a$, which meets the criteria (it has a continuous second derivative),

Then Halley's method is:

$$ x_{n+1} = x_n - \frac{2x_n^3 - 2ax_n}{3x_n^2 + a} $$ which simplifies to: $$ x_{n+1} = \frac{x_n^3 + 3ax_n}{3x_n^2 + a} $$ I will also add this document, which discusses extensions of the Newtonian method.
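As a quick sanity check (my own sketch, not part of the original answer; the name `halley_sqrt` is mine), the simplified update is easy to iterate:

```python
# Halley's method specialized to f(x) = x^2 - a:
# x_{n+1} = (x_n^3 + 3*a*x_n) / (3*x_n^2 + a)

def halley_sqrt(a, x0=1.0, iterations=40):
    """Approximate sqrt(a) with the simplified Halley iteration."""
    x = float(x0)
    for _ in range(iterations):
        x = (x**3 + 3 * a * x) / (3 * x**2 + a)
    return x

print(halley_sqrt(987654321))  # approximately 31426.968
```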

There exists an extension due to Potra and Pták called the "two-step method" that may be rewritten as the iterative scheme $$x_{n+1} = x_n - \frac{f(x_n)+f\left(x_n - \frac{f(x_n)}{f'(x_n)}\right)}{f'(x_n)}$$ which converges cubically in some neighborhood of the root $x^{*}$ and does not require computing the second derivative.
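To illustrate (my own sketch; the function and variable names are mine), the two-step scheme specialized to $f(x) = x^2 - a$ looks like:

```python
# Potra-Ptak two-step method for f(x) = x^2 - a:
#   y_n     = x_n - f(x_n)/f'(x_n)              (ordinary Newton step)
#   x_{n+1} = x_n - (f(x_n) + f(y_n))/f'(x_n)   (corrected step)

def potra_ptak_sqrt(a, x0, iterations=30):
    """Approximate sqrt(a); note no second derivative is needed."""
    f = lambda x: x * x - a
    df = lambda x: 2.0 * x
    x = float(x0)
    for _ in range(iterations):
        y = x - f(x) / df(x)               # Newton predictor
        x = x - (f(x) + f(y)) / df(x)      # Potra-Ptak corrector
    return x

print(potra_ptak_sqrt(987654321, x0=987654321))  # approximately 31426.968
```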

See *On Newton-type methods with cubic convergence* for more information on this topic.

As Hurkyl and others have noted, your best bet is to just use Newton's method. These alternative methods generally require more operations per iteration, so they are rarely worth the extra computational cost, but they make a good comparison.
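For a rough sense of that trade-off, here is a hypothetical comparison (my own sketch, not from the answer) counting how many iterations each method needs to reach $10^{-12}$ relative error for $\sqrt{987654321}$ from the same starting guess:

```python
import math

def count_iters(step, a, x0=1.0, tol=1e-12, max_iters=200):
    """Iterations until the relative error against math.sqrt(a) drops below tol."""
    x, target = float(x0), math.sqrt(a)
    for i in range(1, max_iters + 1):
        x = step(x, a)
        if abs(x - target) / target < tol:
            return i
    return max_iters

newton = lambda x, a: x - (x * x - a) / (2.0 * x)           # one Newton step
halley = lambda x, a: (x**3 + 3 * a * x) / (3 * x**2 + a)   # one Halley step

print("Newton:", count_iters(newton, 987654321))
print("Halley:", count_iters(halley, 987654321))
```

Halley finishes in fewer iterations, but each iteration costs more arithmetic, which is the point of the remark above.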