Algorithms – Approximating Square Root of 2

algebra-precalculus, algorithms, irrational-numbers, radicals

Well, "Solving" is the wrong term since I am speaking about irrational numbers. I just don't know which word is the correct word… So that can be part $1$ of my question… what is the correct word since you obviously can't "solve" an irrational number because it goes forever.

Part $2$ (my real question): are there algorithms for figuring out the answer to a problem like the square root of $2$, other than guess-and-checking your way to infinity? Again, I'm obviously not asking for an algorithm that gives me the never-ending answer, because that's crazy, but, for example, if I wanted to know what the $15^{th}$ decimal place of the square root of $2$ was, is there an algorithm for that?

Thank you! (I'm new here and know nothing about how to format math questions so any help or links would be appreciated as well, thanks!)

Best Answer

You can use Newton's method to compute the digits of $\sqrt{2}$:
Let: $$ f(x) = x^2 -2 $$ Define the iteration: $$ x_0 = 1\\ x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} $$ This will converge to $\sqrt{2}$ quadratically.
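Here is a minimal sketch of that iteration in Python (the function name and the number of steps are my own choices, not part of the answer):

```python
# Newton's method applied to f(x) = x^2 - 2, starting at x_0 = 1.

def sqrt2_newton(steps=6):
    """Return the Newton iterate x_steps for f(x) = x^2 - 2 with x_0 = 1."""
    f = lambda x: x * x - 2          # f(x) = x^2 - 2
    df = lambda x: 2 * x             # f'(x) = 2x
    x = 1.0                          # x_0 = 1
    for _ in range(steps):
        x = x - f(x) / df(x)         # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x

print(sqrt2_newton())   # ~1.4142135623730951, i.e. sqrt(2) to double precision
```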

If you want to compute other square roots, consider
$$g(x) = x^2 - a,$$
whose Newton iteration simplifies to
$$ x_{n+1}=\frac{1}{2}\left(x_n+\frac{a}{x_n}\right). $$
A worked numeric example for $a = 2$ appears below.
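A sketch of this averaged form for a general $a > 0$ (again just illustrative; the starting guess and step count are my own choices):

```python
# The averaged Newton step x_{n+1} = (x_n + a/x_n)/2 for a general a > 0.

def my_sqrt(a, steps=8):
    x = 1.0                      # any positive starting guess works for a > 0
    for _ in range(steps):
        x = 0.5 * (x + a / x)    # x_{n+1} = (x_n + a/x_n)/2
    return x

print(my_sqrt(2))    # ~1.4142135623730951
print(my_sqrt(10))   # ~3.1622776601683795
```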

There's also the continued fraction expansion of an algebraic number: truncating it after finitely many terms gives good rational approximations.
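For $\sqrt{2}$ specifically, the expansion is periodic:
$$ \sqrt{2} = 1 + \cfrac{1}{2 + \cfrac{1}{2 + \cfrac{1}{2 + \cdots}}} $$
and truncating it gives the convergents $1, \frac{3}{2}, \frac{7}{5}, \frac{17}{12}, \frac{41}{29}, \dots$, each a better rational approximation than the last.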


As an example: $$ x_0 = 1 \\ x_1 = \frac{1}{2}\left(x_0 + \frac{2}{x_0}\right) =\frac{1}{2}\left( \large \textbf{1} + \frac{2}{ \large \mathbf{1}}\right) = \frac{3}{2}\\ x_2 = \frac{1}{2}\left(x_1 + \frac{2}{x_1}\right) = \frac{1}{2}\left( \large \mathbf{\frac{3}{2}} + \frac{2}{ \large \mathbf{\frac{3}{2}}}\right), \text{ etc. } $$
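To connect this with the "$15^{th}$ decimal place" part of the question, here is a rough check (my own, not part of the answer) of how fast the digits settle, using Python's `decimal` module for extra precision:

```python
from decimal import Decimal, getcontext

getcontext().prec = 30               # work with 30 significant digits

x = Decimal(1)                       # x_0 = 1
for n in range(1, 7):
    x = (x + 2 / x) / 2              # x_{n+1} = (x_n + 2/x_n)/2
    print(n, x)

# sqrt(2) = 1.41421356237309504880168872420...
# The number of correct decimals roughly doubles each step, so x_5 already
# pins down the 15th decimal place asked about in the question.
```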

Added

Since we are using Newton's method, and you may wonder why it converges to a root of $f(x)$, note the following:
$\textbf{Theorem}$: Suppose that the function $f$ has a zero at $\alpha$, i.e., $f(\alpha) = 0$.

If $f$ is continuously differentiable and its derivative is nonzero at $\alpha$, then there exists a neighborhood of $\alpha$ such that, for all starting values $x_0$ in that neighborhood, the sequence $\{x_n\}$ produced by Newton's method converges to $\alpha$.

So if $f$ has these properties and we choose our starting guess close enough to the root, Newton's method is guaranteed to converge to it.
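As a small illustration of why the starting guess matters (my own check, not from the answer): for $f(x) = x^2 - 2$ the iteration lands on whichever root lies on the same side as $x_0$, and $x_0 = 0$ fails outright because $f'(0) = 0$:

```python
# Newton's method for f(x) = x^2 - 2 from different starting points.

def newton_x2_minus_2(x0, steps=8):
    x = x0
    for _ in range(steps):
        x = x - (x * x - 2) / (2 * x)   # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x

print(newton_x2_minus_2(1.0))    # ->  1.4142135623730951  (= +sqrt(2))
print(newton_x2_minus_2(-1.0))   # -> -1.4142135623730951  (= -sqrt(2))
# newton_x2_minus_2(0.0) raises ZeroDivisionError, since f'(0) = 0.
```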
