Computational complexity of a modified Euclidean algorithm

Tags: asymptotics, computational-complexity, elementary-number-theory, euclidean-algorithm, gcd-and-lcm

The Euclidean algorithm computes the $\gcd$ of two integers with the recursive formula

$$\gcd(a,b)=\gcd(b,a\bmod b)$$

and takes at worst $\log_\varphi(\min(a,b))$ steps, where $\varphi$ is the golden ratio.

What if instead one used

$$\gcd(a,b)=\gcd(b,b-(a\bmod b))$$

whenever $a\bmod b$ was greater than $b/2$?
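
The replacement is legitimate because $\gcd(b,r)=\gcd(b,b-r)$ for $0<r<b$. For example,

$$\gcd(25,9):\quad 25\bmod 9=7>\tfrac92,\qquad\text{so }\gcd(25,9)=\gcd(9,9-7)=\gcd(9,2),$$

and the new second argument $2$ is at most half of $9$.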

It is easy to see that this takes at most $\log_2(\min(a,b))$ steps, since each step now leaves the second argument at most half of the first, but what is the exact worst-case constant coefficient?

The only sequence of pairs of integers for which I've worked out the exact behavior is consecutive Fibonacci numbers; in that case the modified algorithm runs twice as fast as the usual one, which is faster than the $\log_2$ bound.
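
For example, with $(F_8,F_7)=(21,13)$ the standard algorithm visits

$$(21,13)\to(13,8)\to(8,5)\to(5,3)\to(3,2)\to(2,1)\to(1,0),$$

six steps, while the modified rule gives

$$(21,13)\to(13,5)\to(5,2)\to(2,1)\to(1,0),$$

four steps: each step lowers the Fibonacci indices by $2$ instead of $1$, which is where the factor of two comes from.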

Here is a program displaying the values on each step of the standard Euclidean algorithm and the above modification.
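
A minimal Python sketch of such a program (standing in for the one linked in the original post) might be:

```python
def trace_standard(a, b):
    """Yield the successive (a, b) pairs of the standard Euclidean algorithm."""
    while b:
        a, b = b, a % b
        yield a, b

def trace_modified(a, b):
    """Yield the successive pairs when a remainder above b/2 is replaced by b - (a % b)."""
    while b:
        r = a % b
        if 2 * r > b:          # remainder is greater than b/2: use b - r instead
            r = b - r
        a, b = b, r
        yield a, b

if __name__ == "__main__":
    a, b = 21, 13              # consecutive Fibonacci numbers
    print("standard:", list(trace_standard(a, b)))
    print("modified:", list(trace_modified(a, b)))
```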

Best Answer

Since each step leaves the second argument at most half of the first, every quotient after the first step is at least $2$.

So instead of the Fibonacci numbers, the extremal inputs come from the sequence defined by $A_n=2A_{n-1}+A_{n-2}$ with $A_0=0$, $A_1=1$. The modified algorithm then needs $n$ steps to compute $\gcd(A_n+A_{n-1},A_n)$ (not $\gcd(A_n,A_{n-1})$, because the first quotient is allowed to be $1$).
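
Concretely, $A_0,A_1,A_2,\dots=0,1,2,5,12,29,70,\dots$ (the Pell numbers), and for $n=4$ the pair is $(A_4+A_3,A_4)=(17,12)$, on which the modified algorithm takes exactly four steps:

$$(17,12)\to(12,5)\to(5,2)\to(2,1)\to(1,0),$$

with first quotient $1$, every later quotient equal to $2$, and no remainder ever exceeding half its divisor.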

Just as one proves that consecutive Fibonacci numbers force the original algorithm to take the most steps, we can prove that the $A_n$ force this modified algorithm to take the most steps: we show by induction on $n$ that if $a>2b$ and computing $\gcd(a,b)$ needs $n$ steps, then $a\geq A_{n+1}$ and $b\geq A_n$.

The case $n=0$ is clearly true. If $n>0$, then by the induction hypothesis $b\geq A_n$, and there are two cases for $a\bmod b$: 1) $a\bmod b\geq b/2$: then $a\geq 2b+b/2\geq 2A_n+A_n/2\geq A_{n+1}$, since $A_n\geq 2A_{n-1}$. 2) $a\bmod b<b/2$: then $a-2b\geq a\bmod b\geq A_{n-1}$, so $a\geq 2A_n+A_{n-1}=A_{n+1}$.

Since $A_n\sim c(1+\sqrt2)^n$, the modified algorithm takes at most about $\log_{1+\sqrt2}(\min(a,b))$ steps in the worst case.
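
Numerically, $\log_2(1+\sqrt2)\approx1.27$, so the worst-case coefficient asked for is $1/\log_2(1+\sqrt2)\approx0.786$, i.e. at most about $0.786\log_2(\min(a,b))$ steps, versus about $1.44\log_2(\min(a,b))$ for the standard algorithm.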