[Math] Number of steps in Euclidean algorithm

elementary-number-theory

I have been reading Stark, H. (1978), An Introduction to Number Theory, and there is a section where the author proves that the Euclidean algorithm must eventually reach $0$.

If we are looking for $\gcd(a, b)$ with $a > b$, then the number of steps $k$ needed to reach $0$ always satisfies $k < a$. The worst case would be $k = a - 1$, with the remainder decreasing by exactly $1$ at every step (though that would rarely, if ever, happen). So that has to be the maximum number of steps needed for the algorithm to terminate.
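The bound $k < a$ is easy to check empirically. A minimal Python sketch (the helper name `euclid_steps` is mine) counts division steps and confirms the bound for all small pairs, while also showing the actual worst case is far below $a - 1$:

```python
def euclid_steps(a, b):
    """Count division steps of the Euclidean algorithm until the remainder is 0."""
    steps = 0
    while b != 0:
        a, b = b, a % b
        steps += 1
    return steps

# The crude bound k < a holds for every pair with a > b >= 1 ...
assert all(euclid_steps(a, b) < a for a in range(2, 200) for b in range(1, a))

# ... but the true worst case is much smaller than a - 1.
worst = max(euclid_steps(a, b) for a in range(2, 200) for b in range(1, a))
print(worst)  # far smaller than a = 199
```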

Now I have seen some other calculations with $a + b$ as the upper bound, and others involving Fibonacci numbers. Do these compute the maximum number of steps, or something else? I am fairly sure the maximum number of steps is less than $a$, but I may be wrong. Can anyone provide some insight into this?

EDIT

Confusion arises from the following answer, where the user says that the upper bound on the number of steps is $a + b$, which seems too large. The number of steps should be less than $a$.

Quote from the book:

it is clear that, sooner or later, some $r_k$ will equal $0$ and, in fact, since each $r_k$ is at least one smaller than the $r_k$ before it, we will come to a $r_k = 0$ with $k < a$.

Best Answer

The right answer is given by that Fibonacci number "stuff".

The decrease is slowest when every quotient is $1$, i.e. when the divisions are mere subtractions, and the process is longest when the $\gcd$ is $1$. If we backtrack from $a=1,b=1$, doing only additions, we get the Fibonacci sequence. It is in fact possible to show that if $\min(a,b)$ does not exceed $F_m$ at a given step, it cannot exceed $F_{m-1}$ at the next.
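The backtracking argument can be checked directly: on a pair of consecutive Fibonacci numbers every quotient is $1$, so each division strips off exactly one Fibonacci term and the step count is as large as possible for inputs of that size. A short Python sketch (the helper name `euclid_steps` is mine):

```python
def euclid_steps(a, b):
    """Count division steps of the Euclidean algorithm until the remainder is 0."""
    steps = 0
    while b != 0:
        a, b = b, a % b
        steps += 1
    return steps

# Build the Fibonacci sequence 1, 1, 2, 3, 5, 8, ...
fib = [1, 1]
while fib[-1] < 1000:
    fib.append(fib[-1] + fib[-2])

# For consecutive Fibonacci inputs, F_{m+1} mod F_m = F_{m-1}: every
# quotient is 1, and the pair (fib[m], fib[m-1]) takes exactly m - 1 steps.
for m in range(2, len(fib)):
    assert euclid_steps(fib[m], fib[m - 1]) == m - 1

print(fib[-1], fib[-2], euclid_steps(fib[-1], fib[-2]))
```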

As $F_m\sim\phi^m/\sqrt5$, the growth is exponential, and conversely, the maximum number of steps is logarithmic in $\min(a,b)$.
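One classical packaging of this logarithmic bound is Lamé's theorem: the number of division steps is at most five times the number of decimal digits of the smaller argument. A quick spot check in Python (the helper name `euclid_steps` and the ranges tested are my choices):

```python
def euclid_steps(a, b):
    """Count division steps of the Euclidean algorithm until the remainder is 0."""
    steps = 0
    while b != 0:
        a, b = b, a % b
        steps += 1
    return steps

# Lamé's theorem: steps <= 5 * (number of decimal digits of min(a, b)),
# i.e. a logarithmic bound. Spot-check it over a range of pairs.
ok = all(
    euclid_steps(a, b) <= 5 * len(str(b))
    for b in range(1, 300)
    for a in range(b + 1, 3 * b + 2)
)
print(ok)  # True
```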