[Math] What will the recursion tree of Fibonacci series look like

fibonacci-numbers, recursive-algorithms

I am watching the Introduction to Algorithms video, and the professor talks about finding a Fibonacci number in $\Theta(n)$ time at around the 23:30 mark in the video.

  1. How is it $\Theta(n)$ time?

  2. Which case of the master theorem does it fall under?

  3. What would the recursion tree look like if we try to use a divide and conquer approach?

P.S.: I already know what the $\Theta(n)$ notation means, so please write your answer accordingly.

Best Answer

What he's doing is using a simple example of what's known as dynamic programming, where one computes a function $f(n)$ by working up from the bottom to reach the desired result. Specifically, to compute $fib(n)$, the $n$-th Fibonacci number, he's doing this:

fib(n)=
   if (n <= 1)
      return n
   else
      old = 0, prior = 1, next
      for i = 2 to n
         next = old + prior
         old = prior
         prior = next
      return next
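The pseudocode above translates directly into Python; this is just a sketch of the same bottom-up loop, not the professor's exact code:

```python
def fib(n):
    # Base cases: fib(0) = 0, fib(1) = 1.
    if n <= 1:
        return n
    old, prior = 0, 1  # fib(0) and fib(1)
    for i in range(2, n + 1):
        # At this point: old = fib(i-2), prior = fib(i-1).
        nxt = old + prior      # compute fib(i)
        old, prior = prior, nxt  # shift the window forward
    return prior  # prior now holds fib(n)
```

Note that only the last two Fibonacci numbers are kept at any time, so the algorithm uses $\Theta(1)$ extra space in addition to its $\Theta(n)$ additions.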

To see this in action, we compute $fib(4)$, which we know is $3$:

i  old  prior  next
2   0     1      1    (compute next = old + prior)
2   1     1      1    (shift everything to the left)
3   1     1      2    (compute next again)
3   1     2      2    (shift again)
4   1     2      3    (and so on...)
4   2     3      3

How long does this algorithm take? No need for the Master Theorem here: we have an algorithm that consists, essentially, of a loop, so assuming you can add in constant time, this will have running time $T(n) = \Theta(n)$.

(Actually, this won't work at all on real machines, since $fib(n)$ grows so fast that the numbers will quickly exceed the size of an integer. For example, $fib(2500)$ is 523 decimal digits long, so we'd need to implement some big integer package to do the addition and assignment statements, increasing the run time of this algorithm.)
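In a language with built-in arbitrary-precision integers, such as Python, you can check the digit count directly; the loop is the same as above, only the additions get more expensive as the numbers grow:

```python
def fib(n):
    # Bottom-up Fibonacci; Python's ints grow as needed.
    old, prior = 0, 1
    for _ in range(n - 1):
        old, prior = prior, old + prior
    return prior if n >= 1 else 0

# Count the decimal digits of fib(2500).
print(len(str(fib(2500))))  # → 523
```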


The recursion tree for the original (terrible) recursive algorithm looks like this, when computing $fib(4)$:

    fib(4)
    ├── fib(3)
    │   ├── fib(2)
    │   │   ├── fib(1)
    │   │   └── fib(0)
    │   └── fib(1)
    └── fib(2)
        ├── fib(1)
        └── fib(0)

For example, the first call, to $fib(4)$, requires two recursive calls, to $fib(3)\text{ and }fib(2)$. Notice how inefficient this is: just to compute $fib(4)$ we needed nine function calls. In fact, this is a classic example of when not to use recursion, especially since there are vastly more efficient ways to do this problem.
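For comparison, here is a sketch of that naive recursive version in Python, instrumented with a call counter (the counter is my addition, purely to verify the nine calls):

```python
calls = 0  # counts how many times fib_naive is invoked

def fib_naive(n):
    global calls
    calls += 1
    if n <= 1:
        return n
    # Each call spawns two more, giving exponential growth.
    return fib_naive(n - 1) + fib_naive(n - 2)

print(fib_naive(4), calls)  # → 3 9
```

The nine calls correspond exactly to the nine nodes in the recursion tree above, and the count grows roughly like $\phi^n$, which is why the iterative version is so much faster.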
