I am watching the Introduction to Algorithms video, and the professor talks about finding a Fibonacci number in $\Theta(n)$ time at around the 23:30 mark in the video.
- How is it $\Theta(n)$ time?
- Which case of the master theorem does it fall under?
- What would the recursion tree look like if we try to use a divide-and-conquer approach?
P.S.: I already know what the $\Theta(n)$ notation means, so please write your answer accordingly.
Best Answer
What he's doing is using a simple example of what's known as dynamic programming, where one computes a function $f(n)$ by working up from the bottom to reach the desired result. Specifically, to compute $fib(n)$, the $n$-th Fibonacci number, he starts from the base cases and builds each Fibonacci number from the two before it, rather than recursing downward.
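A minimal sketch of this bottom-up idea in Python (the function name and the indexing convention $fib(0) = 0$, $fib(1) = 1$ are my own choices, not necessarily the professor's):

```python
def fib(n):
    """Bottom-up Fibonacci: fib(0) = 0, fib(1) = 1."""
    if n < 2:
        return n
    prev, curr = 0, 1              # fib(0), fib(1)
    for _ in range(2, n + 1):      # one constant-time addition per step
        prev, curr = curr, prev + curr
    return curr
```

Each Fibonacci number is computed exactly once, so the loop runs $n - 1$ times for $n \ge 2$.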
To see this in action, we compute $fib(4)$, which we know is $3$: starting from $fib(1) = fib(2) = 1$, we get $fib(3) = 1 + 1 = 2$ and then $fib(4) = 2 + 1 = 3$.
How long does this algorithm take? There's no need for the Master Theorem here: the algorithm is essentially a single loop, so assuming you can add in constant time, the running time is $T(n) = \Theta(n)$.
(Actually, this won't work as stated on real machines with fixed-size integers, since $fib(n)$ grows so fast that the values quickly overflow. For example, $fib(2500)$ is 523 decimal digits long, so we'd need a big-integer package for the additions and assignments, which makes each step cost more than constant time and increases the running time of this algorithm.)
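In a language with built-in arbitrary-precision integers, such as Python, the same loop keeps working, and we can check the digit count directly (a quick sketch; the convention $fib(0) = 0$ is mine):

```python
# Bottom-up loop as before; Python ints are arbitrary precision,
# so the values never overflow -- each addition just gets slower.
prev, curr = 0, 1
for _ in range(2500):
    prev, curr = curr, prev + curr

print(len(str(prev)))   # number of decimal digits in fib(2500): 523
```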
The recursion tree for the original (terrible) recursive algorithm, when computing $fib(4)$, branches into two subtrees at every non-base call:
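Reconstructing it with base cases $fib(0)$ and $fib(1)$ (which matches the nine calls counted below), the tree looks like:

```
                  fib(4)
                /        \
           fib(3)         fib(2)
          /      \        /     \
      fib(2)   fib(1)  fib(1)  fib(0)
      /     \
  fib(1)  fib(0)
```

Note that $fib(2)$ is computed twice and $fib(1)$ three times; this duplication is exactly what the bottom-up version avoids.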
For example, the top-level call, to $fib(4)$, makes two recursive calls, to $fib(3)$ and $fib(2)$, and each of those branches again. Notice how inefficient this is: just to compute $fib(4)$ we needed nine function calls. In fact, this is a classic example of when not to use recursion, especially since there are vastly more efficient ways to do this problem.
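A sketch of that naive recursive version, instrumented to count calls (the counter argument is my addition, just to make the inefficiency visible):

```python
def fib_naive(n, counter):
    """Naive recursion: recomputes the same subproblems over and over."""
    counter[0] += 1        # count every call, including base cases
    if n < 2:              # base cases: fib(0) = 0, fib(1) = 1
        return n
    return fib_naive(n - 1, counter) + fib_naive(n - 2, counter)

calls = [0]
fib_naive(4, calls)        # calls[0] is now 9
```

The call count grows roughly like $fib(n)$ itself, i.e. exponentially in $n$, which is why the bottom-up loop is so much better.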