[Math] Computer Algorithm Running Time Calculation

algorithms, analysis-of-algorithms

I have the following pseudocode and need to figure out its approximate running time. Could anyone help? Please explain each step and the reasoning behind it. Thanks in advance.

int m = n;

while (m > 0)
    for k = 1 to m
        //dowork  ---- CONSTANT
    m = floor(m/2)
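
Here is the same loop as a runnable Python sketch, just to pin down the structure (the halving step belongs to the while loop, not the for loop); the work counter is only my stand-in for the constant-time body:

    def halving_loop(n):
        work = 0
        m = n
        while m > 0:
            for k in range(1, m + 1):
                work += 1    # stands in for the constant-time //dowork
            m = m // 2       # floor(m/2), once per pass of the while loop
        return work

    print(halving_loop(1000))  # total number of //dowork executions for n = 1000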

Here is another algorithm I would appreciate a breakdown of. How would I compute its running time?

NB. This is taken from the Wikipedia write-up on merge sort, but since that did not help, I was wondering whether someone here could break it down so I can see where the O(n log n) comes from, for both the best case and the worst case.

http://en.wikipedia.org/wiki/Merge_sort

A

 function merge_sort(m)
        if length(m) ≤ 1
            return m
        var list left, right, result
        var integer middle = length(m) / 2
        for each x in m up to middle
             add x to left
        for each x in m after middle
             add x to right
        left = merge_sort(left)
        right = merge_sort(right)
        result = merge(left, right)
        return result

B

 function merge(left,right)
        var list result
        while length(left) > 0 or length(right) > 0
            if length(left) > 0 and length(right) > 0
                if first(left) ≤ first(right)
                    append first(left) to result
                    left = rest(left)
                else
                    append first(right) to result
                    right = rest(right)
            else if length(left) > 0
                append first(left) to result
                left = rest(left)
            else if length(right) > 0
                append first(right) to result
                right = rest(right)
        end while
        return result
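
For reference, here is my own attempt at a direct Python translation of the pseudocode above (I've used list slicing for the two copy loops and index variables in place of rest(...), so it is only a rough sketch):

    def merge_sort(m):
        # A list of length 0 or 1 is already sorted.
        if len(m) <= 1:
            return m
        middle = len(m) // 2
        left = merge_sort(m[:middle])    # sort the first half
        right = merge_sort(m[middle:])   # sort the second half
        return merge(left, right)        # combine the two sorted halves

    def merge(left, right):
        result = []
        i = j = 0
        # While both lists are non-empty, take the smaller front element.
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        # One list is exhausted; append whatever remains of the other.
        result.extend(left[i:])
        result.extend(right[j:])
        return result

    print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]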

Best Answer

I don't know if another answer can be more illuminating to you, but let me try saying it a bit differently anyway.

First, consider the inner loop:

    for k = 1 to m
        //dowork  ---- CONSTANT

Because it's doing a constant amount of work $m$ times (for k=1 to m), this takes approximately time $m$, whatever the value of $m$ is. (To be precise, it takes $\Theta(m)$ time, which means that there are constants $c_1$ and $c_2$ such that the time it takes is between $c_1m$ and $c_2m$, but when finding the "approximate running time", i.e., the asymptotic complexity, we usually ignore constant factors.)

Now the outer loop looks like

m = n
while (m > 0)
      //Do 'm' amount of work
      m = floor(m/2)

where I've replaced the inner loop with what we know about its time. So the first time the loop is run, $m=n$ and it takes time $n$. By the second time the loop is run, $m$ is halved, so $m = n/2$ and it takes time $n/2$ (I'm ignoring writing $\lfloor n/2 \rfloor$, because that's within a constant factor of $n/2$.) The third time it takes $n/4$, etc. So the total time taken by the code is: $$\begin{align}&n + \frac{n}{2} + \frac{n}{4} + \frac{n}{8} + \dots \\ &= n\left(1 + \frac12 + \frac14 + \frac18 + \dots\right)\end{align}$$ until the term becomes less than $1$ (so $m$ would have become $0$ and the code would have halted).
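
For instance, with $n = 16$ the successive values of $m$ are $16, 8, 4, 2, 1$, and the total amount of work is $$16 + 8 + 4 + 2 + 1 = 31 < 2 \cdot 16.$$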

Now, the sum $\left(1 + \frac12 + \frac14 + \frac18 + \dots\right)$ is at most $2$, so the above sum for the running time is at most $2n$. Ignoring constant factors, we can say that the code takes approximately time $n$, which is shorthand for saying that the time it takes is linear in $n$. Or, if we did the whole thing formally, we would have said it takes time $\Theta(n)$.

(As it happens, we can also count the number of terms in the sum: each term has the form $\frac{n}{2^k}$, and the last one satisfies $2^k \le n < 2^{k+1}$, which means $k = \lfloor \lg n \rfloor$, so there are $\lfloor \lg n \rfloor + 1$ terms; but all this is irrelevant to the actual problem here.)
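
If you do want to carry the finite sum out explicitly, the closed form of the geometric series gives $$\sum_{i=0}^{k} \frac{n}{2^i} = n\left(2 - \frac{1}{2^k}\right) < 2n, \qquad k = \lfloor \lg n \rfloor,$$ which recovers the same bound of $2n$ without appealing to the infinite series.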
