#include <iostream>   // std::cout, std::endl
#include <utility>    // std::swap

// Prints all permutations of a[0..right-1] when called as doSomething(a, 0, n);
// positions a[0..left-1] are held fixed by the recursion.
void doSomething(int *a, int left, int right) {
    if (left == right) {
        for (int j = 0; j < right; ++j)
            std::cout << a[j];
        std::cout << std::endl;
        return;
    }
    for (int i = left; i < right; ++i) {
        std::swap(a[left], a[i]);
        doSomething(a, left + 1, right);
        std::swap(a[left], a[i]);
    }
}
Assume the base case costs T(1) = right, that swap runs in O(1) time, and that n = right - left + 1 for T(n).
Can someone please walk me through the steps one would take to find the recurrence relation for this algorithm (or any algorithm, for that matter)? In particular, I'm confused by the terminology of left and right, and by how n = right-left+1 and T(1) = right affect what a final solution should look like.
Best Answer
A general method for analyzing the running time of any algorithm is to walk through the algorithm step by step, counting the number of statements executed and expressing this count as a function of the "size" of the input.
For example, suppose we defined a predicate, isSorted, which would take as input an array $a$ and its size, $n$, and would return true if and only if the array was sorted in increasing order. It might look like this:
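The original listing did not survive here, so the following is a minimal sketch of how such a predicate might look, with comments tallying the worst-case step counts used below:

```cpp
// Sketch of isSorted: true iff a[0..n-1] is in increasing order.
// The comments count statements executed in the worst case (a sorted array).
bool isSorted(int *a, int n) {
    int i = 0;                 // 1 step: initialization
    while (i < n - 1) {        // at most n-1 iterations of the loop test...
        if (a[i] > a[i + 1])   // ...each doing one element comparison
            return false;
        ++i;
    }
    return true;               // 1 step: the final return
}
```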
Clearly, the size of the input here will simply be $n$, the size of the array. How many steps will be performed in the worst case, for input $n$?
So, in the worst case, we'll have done $1+(n-1)+(n-1)+1$ computations, for a total run time $T(n)\le 1+(n-1)+(n-1)+1=2n$ and so we have the timing function $T(n)=O(n)$.
For non-recursive algorithms, the process is, in theory, simple: walk through the code, count how many times each statement executes (multiplying by the iteration counts of any enclosing loops), and express the run time of your algorithm as a function of the total number of statements executed.
For recursive algorithms, you do the same thing, only this time you add the time taken by each recursive call, expressed as a function of the time it takes on its input.
For example, let's rewrite isSorted as a recursive algorithm:
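Again the original listing is missing; a sketch of one natural recursive version, annotated to match the counting in the next paragraph:

```cpp
// Recursive sketch of isSorted: check the last adjacent pair, then
// recurse on the first n-1 elements.
bool isSorted(int *a, int n) {
    if (n <= 1)                 // 1 step: an array of length 0 or 1 is sorted
        return true;
    if (a[n - 2] > a[n - 1])    // 1 step: compare the last adjacent pair
        return false;
    return isSorted(a, n - 1);  // T(n-1): recurse on the prefix
}
```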
In this case we still walk through the algorithm, counting: 1 step for the first if, plus 1 step for the second if, plus the time isSorted will take on an input of size $n-1$, which will be $T(n-1)$, giving the recurrence relation $$ T(n) \le 1 + 1 + T(n-1) = T(n-1) + O(1), $$ which has solution $T(n)=O(n)$, just as with the non-recursive version, as it happens.
With this done, let's analyze your algorithm. We'll compute the run time as a function of right - left. That will be the size, $n$, so we'll try to find a recurrence relation that expresses the run time, $T(n)$, as a function of $n$.
If $n>0$, the function will take 1 step for the if test and then pass to the second loop. This loop will iterate $n$ times, and each iteration does two swaps and one recursive call on a subproblem of size $n-1$: the call doSomething(a, left+1, right) fixes one more position, so its size is right - (left+1) = n-1. In other words, we'll have $$ T(n) \le n\,T(n-1) + kn $$ where the last term takes account of the loop overhead and the two swaps in each iteration. We have reached our goal: the timing function for the algorithm satisfies the recurrence $$ T(n) = n\,T(n-1) + O(n) $$ and of course, when $n=0$, i.e., when left = right, we just print the contents of the array, which we can do in a (possibly large) constant amount of time.
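A recurrence like this can be sanity-checked empirically. Below is a hypothetical instrumented copy of your function (the name countCalls and the global counter are mine, not from the original) that increments a counter instead of printing, so the growth of the call count can be compared with the recurrence:

```cpp
#include <utility>  // std::swap

// Counts how many times the function is entered, instead of printing.
long calls = 0;

void countCalls(int *a, int left, int right) {
    ++calls;                              // one invocation recorded
    if (left == right)
        return;                           // base case: the original prints here
    for (int i = left; i < right; ++i) {
        std::swap(a[left], a[i]);
        countCalls(a, left + 1, right);   // subproblem of size n-1
        std::swap(a[left], a[i]);         // undo, restoring the array
    }
}
```

The count $C(n)$ satisfies $C(0)=1$ and $C(n)=1+n\,C(n-1)$; for $n=3$ that gives $1+3\,(1+2\,(1+1\cdot 1))=16$ calls, and the paired swaps leave the array unchanged afterwards.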
Any questions? Just ask.