How do you find a corresponding recurrence relation for some random algorithm?

A general method for analyzing the running time of any algorithm is to walk through it step by step, counting the number of statements executed, and then to express this count as a function of the "size" of the input.

For example, suppose we defined a predicate, isSorted, which takes as input an array $a$ and its size, $n$, and returns true if and only if the array is sorted in increasing order. It might look like this:

bool isSorted(int *a, int n) {
   if (n == 1)
      return true;       // a 1-element array is always sorted
   for (int i = 0; i < n-1; i++) {
      if (a[i] > a[i+1]) // found two adjacent elements out of order
         return false;
   }
   return true;         // everything's in sorted order
}

Clearly, the size of the input here will simply be $n$, the size of the array. How many steps will be performed in the worst case, for input $n$?

  • The first if statement counts as 1 step
  • The for loop will execute $n-1$ times in the worst case (assuming the internal test doesn't kick us out), for a total time of $n-1$ for the loop test and the increment of the index.
  • Inside the loop, there's another if statement which will be executed once per iteration for a total of $n-1$ time, at worst.
  • The last return will be executed once.

So, in the worst case, we'll have done $1+(n-1)+(n-1)+1$ computations, for a total run time $T(n)\le 1+(n-1)+(n-1)+1=2n$ and so we have the timing function $T(n)=O(n)$.

For non-recursive algorithms, the process is, in theory, simple:

  1. Find a parameter, $n$, that expresses the "size" of the input, then walk through the algorithm assuming that
  2. every statement executed takes time 1,
  3. every if statement takes its longest branch, and
  4. every loop iterates the maximum number of times allowable.

Then express the run time of your algorithm as a function of the number of statements executed, using (2) through (4) above.


For recursive algorithms, you do the same thing, only this time you add the time taken by each recursive call, expressed as a function of the time it takes on its input.

For example, let's rewrite isSorted as a recursive algorithm:

bool isSorted(int *a, int n) {
   if (n <= 1)                  // a 0- or 1-element array is always sorted
      return true;
   if (a[n-2] > a[n-1])         // are the last two elements out of order?
      return false;
   else
      return isSorted(a, n-1); // is the initial part of the array sorted?
}

In this case we still walk through the algorithm, counting: 1 step for the first if plus 1 step for the second if, plus the time isSorted will take on an input of size $n-1$, which will be $T(n-1)$, giving the recurrence relation $$ T(n) \le 1 + 1 + T(n-1) = T(n-1) + O(1), $$ which has solution $T(n)=O(n)$, just as with the non-recursive version, as it happens.
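If it's not obvious why that recurrence solves to $O(n)$, unrolling it makes the pattern visible (here $c$ stands for the constant hidden in the $O(1)$ term): $$ T(n) \le T(n-1) + c \le T(n-2) + 2c \le \cdots \le T(1) + (n-1)c = O(n). $$ Each level of recursion contributes a constant, and there are $n-1$ levels.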


With this done, let's analyze your algorithm. We'll compute the run time as a function of right - left; that will be the size, $n$, so we'll try to find a recurrence relation that expresses the run time, $T(n)$, as a function of $n$.

If $n>0$, the function will take 1 step for the if test and then pass to the second loop. This loop will iterate $n$ times, doing two swaps and calling itself recursively with arguments that we'd express as $n-1, n-2, \dots, 1, 0$. In other words, we'll have $$ T(n) \le T(n-1) + T(n-2) + \cdots + T(1) + T(0) + kn $$ where the last term accounts for the loop overhead and the two swaps. We have reached our goal: the timing function for the algorithm satisfies the recurrence $$ T(n) = T(n-1) + \cdots + T(0) + O(n) $$ and of course, when $n=0$, i.e., when left = right, we just print the contents of the array, which we can do in a (possibly large) constant amount of time.
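As a sanity check on where that recurrence leads (finding the relation was the goal; solving it is a separate step): a standard trick for recurrences with a long sum on the right is to subtract $T(n-1)$ from $T(n)$, which collapses the sum: $$ T(n) - T(n-1) = T(n-1) + O(1) \quad\Longrightarrow\quad T(n) = 2T(n-1) + O(1), $$ which solves to $T(n) = O(2^n)$, i.e., the run time roughly doubles each time $n$ grows by 1.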

Any questions? Just ask.