Space complexity of recursive function

Here's how I think about it:

  • The temptation is to say that the space complexity will also be O(2^n), because after all, memory has to be allocated for each of the O(2^n) recursive calls, right? (not right)
  • In actuality the calls never all live on the stack at once: the result of each call is computed and collapsed into its parent before the next call starts, so at any moment the stack holds only the single chain of calls from the current one back up to the top, at worst [f(1), f(2), f(3) ... f(n)] when the base case is reached, in other words just O(n) memory (see the sketch right after this list)
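
Here is a minimal sketch of that in Python. `f` is a hypothetical stand-in with the same T(n) = 2T(n-1) shape (the returned values don't matter, only the call pattern does); it records the chain of calls that is live on the stack and shows it never grows past n frames:

    active = []          # calls currently on the program stack, outermost first
    longest_chain = []   # the longest that chain ever got

    def f(n):
        global longest_chain
        active.append('f(%d)' % n)
        if len(active) > len(longest_chain):
            longest_chain = list(active)                # remember the deepest chain seen so far
        result = 1 if n <= 1 else f(n - 1) + f(n - 1)   # two calls, made one after the other
        active.pop()                                    # this frame is freed before the caller moves on
        return result

    f(5)
    print(longest_chain)   # ['f(5)', 'f(4)', 'f(3)', 'f(2)', 'f(1)']  ->  only n frames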

A useful way to approach these types of problems is by thinking of the recursion tree. The two features of a recursive function to identify are:

  1. The tree depth (how many calls end up nested inside one another before the base case is reached), which determines the space
  2. The total number of nodes in the tree (how many recursive calls will be made in all), which determines the time

Our recurrence relation for this case is T(n) = 2T(n-1). As you correctly noted, the time complexity is O(2^n), but let's look at it in relation to our recursion tree.
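
As a quick sanity check before drawing the tree, the same bound falls out of expanding the recurrence one level at a time (treating the base case T(1) as a constant):

    T(n) = 2T(n-1)
         = 2 * 2T(n-2) = 2^2 T(n-2)
         = 2^3 T(n-3)
         ...
         = 2^(n-1) T(1)  =>  O(2^n)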

      C
     / \         
    /   \      
T(n-1)  T(n-1)

            C
       ____/ \____
      /           \
    C              C   
   /  \           /  \
  /    \         /    \
T(n-2) T(n-2) T(n-2)  T(n-2)

This pattern will continue until we hit the base case, at which point the bottom level of the tree consists entirely of T(1) calls.


With each successive tree level, our n reduces by 1. Thus our tree will have a depth of n before it reaches the base case. Since each node has 2 branches and we have n total levels, our total number of nodes is roughly 2^n, which makes our time complexity O(2^n).
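
Counting level by level makes that total explicit: level k of the tree holds 2^k nodes, so across the n levels we get

    1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1 = O(2^n) total calls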

Our memory complexity is determined by how many function calls sit on the program stack at the same time, not by how many are made in total: each call pushes a frame when it starts and pops it when it returns, and the two recursive calls at any node run one after the other, never side by side. To generalize, a recursive function's memory complexity is O(recursion depth). Since the stack never holds more than one root-to-leaf path of our tree, and the tree depth is n, the memory complexity is O(n). The sketch below shows both counts side by side.
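
Here is a minimal sketch (the same hypothetical f as above, shaped after T(n) = 2T(n-1)) that makes both numbers visible at once: the 2^n - 1 total calls (time) versus the n-deep stack (space):

    total_calls = 0   # every call ever made
    depth = 0         # frames currently on the stack
    max_depth = 0     # the most frames that were ever on the stack at once

    def f(n):
        global total_calls, depth, max_depth
        total_calls += 1
        depth += 1                    # a frame is pushed on the way in...
        max_depth = max(max_depth, depth)
        result = 1 if n <= 1 else f(n - 1) + f(n - 1)
        depth -= 1                    # ...and popped on the way out
        return result

    f(10)
    print(total_calls)   # 1023 = 2^10 - 1  ->  O(2^n) time
    print(max_depth)     # 10               ->  O(n) space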