How does DP help if there are no overlapping subproblems? [0/1 knapsack]
The running time of a recursive solution with memoization can approach 2^n if none of the weights add up to any of the others and the capacity is large enough to include the total weight. A dynamic programming solution would be O(c*n), though, which is polynomial in terms of the capacity rather than the number of items.
In your example, n = 3 and c = 10, so 2^n = 8 and c*n = 30. Here c*n is larger than 2^n, so the DP solution isn't helping. If you instead had a larger number of items and a small capacity, then the DP solution would be better. Those are the kinds of constraints a DP solution works well on.
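For concreteness, here's a minimal sketch of that O(c*n) table solution in Python; the weights and values below are made up for illustration, not taken from the question:

```python
def knapsack_dp(weights, values, capacity):
    """Bottom-up 0/1 knapsack. Fills an (n+1) x (capacity+1) table,
    so the work is O(n * capacity) whether or not subproblems overlap."""
    n = len(weights)
    # best[i][c] = max value achievable with the first i items and capacity c
    best = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            best[i][c] = best[i - 1][c]          # skip item i-1
            if weights[i - 1] <= c:              # or take it, if it fits
                take = best[i - 1][c - weights[i - 1]] + values[i - 1]
                best[i][c] = max(best[i][c], take)
    return best[n][capacity]

# Hypothetical instance with n = 3 and c = 10: the table has 4 * 11 = 44
# cells, a cost that stays the same no matter how the weights interact.
print(knapsack_dp([7, 2, 4], [9, 3, 5], 10))  # -> 12 (items of weight 7 and 2)
```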
DP doesn't help at all on your specific problem instance. But in general, i.e. over all possible input instances, it never solves more subproblems than the pure recursion, and on many instances it solves far fewer. That's why DP is good.
All your DP formulation guarantees is that it can solve any instance of the problem by solving at most n(c+1) subproblems, and indeed it does so for your example: here n = 3 and c = 10, and it solves the problem by solving 14 <= 33 subproblems (including the original problem).

Similarly, the purely recursive solution guarantees that it can solve any instance of the problem by solving at most 2^n subproblems.
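Those two guarantees are easy to observe empirically. Here's a Python sketch (with made-up weights and values, since the question's exact numbers aren't reproduced here) that runs the same top-down recursion with and without a memo table and counts the subproblems each version solves; on an instance with no overlap, the two counts coincide:

```python
def knapsack_count(weights, values, capacity, memoize):
    """Top-down 0/1 knapsack; returns (best value, subproblems solved)."""
    cache = {}
    solved = 0

    def solve(i, c):
        nonlocal solved
        if memoize and (i, c) in cache:
            return cache[(i, c)]      # cache hit: no new subproblem solved
        solved += 1
        if i == len(weights):
            result = 0                # no items left
        else:
            result = solve(i + 1, c)  # skip item i
            if weights[i] <= c:       # take item i if it fits
                result = max(result, values[i] + solve(i + 1, c - weights[i]))
        cache[(i, c)] = result
        return result

    return solve(0, capacity), solved

# Hypothetical instance with n = 3, c = 10, and no overlapping subproblems:
# both calls report the same count, and it stays within the DP's bound.
print(knapsack_count([7, 2, 4], [9, 3, 5], 10, memoize=True))
print(knapsack_count([7, 2, 4], [9, 3, 5], 10, memoize=False))
```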
You seem to be thinking that the DP algorithm should solve every problem instance faster than the recursive algorithm, but this is not true, and no one is making this claim. There exist instances (like yours) with no overlapping subproblems, and for these instances the DP solves the problem using exactly the same number of subproblems as the recursive solution. This says nothing about the behaviour of the two algorithms in general. In general, the DP solves every problem using at most as many subproblems as the recursive solution, and sometimes far fewer -- since there do exist problem instances for which the recursive algorithm needs to solve the same subproblem more than once.
In short: The DP is never worse than the recursion, and is better than the recursion in the worst case. That does not imply that it is better on every instance.
The 0/1 knapsack problem has a pseudo-polynomial-time solution. The running time of that solution is O(nW), where n is the number of items to choose from and W is the weight that the knapsack can hold. The running time is described as pseudo-polynomial because W is not a function of n.
Or is it? Given a list of n items (by weight and value), there exists a maximum weight for one item; call it k. (That maximum weight can be determined in O(n) time.) If W is greater than or equal to kn, then the problem is trivial: all the items fit in the knapsack. Therefore, we only need to consider the cases where W < kn. So for the purposes of complexity analysis, W can be expressed as a function of n.
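That reduction is cheap to implement. Here's a sketch, where knapsack_dp stands for any standard O(nW) table implementation (such as the one sketched in the first answer); the triviality test itself is only a few lines:

```python
def knapsack_with_shortcut(weights, values, capacity):
    """If W >= k*n (with k the heaviest item), the total weight is at most
    k*n, so everything fits; otherwise W < k*n bounds the O(nW) DP."""
    n = len(weights)
    k = max(weights)            # k is found in O(n) time
    if capacity >= k * n:
        return sum(values)      # trivial case: take every item
    # Non-trivial case: fall back to the usual O(nW) dynamic program,
    # which is now bounded by O(k * n^2) because W < k * n.
    return knapsack_dp(weights, values, capacity)
```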
Given that nW <= kn^2, the running time of the algorithm is O(kn^2).
Now, the pedants in the audience will argue (correctly) that this is still pseudo-polynomial time, since there's no defined relationship between k and n. But any concrete statement of the problem (which lists items by weight and value) must necessarily have an explicit constant value for k in the problem statement itself.
Enough with the theory. How do we apply this to the example in the question?
n is clearly 3, and k is clearly 7.

Hence, the predicted DP running time is kn^2 = 7 × 3 × 3 = 63 steps. But the predicted exponential running time (without DP) is 2^n = 2^3 = 8 steps.
And there's the problem with your example. Big-O analysis describes the asymptotic behavior of algorithms for large values of n. It tells you little to nothing about the performance for small values of n. If the value of n is small enough that 2^n < kn^2, then there's no expectation that DP will improve the running time of the algorithm, or have any effect at all.
Your challenge is to find an example where 2^n > kn^2 and you still don't have overlapping subproblems.
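As a starting point, a quick Python sketch that finds the smallest n at which the exponential count overtakes kn^2, fixing k = 7 as in the example:

```python
# Smallest n where 2^n exceeds k * n^2, i.e. where the exponential
# recursion is predicted to cost more than the DP, for k = 7.
k = 7
n = 1
while 2 ** n <= k * n * n:
    n += 1
print(n)  # -> 10, since 2^10 = 1024 > 7 * 10^2 = 700
```

Constructing an instance that large whose subproblems still never overlap is the interesting part of the challenge.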