[Math] Why doesn’t the greedy algorithm work for this set of coins in the change-making problem

algorithms

The well-known change-making problem asks:

how can a given amount of money be made with the least number of coins of given denominations

For some sets of denominations, a greedy algorithm (repeatedly take the highest-value coin that still fits) yields an optimal solution.

My question is: why does the greedy algorithm yield an optimal solution for one set of coins (e.g. 25, 10, 5, 2, 1) but not for another (e.g. 10, 7, 2, 1)? For example, the second set fails for change of 15: greedy yields (10, 2, 2, 1), while the optimal solution is (7, 7, 1). What rule must a set of coins satisfy so that the greedy algorithm always produces an optimal solution?
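To see the failure concretely, here is a minimal Python sketch comparing the greedy choice with a dynamic-programming optimum; the function names `greedy_change` and `optimal_change` are mine, not from any reference:

```python
def greedy_change(amount, coins):
    """Repeatedly take the largest denomination that still fits."""
    used = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            used.append(c)
    return used  # terminates with exact change as long as a 1-coin exists

def optimal_change(amount, coins):
    """Classic DP: best[v] holds a fewest-coin representation of v."""
    best = [[]] + [None] * amount
    for v in range(1, amount + 1):
        for c in coins:
            if c <= v and best[v - c] is not None:
                candidate = best[v - c] + [c]
                if best[v] is None or len(candidate) < len(best[v]):
                    best[v] = candidate
    return sorted(best[amount], reverse=True)

print(greedy_change(15, [10, 7, 2, 1]))   # [10, 2, 2, 1] -- 4 coins
print(optimal_change(15, [10, 7, 2, 1]))  # [7, 7, 1]     -- 3 coins
```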

Best Answer

I thought a Google search would find an answer. To my surprise, this turns out to be a pretty deep question.

From https://cs.uwaterloo.ca/~shallit/Papers/change2.pdf

  1. Suppose we are given $N$ and a system of denominations. How easy is it to determine whether the greedy representation of $N$ is actually optimal? Kozen and Zaks [4] have shown that this problem is co-NP-complete if the input is given in ordinary decimal or binary notation. This strongly suggests there is no efficient algorithm for this problem.

The reference [4] is D. Kozen and S. Zaks, Optimal bounds for the change-making problem, Theoret. Comput. Sci. 123 (1994), 377–388.

From the same paper, however:

Suppose we are given a system of denominations. How easy is it to decide whether the greedy algorithm always produces an optimal representation, for all values of $N$? It turns out that this problem can be solved efficiently...
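If I read the Kozen–Zaks result correctly, whenever greedy fails at all, it already fails for some amount smaller than the sum of the two largest denominations, which makes a brute-force test finite. Assuming that bound, here is a sketch of such a checker (the name `greedy_is_always_optimal` is mine):

```python
def greedy_count(amount, coins):
    """Number of coins the greedy algorithm uses."""
    n = 0
    for c in sorted(coins, reverse=True):
        n += amount // c
        amount %= c
    return n

def optimal_count(amount, coins):
    """Fewest coins for `amount`, by dynamic programming."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for v in range(1, amount + 1):
        best[v] = min((best[v - c] for c in coins if c <= v), default=INF) + 1
    return best[amount]

def greedy_is_always_optimal(coins):
    """Compare greedy with DP for every amount below the assumed bound."""
    a, b = sorted(coins)[-2:]  # the two largest denominations
    return all(greedy_count(n, coins) == optimal_count(n, coins)
               for n in range(1, a + b))

print(greedy_is_always_optimal([25, 10, 5, 2, 1]))  # True
print(greedy_is_always_optimal([10, 7, 2, 1]))      # False
```

Running it confirms that (25, 10, 5, 2, 1) is a greedy-friendly system while (10, 7, 2, 1) is not; the smallest counterexample for the latter is 14, where greedy uses 10 + 2 + 2 but 7 + 7 suffices.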