We have a set of jobs $J_i = (a_i, b_i)$ where all the $a_i,b_i \ge 0$. We want to find the permutation $\pi$ which minimises $$\max_j \left\{ b_{\pi(j)} + \sum_{k \le j}a_{\pi(k)} \right\}$$
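For concreteness, the objective can be evaluated with a short helper (a sketch in Python; the representation and the name `max_finish_time` are mine):

```python
def max_finish_time(jobs, order):
    """Objective value: the latest finishing time over all jobs.

    jobs  -- list of (a, b) pairs
    order -- a permutation of range(len(jobs)); order[j-1] plays the role of pi(j)
    """
    prefix = 0  # running sum of the a-values scheduled so far
    worst = 0
    for idx in order:
        a, b = jobs[idx]
        prefix += a
        worst = max(worst, prefix + b)  # b_{pi(j)} + sum_{k <= j} a_{pi(k)}
    return worst
```

For example, with jobs $(1,5)$ and $(2,1)$, scheduling the large-$b$ job first gives objective $6$, while the other order gives $8$.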
Suppose we have proven that for all job sets $X$ of size at most $n$, the permutation which sorts the jobs by descending $b_i$, breaking ties by descending $a_i$, is an optimal solution (bearing in mind that there may be more than one optimal solution). Call this ordering doubly-reverse-lexicographical, because it sorts the pairs $(a_i, b_i)$ in descending order starting from the rightmost component.
Consider a job set $Y$ of size $n+1$ whose "worst" job, the one with the largest $b_i$ (ties broken by largest $a_i$), is $J_{n+1}$, and suppose for contradiction that there is no optimal solution in which this job comes first. Of the optimal solutions, select one which puts $J_{n+1}$ as early as possible, in position $j^* = \pi^{-1}(n+1) > 1$. Now, which $j$ maximises the expression above (i.e. which job finishes last)?
Given that $b_{\pi(j^*)} = b_{n+1}$ is maximal by construction and the prefix sum is non-decreasing in $j$, the maximising $j$ must be $\ge j^*$. Moreover, if the maximising $j$ were $> j^*$, we could move $J_{n+1}$ to the start without affecting optimality: the maximal term at $j$ is unchanged (it sums the same set of $a$'s), and each job pushed one position later still finishes no later than $J_{n+1}$ used to (its $b$ is no larger, and its new prefix sum is at most $J_{n+1}$'s old one). That would give an optimal solution with $J_{n+1}$ first, which we supposed impossible. We're forced to conclude that $J_{n+1}$ is the job which finishes last.
Now consider what happens if we swap $J_{n+1}$ with the job immediately before it; call that job $J_p$ (for "previous"). $J_p$'s finishing time increases from $b_p + \sum_{k < j^*}a_{\pi(k)}$ to $b_p + \sum_{k \le j^*}a_{\pi(k)} \le b_{n+1} + \sum_{k \le j^*}a_{\pi(k)}$, while $J_{n+1}$'s finishing time does not increase: it goes from $b_{n+1} + \sum_{k \le j^*}a_{\pi(k)}$ to $b_{n+1} + \sum_{k \le j^*,\, k \ne j^*-1}a_{\pi(k)}$. The maximum therefore does not increase, so the swapped schedule is still optimal yet places $J_{n+1}$ one position earlier, a contradiction with our choice of an optimal solution which puts $J_{n+1}$ as early as possible.
We've now proven that there is an optimal solution for $Y$ in which $J_{n+1}$ comes first. With $J_{n+1}$ fixed in first position, every other job's finishing time is its finishing time in the $n$-job instance $Y \setminus \{J_{n+1}\}$ shifted by the constant $a_{n+1}$, so replacing the tail with an optimal ordering of those $n$ jobs cannot increase the maximum. By the strong induction hypothesis we may take that ordering to be doubly-reverse-lexicographical, and since $J_{n+1}$ precedes every other job of $Y$ in that order, the whole ordering of $Y$ is doubly-reverse-lexicographical and optimal.
The base case of only one job is trivial.
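The induction can also be sanity-checked by brute force on small random instances, comparing the descending-$(b_i, a_i)$ order against every permutation (a sketch; the helper names are mine):

```python
import itertools
import random

def max_finish_time(jobs, order):
    """Latest finishing time of any job under the given permutation."""
    prefix, worst = 0, 0
    for idx in order:
        a, b = jobs[idx]
        prefix += a
        worst = max(worst, prefix + b)
    return worst

def greedy_order(jobs):
    # Sort by descending b, breaking ties by descending a.
    return sorted(range(len(jobs)), key=lambda i: (-jobs[i][1], -jobs[i][0]))

random.seed(0)
for _ in range(200):
    jobs = [(random.randint(0, 5), random.randint(0, 5)) for _ in range(5)]
    best = min(max_finish_time(jobs, p)
               for p in itertools.permutations(range(len(jobs))))
    assert max_finish_time(jobs, greedy_order(jobs)) == best
```

Every random trial agrees with the exhaustive optimum, as the proof predicts.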
Suppose the set of coin denominations is $\{a_1,\ldots,a_n\}$ where $a_1>\cdots>a_n=1$. Let $S$ be the given amount of money.
Define $m_t=\lceil a_{t-1}/a_t\rceil$ and $S_t=m_ta_t$.
Let $Opt(S)$ (respectively $G(S)$) denote the number of coins used in an optimal (respectively, the greedy) solution. Then we have the following theorem:
If $\ \ \ S_t<a_{t-2} \ $ for all $t\in\{3,\ldots,n\}\ \ \ $ then
$Opt(S)=G(S) \quad$ iff $\quad G(S_t)\leq m_t \ \ $ (for all $t\in\{2,\ldots,n\}$)
There is a simpler version which states only the sufficient condition:
$Opt(S)=G(S) \quad$ if $\quad G(S_t-a_{t-1})\leq m_t-1 \ \ $ (for all $t\in\{2,\ldots,n\}$)
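Both conditions only require evaluating the greedy count $G$ at a handful of points, so they are cheap to check (a sketch; the function names are mine, and denominations are passed as a descending list ending in 1):

```python
import math

def greedy_count(amount, denoms):
    """G(amount): coins used by greedy; denoms sorted descending, last is 1."""
    coins = 0
    for d in denoms:
        coins += amount // d
        amount %= d
    return coins

def iff_condition(denoms):
    """The stated equivalence: G(S_t) <= m_t for all t in {2, ..., n}."""
    ok = True
    for t in range(2, len(denoms) + 1):              # t is 1-indexed as in the text
        m_t = math.ceil(denoms[t - 2] / denoms[t - 1])
        S_t = m_t * denoms[t - 1]
        ok = ok and greedy_count(S_t, denoms) <= m_t
    return ok

def sufficient_condition(denoms):
    """The simpler test: G(S_t - a_{t-1}) <= m_t - 1 for all t in {2, ..., n}."""
    ok = True
    for t in range(2, len(denoms) + 1):
        m_t = math.ceil(denoms[t - 2] / denoms[t - 1])
        S_t = m_t * denoms[t - 1]
        ok = ok and greedy_count(S_t - denoms[t - 2], denoms) <= m_t - 1
    return ok
```

For $(25, 10, 5, 1)$ both tests pass, while for $(25, 10, 1)$ both fail at $t=2$: $m_2 = 3$, $S_2 = 30$, and $G(30) = 6 > 3$, matching the fact that greedy pays $30$ as $25 + 5\cdot 1$ while $3\times 10$ is optimal.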
Best Answer
The greedy algorithm will use $\lceil \frac nK\rceil$ coins. Any better method would use $r$ coins for some $r$ with $rK<n$; but then the coins' total value is at most $rK<n$, which is absurd.
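Reading $K$ as the largest denomination in a complete system $\{1, 2, \ldots, K\}$ (an assumption; the question being answered isn't quoted here), the count $\lceil n/K\rceil$ is easy to confirm against a brute-force dynamic program:

```python
def min_coins(n, K):
    # Fewest coins from denominations {1, ..., K} summing to n (textbook DP),
    # under the assumed reading of the problem.
    best = [0] + [n] * n                 # best[x] = optimal count for amount x
    for amount in range(1, n + 1):
        best[amount] = 1 + min(best[amount - d]
                               for d in range(1, min(K, amount) + 1))
    return best[n]

# Greedy's count, ceil(n / K), matches the optimum on every small instance.
for n in range(1, 50):
    for K in range(1, 8):
        assert min_coins(n, K) == -(-n // K)   # -(-n // K) is ceiling division
```

The exhaustive check agrees with the counting argument above: $r$ coins are worth at most $rK$, so at least $\lceil n/K\rceil$ coins are needed, and greedy attains that.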