If you want the expected value, one answer is $n E[S_{(m)}]$, where $S_{(m)}$ is the $m$th order statistic of a sample of $n$ gamma$(k,1)$ random variables. While this expression may not have a simple closed form, you may be able to get a good approximation from the literature on moments of order statistics. (Edit: This appears to be the case, even when compared with the known asymptotic expression for the case $m=n$. See the discussion at the end.)
Here's the argument for $n E[S_{(m)}]$: Take a Poisson process $P$ with rate $1$ and interarrival times $Z_1, Z_2, \ldots$. Let each event in $P$ have probability $1/n$ of being the first kind of coupon, probability $1/n$ of being the second kind, and so forth. By the decomposition (thinning) property of Poisson processes, we can then model the arrivals of coupon type $i$ as a Poisson process $P_i$ with rate $1/n$, and the $P_i$'s are independent. Denote by $T_i$ the time until process $P_i$ has produced $k$ coupons; then $T_i$ has a gamma$(k,1/n)$ distribution. The waiting time until $m$ of the processes have each produced $k$ coupons is the $m$th order statistic $T_{(m)}$ of the iid random variables $T_1, T_2, \ldots, T_n$. Let $N_m$ denote the total number of events across all the processes at time $T_{(m)}$; thus $N_m$ is the random variable the OP is interested in. We have
$$T_{(m)} = \sum_{r=1}^{N_m} Z_r.$$
Since $N_m$ and the $Z_r$'s are independent, and the $Z_r$ are iid exponential(1), we have
$$E[T_{(m)}] = E\left[E\left[\sum_{r=1}^{N_m} Z_r \bigg| N_m \right] \right] = E\left[\sum_{r=1}^{N_m} E[Z_r] \right] = E\left[N_m \right].$$
By the scaling property of the gamma distribution, $T_i$ is distributed as $n S_i$, where $S_i$ has a gamma$(k,1)$ distribution. Thus $T_{(m)}$ is distributed as $n S_{(m)}$, and so $E\left[N_m \right] = n E[S_{(m)}]$.
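The identity $E[N_m] = n\,E[S_{(m)}]$ is easy to sanity-check by simulation for small parameters. The sketch below (standard-library Python; the parameter values $n=5$, $m=3$, $k=2$ are my own illustrative choices, not from the original question) compares a direct simulation of the coupon process against a Monte Carlo estimate of $n E[S_{(m)}]$:

```python
import random

random.seed(0)

n, m, k, trials = 5, 3, 2, 20_000

def draws_needed() -> int:
    """Direct simulation: draw coupons uniformly at random until
    m of the n types have each been collected at least k times."""
    counts = [0] * n
    draws = 0
    while sum(c >= k for c in counts) < m:
        counts[random.randrange(n)] += 1
        draws += 1
    return draws

sim = sum(draws_needed() for _ in range(trials)) / trials

def order_stat_sample() -> float:
    """One sample of n * S_(m), where S_1, ..., S_n are iid gamma(k,1)."""
    s = sorted(random.gammavariate(k, 1.0) for _ in range(n))
    return n * s[m - 1]

ident = sum(order_stat_sample() for _ in range(trials)) / trials
```

With these sample sizes the two estimates agree to within Monte Carlo error (well under 5% relative difference).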
For more on this idea, see Lars Holst's paper "On birthday, collectors', occupancy and other classical urn problems," International Statistical Review 54(1) (1986), 15-27.
(ADDED: Looked up literature on moments of order statistics.)
A result in David and Nagaraja's text Order Statistics (pp. 91-92) implies the bounds
$$n P^{-1}\left(k,\frac{m-1}{n}\right) \leq n E[S_{(m)}] \leq n P^{-1}\left(k,\frac{m}{n}\right),$$
where $P(k,x)$ is the regularized incomplete gamma function.
Some software programs can invert $P$ for you numerically. Trying a few examples, it appears that the bounds given by David and Nagaraja can be quite tight. For example, taking $n$ = 100,000, $m$ = 50,000, and $k$ = 25,000, the two bounds give estimates (via Mathematica) around $2.5 \times 10^9$, and the difference between the two estimates is about 400. More extreme values for $k$ and $m$ give results that are not as good, but even values as extreme as $m$ = 10, $k$ = 4 with $n$ = 100,000 still yield a relative error of less than 3%. Depending on the precision you need, this might be good enough.
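The same computation can be done outside Mathematica. A sketch assuming SciPy is available: `scipy.special.gammaincinv` inverts the *lower* regularized incomplete gamma function $P$, which is the convention we want here.

```python
from scipy.special import gammaincinv

def david_nagaraja_bounds(n: int, m: int, k: int) -> tuple[float, float]:
    """Lower and upper bounds n * P^{-1}(k, (m-1)/n) and n * P^{-1}(k, m/n)
    on E[N_m], where P is the lower regularized incomplete gamma function."""
    lower = n * gammaincinv(k, (m - 1) / n)
    upper = n * gammaincinv(k, m / n)
    return lower, upper

lo_bound, hi_bound = david_nagaraja_bounds(100_000, 50_000, 25_000)
```

For the example in the text ($n$ = 100,000, $m$ = 50,000, $k$ = 25,000) this reproduces bounds around $2.5 \times 10^9$ separated by roughly 400.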
Moreover, these bounds seem to give better results for $m \approx n$ than using the asymptotic expression for the case $m = n$ given in Flajolet and Sedgewick's Analytic Combinatorics. The latter has error $o(n)$ and appears to hold for fixed $k$. If $k$ is small, the asymptotic estimate is within, or quite close to, the David and Nagaraja bounds. However, for $k$ large enough (say, on the order of $n$) the error in the asymptotic is on the order of the estimate itself, and the asymptotic expression can even produce a negative estimate of the expected value. In contrast, the bounds from the order-statistics approach appear to get tighter when $k$ is on the order of $n$.
(Caution: There are two versions of the regularized incomplete gamma function: the lower one $P$ that we want with bounds from $0$ to $x$, and the upper one $Q$ with bounds from $x$ to $\infty$. Some software programs use the upper one.)
Note. At the time of writing, the accepted answer to this question was that of Douglas Zare.
The accepted answer to this question is incorrect, albeit for a relatively minor reason. I discovered this while answering a special case of the same question over at math.SE, where it was observed that in the special case $b=3$ and $n=10$, the formula from that answer gives an absurdly large value.
After consulting the literature myself, I found the correct formula in Theorem 2 of Stadje [1990] (specifically equation (2.15) therein), with $p=1$, $l=s=n$, and $m=b$. The desired expectation equals $$ \binom{n}{b}\sum_{j=0}^{n-1}\frac{(-1)^{n-j+1}\binom{n}{j}}{\binom{n}{b}-\binom{j}{b}}. $$
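The formula is easy to evaluate exactly with rational arithmetic. A sketch in Python (the function name is mine), with two sanity checks: for $b=1$ it reduces to the classical coupon-collector value $n H_n$, and for $b=n$ a single draw collects everything.

```python
from fractions import Fraction
from math import comb

def expected_draws(n: int, b: int) -> Fraction:
    """Expected number of draws of b-element subsets of n coupons until
    all n coupons have been seen, via Stadje's formula (eq. (2.15) with
    p=1, l=s=n, m=b). Note comb(j, b) == 0 when j < b."""
    total = Fraction(0)
    for j in range(n):
        total += Fraction((-1) ** (n - j + 1) * comb(n, j),
                          comb(n, b) - comb(j, b))
    return comb(n, b) * total
```

For example, `expected_draws(5, 1)` gives $5 H_5 = 137/12$, and `expected_draws(10, 10)` gives $1$; the problematic case $b=3$, $n=10$ now gives a sensible value smaller than the $b=1$ value $10 H_{10}$.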
Comparing this formula with the (incorrect) one from the accepted answer, we change variables in the sum to $s=n-j$, yielding
$$ \sum_{s=1}^{n}\frac{(-1)^{s+1}\binom{n}{s}}{1-\binom{n-s}{b}/\binom{n}{b}}, $$

whereas the incorrect accepted answer states

$$ \sum_{s=1}^{n-b}\frac{(-1)^{s+1}\binom{n}{s}}{1-\binom{n-s}{b}/\binom{n}{b}}. $$

The only difference is the range of summation, and it arises from a mistake in the accepted answer's treatment of the $i=0$ case of the following manipulation: $$ \sum_{S\not=\varnothing}(-1)^{|S|+1}\left(\frac{\binom{n-|S|}{b}}{\binom{n}{b}}\right)^i\overset{!}{=}\sum_{s=1}^{n-b}(-1)^{s+1}\binom{n}{s}\left(\frac{\binom{n-s}{b}}{\binom{n}{b}}\right)^i. $$ The equality holds when $i\not=0$, but when $i=0$ the terms with $s>n-b$ do in fact contribute. The corrected identity reads $$ \sum_{S\not=\varnothing}(-1)^{|S|+1}\left(\frac{\binom{n-|S|}{b}}{\binom{n}{b}}\right)^i=\sum_{s=1}^{n}(-1)^{s+1}\binom{n}{s}\left(\frac{\binom{n-s}{b}}{\binom{n}{b}}\right)^i. $$ The rest of the reasoning in the accepted answer is correct and, with this fix, yields the correct answer.
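The $i=0$ discrepancy is easy to see numerically: the sum over nonempty $S$ of $(-1)^{|S|+1}$ equals $1$ by inclusion-exclusion, and the full sum over $s = 1, \ldots, n$ matches this, while the truncated sum does not. A quick check for the problematic case $b=3$, $n=10$:

```python
from math import comb

n, b = 10, 3

# i = 0 case: the alternating sum over all nonempty subset sizes is 1,
# by the binomial theorem (sum_{s=0}^{n} (-1)^s C(n,s) = 0).
full = sum((-1) ** (s + 1) * comb(n, s) for s in range(1, n + 1))

# The accepted answer's truncation stops at s = n - b, dropping the
# tail terms s = n-b+1, ..., n, which do contribute when i = 0.
truncated = sum((-1) ** (s + 1) * comb(n, s) for s in range(1, n - b + 1))
```

Here `full` is $1$ as required, while `truncated` is $37$, which is the source of the absurdly large value observed on math.SE.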