Edit: The indicator random variables have been radically changed, so that one gets a very quick computation of the mean and variance.
Let $Y$ be the number of reds drawn before the first blue. Suppose that the red balls have labels $1, 2, 3,\dots,r$. Let $X_i=1$ if red ball with label $i$ is drawn before the first blue is drawn, and let $X_i=0$ otherwise.
Then $Y=X_1+\cdots+X_r$. Note that the number of draws up to and including the first blue is $Y+1$. But $Y+1$ and $Y$ have the same variance.
To calculate the variance of $Y$, we first calculate its mean. By linearity of expectation we have
$$E(Y)=E(X_1)+E(X_2)+\cdots+E(X_r).$$
By symmetry, all the $E(X_i)$ are the same. The probability that the red ball with label $i$ comes before all $b$ of the blue balls is $\frac{1}{b+1}$, since each of these $b+1$ balls is equally likely to be the first of them drawn. It follows that $E(Y)=\frac{r}{b+1}$.
To calculate the variance of $Y$, calculate $E\left((X_1+\cdots +X_{r})^2\right)$ and subtract the square of $E(Y)$, which we know.
To find $E\left((X_1+\cdots+X_r)^2\right)$, expand the square and use the linearity of expectation. We know the expectation of $\sum X_i^2$, since $X_i^2=X_i$. So we need the expectations of the cross terms.
For $i\ne j$, $X_iX_j=1$ if both red ball $i$ and red ball $j$ come before any blue. This has probability $\frac{2}{(b+2)(b+1)}$, since the two reds must occupy the first two places among these $b+2$ balls. Multiply by $2\binom{r}{2}$ to get the sum of the cross terms.
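As a sanity check, the mean and second moment obtained from these indicators can be compared against exhaustive enumeration of all arrangements. A short Python sketch in exact rational arithmetic (the values $r=3$, $b=2$ are just for illustration):

```python
from fractions import Fraction
from itertools import permutations
from math import comb

r, b = 3, 2  # small illustrative values

# All distinct R/B arrangements are equally likely by exchangeability.
patterns = set(permutations('R' * r + 'B' * b))
ys = []
for p in patterns:
    y = 0                      # number of reds before the first blue
    for ball in p:
        if ball == 'B':
            break
        y += 1
    ys.append(y)

mean = Fraction(sum(ys), len(ys))
second = Fraction(sum(y * y for y in ys), len(ys))
var = second - mean ** 2

# Formulas from the text: E(Y) = r/(b+1), and for the cross terms
# E(X_i X_j) = 2/((b+1)(b+2)) for i != j, summed over 2*C(r,2) ordered pairs.
mean_formula = Fraction(r, b + 1)
second_formula = mean_formula + 2 * comb(r, 2) * Fraction(2, (b + 1) * (b + 2))
var_formula = second_formula - mean_formula ** 2

print(mean, mean_formula)  # both equal 1 for r=3, b=2
print(var, var_formula)    # both equal 1 for r=3, b=2
```

The enumeration and the indicator formulas agree exactly.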
This can be solved using inclusion–exclusion. There are $\binom kj$ ways to choose $j$ particular sets to finish, and the probability of having completed all $j$ of them is the probability of having completed a standard coupon collection with $jn$ coupons while drawing from $kn$ coupon types. Since the expected number of draws is the sum of the non-completion probabilities over all times, it satisfies the same inclusion–exclusion relation as the probabilities (note that $\sum_{j=1}^k(-1)^{j-1}\binom kj=1$). Drawing from $kn$ coupon types while collecting only $jn$ of them increases the expected number of draws by a factor of $\frac kj$. Thus the desired expectation is
\begin{align}
\sum_{j=1}^k(-1)^{j-1}\binom kj\frac kjjnH_{jn}
&=kn\sum_{j=1}^k(-1)^{j-1}\binom kjH_{jn}
\\
&=kn\sum_{j=1}^k(-1)^{j-1}\binom kj\left(\log j+\log n+\gamma+\frac1{2jn}\right)+O\left(\frac kn\right)\\
&=kn\left(\log n+\gamma\right)+\frac12kH_k+kn\sum_{j=1}^k(-1)^{j-1}\binom kj\log j+O\left(\frac kn\right)\\
&=knH_n+\frac12kH_k-\frac k2+kn\sum_{j=1}^k(-1)^{j-1}\binom kj\log j+O\left(\frac kn\right)\;.
\end{align}
For the example $n=10$, $k=2$ used in Tad's answer, this yields the approximation
$$
20\left(H_{10}-\log2\right)+H_2-1\approx45.2164\;,
$$
close to Tad's approximation.
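The exact inclusion–exclusion sum can also be evaluated directly and compared with the truncated expansion. A small Python sketch (function names and the choice $n=10$, $k=2$ are mine; $\gamma$ is Euler's constant):

```python
from math import comb, log

def harmonic(n):
    """H_n as a float."""
    return sum(1.0 / i for i in range(1, n + 1))

def exact(n, k):
    # Exact inclusion-exclusion sum: sum_j (-1)^(j-1) C(k,j) * kn * H_{jn}
    return sum((-1) ** (j - 1) * comb(k, j) * k * n * harmonic(j * n)
               for j in range(1, k + 1))

def approx(n, k):
    # Main terms of the expansion: kn(log n + gamma) + (1/2) k H_k
    # + kn * sum_j (-1)^(j-1) C(k,j) log j
    gamma = 0.5772156649015329
    log_sum = sum((-1) ** (j - 1) * comb(k, j) * log(j)
                  for j in range(1, k + 1))
    return k * n * (log(n) + gamma) + 0.5 * k * harmonic(k) + k * n * log_sum

n, k = 10, 2
print(exact(n, k))   # about 45.2039
print(approx(n, k))  # about 45.2331, within O(k/n) of the exact value
```

The two values differ by roughly $0.03$, consistent with the stated $O\left(\frac kn\right)$ error.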
The remaining sum is treated in the question “Proof of $\sum\limits_{k=1}^n \binom{n}{k}(-1)^k \log k = \log \log n + \gamma +\frac{\gamma}{\log n} +O\left(\frac1{\log^2 n}\right)$”; substituting that expansion leads to
$$
kn\left(H_n-\log\log k-\gamma-\frac\gamma{\log k}+\frac{\pi^2+6\gamma^2}{12\log^2k}\right)+\frac12kH_k-\frac k2+O\left(\frac{kn}{\log^3k}\right)\;.
$$
Best Answer
We use the notation from the following MSE link with $m$ for the number of trials and $n$ for the number of different types of coupons. We treat the special case where there are $j$ instances of each type and we are sampling without replacement.
We ask about the probability of obtaining the distribution
$$\prod_{q=1}^n C_q^{\alpha_q}$$
where $\alpha_q$ is the number of instances of type $C_q$ that we have. We obtain
$$\frac{(nj-\sum_{q=1}^n \alpha_q)!}{(nj)!} \prod_{q=1}^n \frac{j!}{(j-\alpha_q)!}.$$
Therefore the number of sequences of length $m$ of $n$ types of coupons, drawn without replacement with a maximum of $j$ coupons of each type, is given by
$$m! [z^m] \left(\sum_{k=0}^j \frac{j!}{(j-k)!}\frac{z^k}{k!}\right)^{n} = m! [z^m] (1+z)^{nj} = {nj\choose m} \times m!.$$
We then obtain from first principles (the last draw is the first instance of the final missing type, which accounts for the factor $n\times j$: $n$ choices for that type and $j$ for the instance) the formula
$$P[T=m] = \frac{1}{m!} {nj\choose m}^{-1} \times n \times j \times (m-1)! [z^{m-1}] \left(\sum_{k=1}^j \frac{j!}{(j-k)!}\frac{z^k}{k!}\right)^{n-1} \\ = nj \times \frac{1}{m} {nj\choose m}^{-1} [z^{m-1}] \left(-1 + (1+z)^j\right)^{n-1}.$$
This becomes
$$nj \times \frac{1}{m} {nj\choose m}^{-1} [z^{m-1}] \sum_{q=0}^{n-1} {n-1\choose q} (-1)^{n-1-q} (1+z)^{qj} \\ = {nj-1\choose m-1}^{-1} \sum_{q=0}^{n-1} {n-1\choose q} (-1)^{n-1-q} {qj\choose m-1}.$$
Observe that
$${qj\choose m-1} {nj-1\choose m-1}^{-1} = \frac{(qj)! \times (nj-1-(m-1))!}{(qj-(m-1))! \times (nj-1)!} \\ = {nj-1\choose qj}^{-1} {nj-1-(m-1)\choose qj-(m-1)}.$$
We record for the probabilities the formula
$$\bbox[5px,border:2px solid #00A000]{ P[T=m] = \sum_{q=0}^{n-1} {n-1\choose q} (-1)^{n-1-q} {nj-1\choose qj}^{-1} {nj-1-(m-1)\choose nj-1-qj}.}$$
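The boxed formula can be checked against brute-force enumeration of all orderings of the $nj$ coupons. A Python sketch in exact rational arithmetic (function names and the small case $n=3$, $j=2$ are mine):

```python
from fractions import Fraction
from itertools import permutations
from math import comb

def p_boxed(n, j, m):
    """Boxed formula for P[T = m], exact rationals."""
    return sum(
        Fraction(comb(n - 1, q) * (-1) ** (n - 1 - q)
                 * comb(n * j - 1 - (m - 1), n * j - 1 - q * j),
                 comb(n * j - 1, q * j))
        for q in range(n))

def p_brute(n, j, m):
    """P[T = m] by enumerating all orderings of the n*j coupons."""
    coupons = [t for t in range(n) for _ in range(j)]
    hits = total = 0
    for perm in permutations(coupons):
        seen = set()
        for t, c in enumerate(perm, start=1):
            seen.add(c)
            if len(seen) == n:
                break          # t is now the completion time
        total += 1
        hits += (t == m)
    return Fraction(hits, total)

n, j = 3, 2  # small illustrative case; T ranges over n..(n-1)j+1
for m in range(n, (n - 1) * j + 2):
    print(m, p_boxed(n, j, m), p_brute(n, j, m))
```

For $n=3$, $j=2$ both methods give $P[T=3]=\frac25$, $P[T=4]=\frac25$, $P[T=5]=\frac15$.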
Start by verifying that this is a probability distribution. We obtain for the sum in $m$
$$\sum_{m=n}^{(n-1)j+1} {nj-1-(m-1)\choose nj-1-qj} \\ = [z^{nj-1-qj}] \sum_{m=n}^{(n-1)j+1} (1+z)^{nj-1-(m-1)} \\ = [z^{nj-1-qj}] \sum_{m=j-1}^{n(j-1)} (1+z)^m = [z^{nj-qj}] ((1+z)^{n(j-1)+1} - (1+z)^{j-1}).$$
We have $nj-qj\ge j$ so only the first term contributes and we obtain
$$\sum_m P[T=m] = \sum_{q=0}^{n-1} {n-1\choose q} (-1)^{n-1-q} {nj-1\choose qj}^{-1} {n(j-1)+1\choose nj-qj} \\ = \frac{n(j-1)+1}{j} \sum_{q=0}^{n-1} {n-1\choose q} \frac{(-1)^{n-1-q}}{n-q} {nj-1\choose nj-qj-1}^{-1} {n(j-1)\choose nj-qj-1}$$
We get for the rightmost pair of binomial coefficients
$$\frac{(n(j-1))! \times (qj)!}{(nj-1)! \times (qj+1-n)!} = {nj-1\choose n-1}^{-1} {qj\choose n-1}$$
which yields for the sum
$$\frac{n(j-1)+1}{j} {nj-1\choose n-1}^{-1} \sum_{q=0}^{n-1} {n-1\choose q} \frac{(-1)^{n-1-q}}{n-q} {qj\choose n-1} \\ = \frac{n(j-1)+1}{nj} {nj-1\choose n-1}^{-1} \sum_{q=0}^{n-1} {n\choose q} (-1)^{n-1-q} {qj\choose n-1} \\ = \frac{n(j-1)+1}{nj} {nj-1\choose n-1}^{-1} {nj\choose n-1} \\ + \frac{n(j-1)+1}{nj} {nj-1\choose n-1}^{-1} \sum_{q=0}^{n} {n\choose q} (-1)^{n-1-q} {qj\choose n-1} \\ = \frac{n(j-1)+1}{nj}\frac{nj}{nj+1-n} \\ + \frac{n(j-1)+1}{nj} {nj-1\choose n-1}^{-1} [z^{n-1}] \sum_{q=0}^{n} {n\choose q} (-1)^{n-1-q} (1+z)^{qj} \\ = 1 - \frac{n(j-1)+1}{nj} {nj-1\choose n-1}^{-1} [z^{n-1}] (1-(1+z)^j)^n$$
Now observe that $[z^{n-1}] (1-(1+z)^j)^n = 0$, because $1-(1+z)^j$ has no constant term, so its $n$-th power contains no powers of $z$ below $z^n$. Hence everything simplifies to $1$ and we have a probability distribution.
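This telescoping to $1$ can also be confirmed directly in exact arithmetic; a small sketch summing the boxed probabilities for a few parameter pairs (the pairs chosen are mine):

```python
from fractions import Fraction
from math import comb

def p_t(n, j, m):
    """Boxed formula for P[T = m], exact rationals."""
    return sum(
        Fraction(comb(n - 1, q) * (-1) ** (n - 1 - q)
                 * comb(n * j - 1 - (m - 1), n * j - 1 - q * j),
                 comb(n * j - 1, q * j))
        for q in range(n))

# T ranges over m = n .. (n-1)j + 1; each distribution must sum to exactly 1.
for n, j in [(2, 2), (3, 3), (4, 2), (5, 3)]:
    total = sum(p_t(n, j, m) for m in range(n, (n - 1) * j + 2))
    assert total == 1
print("all distributions sum to exactly 1")
```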
Continuing with the expectation we have the following closed form:
$$\bbox[5px,border:2px solid #00A000]{ \mathrm{E}[T] = \sum_{q=0}^{n-1} {n-1\choose q} (-1)^{n-1-q} {nj-1\choose qj}^{-1} \sum_{m=n}^{(n-1)j+1} m {nj-1-(m-1)\choose nj-1-qj}.}$$
To get a feel for these values, let us examine some of them. Here are the first few for eight types of coupons, starting at $j=1:$
$$8,{\frac {76627}{6435}},{\frac {76801}{5434}},{\frac {7473667}{480675}}, {\frac {1318429}{79794}},\ldots$$
and here is the initial segment for ten types of coupons:
$$10,{\frac {707825}{46189}},{\frac {7008811}{380380}}, {\frac {266299459}{13042315}},{\frac {182251913}{8360638}}, {\frac {748880445829}{32831263465}},\ldots$$
Careful inspection of these values suggests that we cannot hope for additional simplification when $j\ge 2$; if a simple closed form existed, it would presumably have appeared in these sample values. We do see, however, that the case $j=1$ admits a closed form, the value being $n$ (we always finish after $n$ draws if there is only one instance of each coupon).
We now do this calculation, which is trivial but nonetheless a useful sanity check, starting with
$$\sum_{q=0}^{n-1} {n-1\choose q} (-1)^{n-1-q} {n-1\choose q}^{-1} \sum_{m=n}^{n} m {n-1-(m-1)\choose n-1-q} \\ = \sum_{q=0}^{n-1} (-1)^{n-1-q} \times n {n-1-(n-1)\choose n-1-q} \\ = (-1)^{n-1-(n-1)} \times n \times {0\choose n-1-(n-1)} = n.$$
It certainly seems like a worthwhile challenge to prove that the closed form for $\mathrm{E}[T]$ tends to $n H_n$ in the limit $j\to\infty$ (where sampling without replacement approaches the classic coupon collector with replacement), which is confirmed by the numerical evidence.
We also verified the formula for the expectation in software; the output was in excellent agreement with the closed form on all values that were tested.
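Such a check can be reproduced with a short Python sketch evaluating the boxed closed form in exact rational arithmetic (the function name is mine):

```python
from fractions import Fraction
from math import comb

def expectation(n, j):
    """Closed form E[T] from the boxed formula, exact rationals."""
    total = Fraction(0)
    for q in range(n):
        inner = sum(m * comb(n * j - 1 - (m - 1), n * j - 1 - q * j)
                    for m in range(n, (n - 1) * j + 2))
        total += Fraction(comb(n - 1, q) * (-1) ** (n - 1 - q) * inner,
                          comb(n * j - 1, q * j))
    return total

# j = 1 must give exactly n (a single instance of each coupon).
assert expectation(8, 1) == 8 and expectation(10, 1) == 10

# A small case that matches direct enumeration of all orderings.
assert expectation(3, 2) == Fraction(19, 5)

print(expectation(8, 2))   # 76627/6435, the value quoted above for n = 8
print(expectation(10, 2))  # compare with the quoted 707825/46189
```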