Now select a single ball from that bin; if the bin contains multiple balls, one is chosen uniformly at random.
And what happens if the bin you choose is empty?
The basis of your calculation is the implicit assumption that in that case you don't select any ball at all. A perfectly reasonable assumption, by the way. It means we really have three possible events we could consider:
\begin{align}
B &= \text{You select a black ball.}\\
W &= \text{You select a white ball.}\\
E &= \text{You select no ball.}
\end{align}
The probability of event $E$ is easy to figure out. It is the probability that the bin is empty:
$$ \mathbb P(E) = \left(1-\frac{1}{M}\right)^{n+k}. $$
But suppose the bin is not empty, which occurs with probability $1 - \mathbb P(E).$
Given that event, there are $n+k$ balls in the problem, any one of which could first have been placed in the bin and then selected from it.
Label the black balls $1,\ldots,n$ and the white balls $1,\ldots,k$;
let $B_j$ be the event of drawing black ball $j$ and let $W_i$ be the event of drawing white ball $i.$
Given that the bin is not empty, no ball has a better chance than any other ball of being the ball eventually selected from the chosen bin, so
$\mathbb P(B_j \mid E^\complement) = \mathbb P(W_i \mid E^\complement)
= \mathbb P(B_1 \mid E^\complement)$ for all $i$ and $j$.
Since the events $B_j$ and $W_i$ are disjoint and one of them must happen given that the bin is not empty,
by the law of total probability,
$$ 1 = \sum_{j=1}^n \mathbb P(B_j \mid E^\complement)
+ \sum_{i=1}^k \mathbb P(W_i \mid E^\complement)
= (n + k) \mathbb P(B_1 \mid E^\complement), $$
that is, each ball has conditional probability $1/(n+k)$ of being selected.
But there are $n$ black balls, so
$$ \mathbb P(B \mid E^\complement) = \frac n{n+k}. $$
Therefore
\begin{align}
\mathbb P(B)
&= \mathbb P(B \mid E^\complement) \mathbb P(E^\complement) \\
&= \mathbb P(B \mid E^\complement) (1 - \mathbb P(E)) \\
&= \frac n{n+k} \left(1 - \left(1-\frac1M\right)^{n+k}\right).
\end{align}
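If you want to convince yourself of this numerically, here is a quick Monte Carlo sketch of the selection procedure; the values of $n,$ $k,$ and $M$ below are arbitrary small choices for illustration only:

```python
import random

# Arbitrary small parameters, chosen only for illustration.
n, k, M = 3, 2, 4       # black balls, white balls, bins
trials = 200_000

hits = 0
for _ in range(trials):
    bins = [[] for _ in range(M)]
    # Throw each ball into a bin chosen uniformly at random.
    for color in ['B'] * n + ['W'] * k:
        bins[random.randrange(M)].append(color)
    chosen = bins[random.randrange(M)]      # choose a bin uniformly
    # If the bin is nonempty, pick one of its balls uniformly.
    if chosen and random.choice(chosen) == 'B':
        hits += 1

estimate = hits / trials
exact = n / (n + k) * (1 - (1 - 1 / M) ** (n + k))
print(f"simulated {estimate:.4f}, exact {exact:.4f}")
```

With these parameters the simulated frequency settles near $\frac35\left(1 - (3/4)^5\right) \approx 0.4576.$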
Now it would be nice to know how to get this from the expression you put together earlier. But first that expression has to be correct.
I agree with your formula for $\mathbb P(X_r = s),$
but a formula for $\mathbb P(X=t)$ would have to account for all the different ways you might have $t$ balls in the bin, which your formulas never explicitly do (and it's just as well that they don't).
What you actually seem to be trying to show in your next formula is
the probability of $i$ white balls and $j$ black balls,
$\mathbb P(X_w=i,X_b=j).$
Moreover, you mistakenly swapped $n$ and $k$ in the binomial coefficients:
it's $i$ white balls out of $k$ and $j$ black balls out of $n$,
so the formula should be
$$ \mathbb P(X_w=i,X_b=j) = \binom ki \binom nj \left(\frac1M \right)^{i+j}
\left(1 - \frac1M\right)^{k-i+n-j}. $$
I have written this without the symbols $N$ and $t$ since I don't think they're actually helpful at this point.
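As a sanity check on this joint distribution, the probabilities over all pairs $(i,j)$ should sum to $1.$ A short exact computation (again with arbitrary small $n,$ $k,$ $M$) confirms it:

```python
from math import comb
from fractions import Fraction

n, k, M = 3, 2, 4                    # arbitrary small values
p, q = Fraction(1, M), 1 - Fraction(1, M)

# Sum P(X_w = i, X_b = j) over all 0 <= i <= k, 0 <= j <= n.
total = sum(comb(k, i) * comb(n, j) * p**(i + j) * q**(k - i + n - j)
            for i in range(k + 1) for j in range(n + 1))
print(total)  # 1
```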
The next formula you wanted to show was not the probability that you choose a black ball given that there are $i$ white balls and $j$ black balls (which would just be $j/(i+j)$); it was the probability that you choose a black ball and there are $i$ white balls and $j$ black balls:
$$ \mathbb P(B,X_w=i,X_b=j) =
\frac{j}{i+j} \binom ki \binom nj \left(\frac1M \right)^{i+j}
\left(1 - \frac1M\right)^{k-i+n-j}. $$
So now formula (*), correctly written, is
$$ \mathbb P(B) = \sum_{i=0}^k \sum_{j=1}^n
\frac{j}{i+j} \binom ki \binom nj \left(\frac1M \right)^{i+j}
\left(1 - \frac1M\right)^{k-i+n-j}. \tag1 $$
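Before rearranging the sum analytically, one can check with exact rational arithmetic that the double sum $(1)$ already agrees with the closed form from the first part of this answer (parameters again arbitrary small values):

```python
from math import comb
from fractions import Fraction

n, k, M = 3, 2, 4                    # arbitrary small values
p, q = Fraction(1, M), 1 - Fraction(1, M)

# Equation (1): P(B) as a double sum over (i, j).
pB = sum(Fraction(j, i + j) * comb(k, i) * comb(n, j)
         * p**(i + j) * q**(k - i + n - j)
         for i in range(k + 1) for j in range(1, n + 1))
closed_form = Fraction(n, n + k) * (1 - q**(n + k))
print(pB == closed_form)  # True
```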
Now what I'd like to do is to rearrange the order of summation so the outer variable is the total number of balls, $t = i+j.$
To cover terms of the sum in Equation $(1)$
we need $t$ to range from at least $1$ to $n+k.$
If we take the convention that $\binom xy = 0$ when $y > x,$
and also set $p = \frac1M$ and $q = 1 - \frac1M,$
the sum can be written this way:
$$
\mathbb P(B) = \sum_{t=1}^{n+k} \sum_{j=1}^t
\frac jt \binom k{t-j} \binom nj p^t q^{n+k-t}. \tag2
$$
The sum in Equation $(2)$ generally has more terms than the sum in Equation $(1),$ but the extra terms all occur for the cases $j > n$ and $t - j > k,$
and in these cases either $\binom nj = 0$ or $\binom k{t-j} = 0,$
so the extra terms conveniently vanish.
(Alternatively, we could deal with the $j > n$ case by setting the inner sum's upper bound at $j = \min\{n,t\}$ and deal with the $t - j > k$ case by applying other jiggery-pokery to the lower bound, but the zero terms let us keep the bounds simple.)
Since the powers of $p$ and $q$ do not depend on $j,$
we can factor them out of the inner sum, along with the factor $\frac1t$:
$$
\mathbb P(B) = \sum_{t=1}^{n+k} p^t q^{n+k-t} \cdot \frac1t
\sum_{j=1}^t j \binom k{t-j} \binom nj .
$$
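Python's `math.comb` happens to follow the same convention ($\binom xy = 0$ when $y > x$), so the reindexing from $(1)$ to $(2)$ can be verified directly in exact arithmetic (with arbitrary small parameters):

```python
from math import comb
from fractions import Fraction

n, k, M = 3, 2, 4                    # arbitrary small values
p, q = Fraction(1, M), 1 - Fraction(1, M)

# Equation (1): sum over pairs (i, j).
eq1 = sum(Fraction(j, i + j) * comb(k, i) * comb(n, j)
          * p**(i + j) * q**(k - i + n - j)
          for i in range(k + 1) for j in range(1, n + 1))
# Equation (2): sum over t = i + j; math.comb(x, y) is 0 when y > x.
eq2 = sum(Fraction(j, t) * comb(k, t - j) * comb(n, j) * p**t * q**(n + k - t)
          for t in range(1, n + k + 1) for j in range(1, t + 1))
print(eq1 == eq2)  # True
```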
Now to simplify the inner sum, suppose we randomly select $t$ balls from a collection of $n$ black balls and $k$ white balls and let $Y$ be the number of black balls chosen.
The expected number of black balls is
$$
\mathbb E(Y) = \sum_{j=0}^t j\, \mathbb P(Y = j)
= \sum_{j=0}^t j \frac{\binom k{t-j} \binom nj}{\binom{n+k}t},
$$
again using the convention $\binom xy = 0$ when $y > x$ to deal with the terms where either $j$ or $t-j$ is too large.
But setting $Y_m=1$ if the $m$th ball drawn is black, $Y_m=0$ otherwise,
$\mathbb E(Y_m) = \frac{n}{n+k}$ for $1 \leq m \leq t$ and therefore by linearity of expectation,
$$
\mathbb E(Y) = \sum_{m=1}^t \mathbb E(Y_m) = \frac{tn}{n+k}.
$$
Setting the far right sides of both equations for $\mathbb E(Y)$ equal
and multiplying both by $\frac1t \binom{n+k}t,$ we conclude that
$$
\frac1t \sum_{j=0}^t j \binom k{t-j} \binom nj = \frac{n}{n+k} \binom{n+k}t.
$$
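This identity is easy to spot-check exhaustively over a small range of parameters:

```python
from math import comb
from fractions import Fraction

# Check (1/t) * sum_j j * C(k, t-j) * C(n, j) == n/(n+k) * C(n+k, t)
# for small n, k, t; math.comb(x, y) is 0 when y > x.
for n in range(1, 6):
    for k in range(0, 6):
        for t in range(1, n + k + 1):
            lhs = Fraction(sum(j * comb(k, t - j) * comb(n, j)
                               for j in range(t + 1)), t)
            rhs = Fraction(n, n + k) * comb(n + k, t)
            assert lhs == rhs
print("identity holds on all tested cases")
```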
We can therefore further simplify Equation $(2)$ as follows:
\begin{align}
\mathbb P(B)
&= \sum_{t=1}^{n+k} p^t q^{n+k-t} \frac{n}{n+k} \binom{n+k}t \\
&= \frac{n}{n+k} \sum_{t=1}^{n+k} \binom{n+k}t p^t q^{n+k-t} \\
&= \frac{n}{n+k} \left(
\left(\sum_{t=0}^{n+k} \binom{n+k}t p^t q^{n+k-t}\right)
- q^{n+k} \right)\\
&= \frac{n}{n+k} \left( (p + q)^{n+k} - q^{n+k} \right)\\
&= \frac{n}{n+k} \left(1 - q^{n+k} \right).
\end{align}
In short, by adding up all the terms generated by formula (*), we conclude that
$$
\mathbb P(B) = \frac n{n+k} \left(1 - \left(1-\frac1M\right)^{n+k}\right),
$$
in agreement with the argument in the first part of this answer.
Since you are OK with not using the Lovász local lemma, we can apply the simple method used in this answer. There is nothing special about $2/3$; for any constant $C>0$, the probability that there exists a bin with $C\log n$ balls is $o(1)$.
Let $k\ge 1$ be any integer, let $M$ be the maximum number of balls in any bin, and let $Z$ be the number of balls in the first bin. Using the union bound,
$$ P(M\ge k)\le n\cdot P(Z\ge k). $$
In order for $Z\ge k$ to occur, there must exist a subset of $k$ balls which all fall in the first bin. There are $\binom nk$ subsets of $k$ balls, and the probability that $k$ particular balls land in the first bin is $n^{-k}$, so using the union bound again,
$$ P(Z\ge k) \le \binom{n}{k}\cdot \frac1{n^k}=\frac{1}{k!}(1+o(1))\qquad \text{as $n\to\infty$.}$$
Therefore,
$$ P(M\ge k)\le \frac{n}{k!}(1+o(1)). $$
Using Stirling's approximation, you can show that if $k$ grows like $C\log n$ for any constant $C>0$, then $n/k!\to 0$:
\begin{align}
\log(n/k!)
&=\log n-\log(k!) \\
&=\log n-k\log k+O(k) \\
&=\log n-(C\log n)\log(C\log n)+O(\log n) \\
&=-C\log n\cdot \log\log n+O(\log n).
\end{align}
The dominant term is $-C\log n \cdot \log \log n$, which approaches $-\infty$. Since $\log(n/k!)\to -\infty$, we get $n/k!\to 0$.
We conclude that $P(M\ge k)\to 0$, which is equivalent to $P(M\ge k)=o(1)$.
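For intuition about how fast $n/k!$ decays, here is a small numeric check of $\log(n/k!)$ with $k=\lceil C\log n\rceil$, using $C = 2/3$ (the constant from the question):

```python
from math import ceil, lgamma, log

C = 2 / 3                   # the constant from the question
vals = []
for n in (10**3, 10**6, 10**9, 10**12):
    k = ceil(C * log(n))
    log_ratio = log(n) - lgamma(k + 1)   # log(n / k!) via log-gamma
    vals.append(log_ratio)
    print(f"n={n:.0e}  k={k}  log(n/k!)={log_ratio:.2f}")
```

Note that the bound only becomes useful once $n$ is moderately large: at $n=10^3$ the crude estimate still exceeds $1$, but by $n=10^{12}$ it is already astronomically small.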