Probability of a bin with at least $1+\biggl(\frac{2}{3}+\epsilon \biggr) \ln n$ balls

balls-in-bins, combinatorics, probability

Suppose we throw $n$ balls uniformly and independently into $n$ bins. Then for every $\epsilon > 0$,

$$\mathbb{P}\bigg[\exists \text{ a bin with at least } 1+\biggl(\frac{2}{3}+\epsilon \biggr) \ln n \text{ balls} \bigg] = o(1)$$
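As a quick empirical sketch of the statement (my addition; the choice $\epsilon = 0.1$ and the trial counts are arbitrary), one can simulate the experiment and watch the event become rarer as $n$ grows:

```python
import math
import random

def max_load(n, rng):
    """Throw n balls uniformly and independently into n bins; return the fullest bin's count."""
    counts = [0] * n
    for _ in range(n):
        counts[rng.randrange(n)] += 1
    return max(counts)

def estimate(n, eps=0.1, trials=300, seed=0):
    """Monte Carlo estimate of P[some bin holds at least 1 + (2/3 + eps) ln n balls]."""
    rng = random.Random(seed)
    threshold = 1 + (2 / 3 + eps) * math.log(n)
    hits = sum(max_load(n, rng) >= threshold for _ in range(trials))
    return hits / trials

for n in (100, 1000, 10000):
    print(n, estimate(n))
```

With these (illustrative) parameters the estimated probability drops visibly as $n$ increases, consistent with the claimed $o(1)$ behavior.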

In class we have just covered the Lovász Local Lemma (the asymmetric case, as Wikipedia calls it), but I do not see how to use it here. Could you please give me a hint?

EDIT: Using leonbloy's suggestion I have come up with the following:

Let $X_i$ be the number of balls in the $i$-th bin, so we have

$$\mathbb{P}\bigg[\exists \text{ a bin with at least } 1+\biggl(\frac{2}{3}+\epsilon \biggr) \ln n \text{ balls} \bigg] =\mathbb{P} \bigg[ \bigcup_{i = 1}^n \Big\{ X_i \ge 1+\Bigl(\tfrac{2}{3}+\epsilon \Bigr) \ln n \Big\} \bigg] \le \sum_{i=1}^n \mathbb{P}\Big[X_i \ge 1+\Bigl(\tfrac{2}{3}+\epsilon \Bigr) \ln n\Big] = n \cdot \mathbb{P}\Big[X_1 \ge 1+\Bigl(\tfrac{2}{3}+\epsilon \Bigr) \ln n\Big]$$
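The union bound step can be sanity-checked empirically (my own illustration; $n = 100$, $\epsilon = 0.1$, and the trial count are arbitrary). The left side below is the empirical frequency of "some bin over the threshold" and the right side is the empirical value of $n \cdot \mathbb{P}[X_1 \ge \text{threshold}]$, computed via the expected number of over-threshold bins:

```python
import math
import random

n, eps, trials = 100, 0.1, 2000
threshold = 1 + (2 / 3 + eps) * math.log(n)
rng = random.Random(1)

exists_hits = 0      # counts trials with at least one bin over the threshold
total_big_bins = 0   # sums the number of over-threshold bins across trials

for _ in range(trials):
    counts = [0] * n
    for _ in range(n):
        counts[rng.randrange(n)] += 1
    big = sum(c >= threshold for c in counts)
    exists_hits += big > 0
    total_big_bins += big

lhs = exists_hits / trials
# By symmetry of the bins, E[#bins over threshold] = n * P[X_1 over threshold].
rhs = total_big_bins / trials
print(lhs, "<=", rhs)
```

The inequality holds pointwise in every trial (an indicator is at most a count), so the empirical left side can never exceed the empirical right side.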

Since $X_1 \sim \operatorname{Bin}(n,1/n)$, we have $\mathbb{E}[X_1] = 1$. However, I get stuck when I apply Chernoff's bound to estimate this tail probability:

$$\mathbb{P}\bigg[X_1 \ge 1 + \biggl(\frac{2}{3}+\epsilon \biggr) \ln n \bigg] \le \exp \Bigg(- \frac{\biggl(\frac{2}{3}+\epsilon \biggr)^2 \ln^2 n }{2\biggl(1 + \Bigl(\frac{2}{9}+\frac{\epsilon}{3} \Bigr) \ln n\biggr)}\Bigg)$$

I do not see what to do with this term. Could you please give me another hint?
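As a numerical aside (my addition; $\epsilon = 0.1$ and the grid of $n$ values are arbitrary choices), one can at least tabulate $n$ times the right-hand side of the Chernoff estimate above to see how it behaves:

```python
import math

def chernoff_rhs_times_n(n, eps=0.1):
    """n * exp(-t^2 / (2 (1 + t/3))) with t = (2/3 + eps) ln n,
    i.e. n times the Chernoff tail estimate from the question."""
    t = (2 / 3 + eps) * math.log(n)
    return n * math.exp(-t * t / (2 * (1 + t / 3)))

values = [chernoff_rhs_times_n(10 ** k) for k in (3, 6, 9, 12)]
print(values)
```

With these parameters the products decrease toward $0$, consistent with the exponent being roughly $-(1+\tfrac{3\epsilon}{2})\ln n$ for large $n$, so that $n$ times the bound behaves like $n^{-3\epsilon/2}$.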

Best Answer

Since you are OK with not using the Lovász lemma, we can apply the simple method used in this answer. There is nothing special about $2/3$: for any constant $C > 0$, the probability that there exists a bin with at least $C\log n$ balls is $o(1)$.

Let $k\ge 1$ be any integer, let $M$ be the maximum number of balls in any bin, and let $Z$ be the number of balls in the first bin. Using the union bound,
$$ P(M\ge k)\le n\cdot P(Z\ge k). $$

In order for $Z\ge k$ to occur, there must exist a subset of $k$ balls which all fall in the first bin. There are $\binom nk$ subsets of $k$ balls, and the probability that $k$ particular balls land in the first bin is $n^{-k}$, so using the union bound again,
$$ P(Z\ge k) \le \binom{n}{k}\cdot \frac1{n^k}=\frac{1}{k!}(1+o(1))\qquad \text{as $n\to\infty$.}$$

Therefore,
$$ P(M\ge k)\le \frac{n}{k!}(1+o(1)). $$

Using Stirling's approximation, you can show that if $k$ grows like $C\log n$ for any constant $C>0$, then $n/k!\to 0$:
\begin{align} \log(n/k!) &=\log n-\log(k!) \\&=\log n-k\log k+O(k) \\&=\log n-(C\log n)\log(C\log n)+O(\log n) \\&=-C\log n\cdot \log\log n+O(\log n). \end{align}
The dominant term is $-C\log n \cdot \log \log n$, which approaches $-\infty$. Since $\log(n/k!)\to -\infty$, we get $n/k!\to 0$.
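The final claim is easy to verify numerically. In this sketch (my addition), I take $C = 2/3$ as in the question and $k = \lceil C \ln n \rceil$, and evaluate $n/k!$ for growing $n$:

```python
import math

def n_over_k_factorial(n, C=2 / 3):
    """n / k! for k = ceil(C ln n), computed via lgamma to avoid huge integers."""
    k = math.ceil(C * math.log(n))
    # lgamma(k + 1) = log(k!)
    return math.exp(math.log(n) - math.lgamma(k + 1))

values = [n_over_k_factorial(10 ** e) for e in (3, 6, 9, 12)]
print(values)
```

The sequence decreases rapidly toward $0$, matching the $-C \log n \cdot \log\log n$ leading term in the Stirling estimate above.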

We conclude that $P(M\ge k)\to 0$, which is equivalent to $P(M\ge k)=o(1)$.
