
Suppose we throw $n$ balls uniformly and independently into $n$ bins. Then for every $\epsilon > 0$ we have

$$\mathbb{P}\bigg[\exists \text{ a bin with at least } 1+\biggl(\frac{2}{3}+\epsilon \biggr) \ln n \text{ balls} \bigg] = o(1)$$
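For intuition (this is not a proof), a quick Monte Carlo sketch suggests the probability indeed shrinks with $n$; the helper `empirical_overflow` and all parameters below are mine, just for illustration:

```python
import math
import random

def empirical_overflow(n, eps=0.1, trials=200):
    """Estimate P[some bin has >= 1 + (2/3 + eps) * ln(n) balls]
    when n balls are thrown uniformly at random into n bins."""
    threshold = 1 + (2 / 3 + eps) * math.log(n)
    hits = 0
    for _ in range(trials):
        bins = [0] * n
        for _ in range(n):
            bins[random.randrange(n)] += 1
        if max(bins) >= threshold:
            hits += 1
    return hits / trials

for n in [100, 1000, 10000]:
    print(n, empirical_overflow(n))  # the estimate shrinks as n grows
```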

In class we have just covered the Lovász Local Lemma (the asymmetric case, as Wikipedia calls it), but I do not see how to use it here. Could you please give me a hint?

EDIT: Using leonbloy's suggestion, I have come up with the following:

Let $X_i$ denote the number of balls in the $i$-th bin, and let $A_i$ be the event that $X_i \ge 1+\bigl(\frac{2}{3}+\epsilon \bigr) \ln n$, so we have

$$\mathbb{P}\bigg[\exists \text{ a bin with at least } 1+\biggl(\frac{2}{3}+\epsilon \biggr) \ln (n) \text{ balls} \bigg] =\mathbb{P} \bigg[ \bigcup_{i = 1}^n A_i \bigg] \le \sum_{i=1}^n \mathbb{P}[A_i] = n \cdot \mathbb{P}[A_1],$$

where the last equality holds by symmetry. Since $X_1 \sim \operatorname{Bin}(n,1/n)$, we have $\mathbb{E}[X_1] = 1$. However, I fail to use Chernoff's bound to estimate $\mathbb{P}[A_1]$:

$$\mathbb{P}\bigg[X_1 \ge 1 + \biggl(\frac{2}{3}+\epsilon \biggr) \ln (n) \bigg] \le \exp \Bigg(- \frac{\biggl(\frac{2}{3}+\epsilon \biggr)^2 \ln^2 (n) }{2\bigg(1 + \biggl(\frac{2}{9}+\frac{\epsilon}{3} \biggr) \ln (n)\bigg)}\Biggr)$$
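As a sanity check that I applied the bound correctly, the inequality does hold numerically; here is a log-space sketch with $\epsilon = 0.1$ hard-coded (the helper names are mine, just for illustration):

```python
import math

EPS = 0.1  # any fixed epsilon > 0; hard-coded for the check

def exact_tail(n):
    """Exact P[X_1 >= 1 + (2/3 + EPS) * ln(n)] for X_1 ~ Bin(n, 1/n),
    with each term computed in log space to avoid overflow."""
    p = 1 / n
    start = math.ceil(1 + (2 / 3 + EPS) * math.log(n))
    total = 0.0
    for j in range(start, n + 1):
        log_term = (math.lgamma(n + 1) - math.lgamma(j + 1) - math.lgamma(n - j + 1)
                    + j * math.log(p) + (n - j) * math.log(1 - p))
        total += math.exp(log_term)
    return total

def chernoff(n):
    """The right-hand side above, i.e. exp(-t^2 / (2 * (1 + t/3)))
    with t = (2/3 + EPS) * ln(n)."""
    t = (2 / 3 + EPS) * math.log(n)
    return math.exp(-t ** 2 / (2 * (1 + t / 3)))

for n in [10**2, 10**3, 10**4]:
    print(n, exact_tail(n), chernoff(n))  # exact tail <= Chernoff bound
```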

I do not see what to do with this bound. Could you please give me another hint?

3nondatur

1 Answer


Since you are OK with not using the Lovász lemma, we can apply the simple method used in this answer. There is nothing special about $2/3$; for any constant $C>0$, the probability that there exists a bin with at least $C\log n$ balls is $o(1)$.

Let $k\ge 1$ be any integer, let $M$ be the maximum number of balls in any bin, and let $Z$ be the number of balls in the first bin. Using the union bound,
$$ P(M\ge k)\le n\cdot P(Z\ge k). $$
In order for $Z\ge k$ to occur, there must exist a subset of $k$ balls which all fall in the first bin. There are $\binom nk$ subsets of $k$ balls, and the probability that $k$ particular balls land in the first bin is $n^{-k}$, so using the union bound again,
$$ P(Z\ge k) \le \binom{n}{k}\cdot \frac1{n^k}=\frac{1}{k!}(1+o(1))\qquad \text{as } n\to\infty. $$
Therefore,
$$ P(M\ge k)\le \frac{n}{k!}(1+o(1)). $$
Using Stirling's approximation, you can show that if $k$ grows like $C\log n$ for any constant $C>0$, then $n/k!\to 0$:
\begin{align} \log(n/k!) &=\log n-\log(k!) \\&=\log n-k\log k+O(k) \\&=\log n-(C\log n)\log(C\log n)+O(\log n) \\&=-C\log n\cdot \log\log n+O(\log n). \end{align}
The dominant term is $-C\log n \cdot \log \log n$, which approaches $-\infty$. Since $\log(n/k!)\to -\infty$, we get $n/k!\to 0$.
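To see the decay concretely, here is a small log-space evaluation of the union bound $n\binom{n}{k}n^{-k}$ (a sketch; the choice $k=\operatorname{round}(C\log n)$, the constant $C=2/3$, and the sample values of $n$ are just for illustration):

```python
import math

C = 2 / 3  # nothing special about 2/3; any constant C > 0 works

for n in [10**2, 10**6, 10**10, 10**14]:
    k = max(1, round(C * math.log(n)))
    # log of the bound n * binom(n, k) / n^k, via lgamma to avoid huge integers
    log_bound = (math.log(n)
                 + math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 - k * math.log(n))
    print(f"n = 1e{round(math.log10(n))}, k = {k}, log(bound) = {log_bound:.1f}")
```

The printed logarithm keeps decreasing, matching the $-C\log n\cdot\log\log n$ leading term; the decay is slow because $\log\log n$ grows very slowly.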

We conclude that $P(M\ge k)\to 0$, which is equivalent to $P(M\ge k)=o(1)$.

Mike Earnest
  • Thanks for your answer. However, I do not quite understand the last part. What do you mean by using Stirling's approximation? The version I know says $\log(n!) = n \log(n) - n + \Theta(\ln(n))$, so I do not quite see how you use it here. I also do not see why $P(M \ge k) \rightarrow 0$ should imply $P(M \ge k) = o(1)$. – 3nondatur Dec 04 '22 at 21:26
  • 1
  • @3nondatur Ok, I included the Stirling's approximation arithmetic. I used a weaker version, $\log(n!)=n\log n+O(n)$. As an exercise, you should prove that for any sequence $(a_k)_{k\ge 0}$, the condition $a_k\to 0$ is equivalent to $a_k=o(1)$. This is an important equivalence to be familiar with. – Mike Earnest Dec 04 '22 at 21:47
  • Thanks a lot, I got it now. – 3nondatur Dec 04 '22 at 23:14