Clarification of probability bound in terms of $(\epsilon, \delta)$

epsilon-delta, probability, upper-lower-bounds

I am trying to understand the accepted answer to a probability-bounds exercise that was previously posted on MathOverflow. The question is:

What is a lower bound on the number of trials you have to perform in order to distinguish two biased coins: one with $p(H) = \frac{1}{2} + \epsilon$, and the other with $p(H) = \frac{1}{2} - \epsilon$?

After making use of some Chernoff bounds, the accepted answer states:

If we plug in and solve $\delta\leq P(A)\leq2\exp\left(-2n\epsilon^{2}\right)$, then $n\ge\frac{1}{\epsilon^{2}}\log\left(\frac{1}{\delta}\right)$. [1]

It is not clear to me how to actually get from that to the result $n\ge\frac{1}{\epsilon^{2}}\log\left(\frac{1}{\delta}\right)$. Are any assumptions being made but not stated? What are the exact steps to derive $n$ in terms of $(\epsilon, \delta)$?

Thank you!


[1] Henry.L (https://mathoverflow.net/users/25437/henry-l), Lower bound on number of samples for an epsilon delta approximation matching the Chernoff bound, URL (version: 2017-05-04): https://mathoverflow.net/q/269027

Best Answer

So first, it is OK to ignore the $1/2+\epsilon$ coin entirely and focus only on the $1/2-\epsilon$ coin: since the two biases have equal magnitude, we correctly identify both coins if and only if we correctly identify either one of them.

Now the idea is to assert the statement "the coin is biased in favor of tails" exactly when the event $B:=$ "in $n$ flips of the coin, fewer than $n/2$ flips were heads" occurs. Thus, you want $n$ such that, if the coin is actually biased in favor of tails, $B$ occurs with probability at least $1-\delta$. Symmetrically, if it is biased in favor of heads, you want $B^c$ to occur with probability at least $1-\delta$. (When $n$ is even and you get exactly as many heads as tails, you can assign an arbitrary conclusion to the estimator; asymptotically this tie has vanishingly small probability.)
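To make the decision rule concrete, here is a minimal simulation sketch in Python; the function names and parameter choices are mine, purely for illustration, not part of the original answer:

```python
import random

def majority_vote_guess(flips):
    """Declare "tails-biased" exactly when event B occurs:
    fewer than n/2 of the n flips came up heads.
    Ties (possible when n is even) are resolved arbitrarily
    as "heads-biased"."""
    return "tails-biased" if sum(flips) < len(flips) / 2 else "heads-biased"

def empirical_error(n, eps, trials=20_000):
    """Estimate P(B^c) when the coin is actually biased toward
    tails, i.e. p(H) = 1/2 - eps."""
    p_heads = 0.5 - eps
    errors = 0
    for _ in range(trials):
        flips = [random.random() < p_heads for _ in range(n)]
        if majority_vote_guess(flips) != "tails-biased":
            errors += 1
    return errors / trials

if __name__ == "__main__":
    # The error probability should fall roughly like exp(-2 eps^2 n).
    for n in (25, 100, 400):
        print(n, empirical_error(n, eps=0.1))
```

Running this for increasing $n$ should show the empirical error probability decaying roughly geometrically, which is what the bounds below quantify.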

Now because the biases are symmetric, it suffices to have $P(B) \geq 1-\delta$ when the coin is biased in favor of tails.

So finally, having set up the problem, we want a "reasonably" tight pair of bounds on $P(X \geq n/2)$, which is $P(B^c)$ in the notation above, when $X$ is $\mathrm{Binomial}(n, 1/2-\epsilon)$ distributed (i.e. when the coin is biased in favor of tails).

The Chernoff bound itself gives the upper bound $P(B^c) \leq e^{-2 \epsilon^2 n}$. A less standard bound, Eq. (4) of https://ece.uwaterloo.ca/~nmousavi/Papers/Chernoff-Tightness.pdf, gives the lower bound $P(B^c) \geq \frac{1}{4} e^{-\frac{2\epsilon^{2} n}{1/2-\epsilon}}$. (In the correspondence between the notations, $m=n$, $t=\epsilon n$, $p=1/2-\epsilon$, so indeed $t = \epsilon n \leq n(1-2p)=2\epsilon n$, as required by that statement.)
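As a numerical sanity check on the sandwich $\frac{1}{4} e^{-\frac{2\epsilon^{2} n}{1/2-\epsilon}} \leq P(B^c) \leq e^{-2\epsilon^{2}n}$, the following sketch compares the exact binomial tail with both bounds; the use of scipy.stats here is my own convenience, not something in the original answer:

```python
import math
from scipy.stats import binom

def tail_and_bounds(n, eps):
    """Exact P(X >= n/2) for X ~ Binomial(n, 1/2 - eps), next to
    the Chernoff upper bound exp(-2 eps^2 n) and the Eq. (4)
    lower bound (1/4) exp(-2 eps^2 n / (1/2 - eps))."""
    p = 0.5 - eps
    exact = binom.sf(math.ceil(n / 2) - 1, n, p)  # sf(k) = P(X > k)
    upper = math.exp(-2 * eps**2 * n)
    lower = 0.25 * math.exp(-2 * eps**2 * n / (0.5 - eps))
    return lower, exact, upper

for n in (50, 200, 800):
    lo, ex, up = tail_and_bounds(n, eps=0.1)
    print(f"n={n:4d}  lower={lo:.3e}  exact={ex:.3e}  upper={up:.3e}")
```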

Therefore, to make the correct assignment with probability at least $1-\delta$, it is sufficient to have $n>\frac{\log(\delta^{-1})}{2\epsilon^2}$ and necessary to have $n>\frac{(1/2-\epsilon) \log\left(\frac{1}{4\delta}\right)}{2\epsilon^2}$, which is "morally" a factor of two smaller, assuming $\epsilon \ll 1/2$ and $\delta \ll 1/4$.
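Since the question asked for the exact steps, note that each sample-size bound is just the corresponding tail bound solved for $n$ against the error budget $\delta$. Sufficiency solves the Chernoff upper bound,
$$e^{-2\epsilon^{2}n}\leq\delta \iff 2\epsilon^{2}n\geq\log\frac{1}{\delta} \iff n\geq\frac{\log(\delta^{-1})}{2\epsilon^{2}},$$
and necessity solves the lower bound,
$$\frac{1}{4}\,e^{-\frac{2\epsilon^{2}n}{1/2-\epsilon}}\leq\delta \iff \frac{2\epsilon^{2}n}{1/2-\epsilon}\geq\log\frac{1}{4\delta} \iff n\geq\frac{(1/2-\epsilon)\log\left(\frac{1}{4\delta}\right)}{2\epsilon^{2}}.$$
Up to constant factors, this is exactly the $n\ge\frac{1}{\epsilon^{2}}\log\left(\frac{1}{\delta}\right)$ quoted in the question.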
