In the theoretical analysis of an algorithm for stochastic optimization, we often need to prove that something like
$$\text{error} \leq \epsilon \quad \text{(under some conditions)}$$
holds with probability at least $1-\delta$. But I have found that many works use a specific number like $\delta = 1/4$ in their results, even when a more general conclusion for arbitrary $\delta \in (0, 1/2]$ has been established. I was wondering whether there is any particular meaning assigned to this specific number.
The meaning of "with probability at least $1-\delta$"
machine-learning, optimization, probability, statistics
Best Answer
As long as $1-\delta > 1/2$ can be achieved, we can run the algorithm independently several times and aggregate the outputs (for instance, take the median of the estimates, or a majority vote); by the law of large numbers, the probability that the aggregated result satisfies $\text{error} \le \epsilon$ can be made as close to $1$ as we like. So the exact value of $\delta$ is not very important, as long as $\delta < 1/2$. (Although sometimes it is interesting to optimize $\delta$ for a specific $\epsilon$.)
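To make the amplification step concrete, here is a minimal Python sketch of the standard median trick. The estimator `noisy_estimate`, its per-run success probability $3/4$ (i.e., $\delta = 1/4$), and the accuracy $\epsilon = 0.1$ are all hypothetical choices for illustration, not something from the question itself:

```python
import random
import statistics

def noisy_estimate(true_value=1.0, eps=0.1):
    """Hypothetical randomized estimator: with probability 3/4
    (i.e., delta = 1/4) it returns a value within eps of
    true_value; otherwise it returns a wild outlier."""
    if random.random() < 0.75:
        return true_value + random.uniform(-eps, eps)
    return true_value + random.uniform(-10.0, 10.0)  # failure case

def amplified_estimate(k=15):
    """Median trick: run the estimator k times independently and
    take the median. The median misses the truth by more than eps
    only if at least half of the runs fail, which happens with
    probability exponentially small in k (Chernoff bound)."""
    return statistics.median(noisy_estimate() for _ in range(k))

# Empirical check: the failure rate of the amplified estimator is
# far below the per-run failure probability delta = 1/4.
trials = 10_000
failures = sum(abs(amplified_estimate() - 1.0) > 0.1 for _ in range(trials))
print(f"empirical failure rate: {failures / trials:.4f}")
```

The median is the right aggregator here because it lands inside $[\text{truth} - \epsilon, \text{truth} + \epsilon]$ whenever a majority of the individual runs do, regardless of how wild the failing runs are.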
This is really universal in TCS. For example, in the definition of BPP, the numbers $1/3$ and $2/3$ can be replaced by $\epsilon$ and $1-\epsilon$ for any constant $\epsilon < 1/2$ without changing the class.
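For decision problems, the same amplification is just a majority vote, and its failure probability can be computed exactly from the binomial tail. A small sketch, with the per-run success probability $2/3$ taken from the BPP definition (the function name and the sample values of $k$ are my own choices for illustration):

```python
from math import comb

def majority_failure_prob(p: float, k: int) -> float:
    """Probability that a majority vote over k independent runs is
    wrong, when each run is independently correct with probability p.
    The vote fails iff at most floor(k/2) runs are correct (ties,
    possible for even k, are conservatively counted as failures)."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i)
               for i in range(k // 2 + 1))

# With the BPP constant p = 2/3, the failure probability of the
# majority vote decays exponentially in the number of runs k:
for k in (1, 11, 51, 101):
    print(f"k = {k:3d}: failure prob = {majority_failure_prob(2/3, k):.2e}")
```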