Minimum of Pairwise-Independent Variables

combinatorics, independence, probability, probability-theory

Let $X_{1},\dots, X_{n}$ be uniform on the unit interval $[0,1]$ and pairwise independent, i.e. $X_{i}$ and $X_{j}$ are independent for all $1\leq i<j\leq n$. Show that $$\Omega\left(\frac{1}{n}\right)\leq \mathbb{E}[\min(X_{1},\dots,X_{n})] \leq O\left(\frac{\log{n}}{n}\right).$$

For the lower bound, apply Markov's inequality to the nonnegative variable $\min(X_{1},\dots,X_{n})$:
$$\Pr(\min(X_{1},\dots,X_{n})\geq a) \leq \frac{\mathbb{E}[\min(X_{1},\dots,X_{n})]}{a}.$$
By the union bound, since $\Pr(X_{i}\leq a)=a$,
$$\Pr(\min(X_{1},\dots,X_{n})\geq a) = \Pr(X_{1}\geq a,\dots,X_{n}\geq a)\geq 1-\sum_{i}\Pr(X_{i}\leq a)=1-n\cdot a.$$
Plugging this back in and optimizing over $a$ (the maximum is attained at $a=\frac{1}{2n}$),
$$\mathbb{E}[\min(X_{1},\dots,X_{n})]\geq \max_{0\leq a\leq 1}{a(1-na)}=\frac{1}{4n}.$$ For i.i.d. uniform variables $\mathbb{E}[\min(X_{1},\dots,X_{n})]=\frac{1}{n+1}$, so the $\Omega(1/n)$ lower bound has the right order.
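
As a quick numerical sanity check of the i.i.d. case (a sketch of my own, not part of the argument; the values of `n` and `trials` are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

n, trials = 100, 200_000
# i.i.d. uniforms on [0,1]: E[min] = 1/(n+1), matching the Omega(1/n) order
mins = rng.random((trials, n)).min(axis=1)
print(mins.mean())   # ~ 1/(n+1) = 0.00990...
print(1 / (4 * n))   # Markov + union-bound guarantee: 1/(4n) = 0.0025
```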
I don't know how to prove the upper bound.

Best Answer

I can give an upper bound of $O\left(\frac{\log n}{n}\right)$.

For $t\in(0,1)$ let $N_t$ be the number of samples among $X_1,\dots,X_n$ that are at most $t$. Then $\mathbb{E}N_t=nt$, and pairwise independence implies that $\operatorname{Var}N_t=nt(1-t)$, since the variances of the indicators $\mathbf{1}[X_i\leq t]$ simply add. Hence $\mathbb{E}[N_t^2]=\operatorname{Var}N_t+(\mathbb{E}N_t)^2=nt(1-t)+n^2t^2$, and the second moment method gives $$\mathbb{P}(\min X_i\leq t)=\mathbb{P}(N_t>0)\geq\frac{\mathbb{E}[N_t]^2}{\mathbb{E}[N_t^2]}=\frac{nt}{nt+(1-t)}.$$ Therefore $\mathbb{P}(\min X_i>t)\leq 1-\frac{nt}{nt+1-t}=\frac{1-t}{(n-1)t+1}$, and since $\min X_i\in[0,1]$, $$\mathbb{E}[\min X_i]=\int_0^1\mathbb{P}(\min X_i>t)\mathop{}\!\mathrm{d}t\leq\int_0^1\frac{1-t}{(n-1)t+1}\mathop{}\!\mathrm{d}t=\frac{n\log n}{(n-1)^2}-\frac{1}{n-1},$$ establishing the upper bound.
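
To see the bound in action beyond the i.i.d. case, one can simulate a standard pairwise-independent family: if $U_1,\dots,U_m$ are i.i.d. uniform on $[0,1]$, the $\binom{m}{2}$ sums $(U_i+U_j)\bmod 1$ over pairs $i<j$ are each uniform on $[0,1]$ and pairwise (though not mutually) independent. A minimal sketch assuming that construction (it is not part of the answer, and all names below are mine):

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

def pairwise_independent_uniforms(m, rng):
    """(U_i + U_j) mod 1 over pairs i < j: each is Uniform[0,1] and any
    two are independent, but the family is not mutually independent."""
    u = rng.random(m)
    return np.array([(u[i] + u[j]) % 1.0
                     for i, j in itertools.combinations(range(m), 2)])

m = 15                      # yields n = m*(m-1)/2 = 105 variables
n = m * (m - 1) // 2
trials = 20_000
mins = [pairwise_independent_uniforms(m, rng).min() for _ in range(trials)]

# closed-form upper bound from the answer: n log n/(n-1)^2 - 1/(n-1)
upper = n * np.log(n) / (n - 1) ** 2 - 1 / (n - 1)
print(np.mean(mins), "<=", upper)
```

The empirical mean of the minimum should stay below `upper` (up to sampling error), as the argument above guarantees for any pairwise-independent family.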
