Concentration bounds for dependent Bernoulli random variables

concentration-of-measure, probability, stochastic-processes

Let $X_0,X_1,\dots,X_T$ be i.i.d. $\mathcal{N}(0,1)$ random variables.
Define $Y_t=\mathbb{1}\{ X_t > \max_{i=0,\dots,t-1}X_i \}$, so that $\mathbb{E}[Y_t]=\Pr(X_t > \max_{i=0,\dots,t-1}X_i)$.

What I want is a Chernoff-type bound, i.e.,
$$\Pr\left(\sum_{t=1}^{T}Y_t \geq (1+\delta)\sum_{t=1}^{T}\mathbb{E}[Y_t]\right) \leq \text{``a small value''}.$$

The method from this related question gives the explicit form of $\sum_{t=1}^{T}\mathbb{E}[Y_t]$.

However, the $Y_t$'s are not independent of each other, and a martingale inequality does not seem to work here. Specifically, this related question may be useful, but the bound there is stated in terms of conditional expectations.

If a Chernoff-type upper bound is difficult to prove, is there a weaker bound?

I have run many experiments with $T \in [10^3,10^8]$, and in every run $\sum_{t=1}^{T}Y_t \leq 2\sum_{t=1}^{T}\mathbb{E}[Y_t]$ held. The results are the same when the $X_i$'s are drawn from a uniform distribution.
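For reference, here is a minimal sketch of such an experiment, assuming NumPy (the helper `count_records` and the choice $T=10^6$ are illustrative, not from the original post):

```python
import numpy as np

def count_records(T, rng, dist="normal"):
    """Count S = sum_t Y_t: how many of X_1..X_T exceed all earlier X_i."""
    x = rng.standard_normal(T + 1) if dist == "normal" else rng.random(T + 1)
    running_max = np.maximum.accumulate(x)   # running_max[t] = max(X_0..X_t)
    y = x[1:] > running_max[:-1]             # Y_t = 1{X_t > max(X_0..X_{t-1})}
    return int(y.sum())

rng = np.random.default_rng(0)
T = 10**6                                    # kept well below 10^8 for speed
s = count_records(T, rng)
expected = float(np.sum(1.0 / np.arange(2, T + 2)))  # sum_{t=1}^T 1/(t+1)
print(s, expected, s <= 2 * expected)
```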

Best Answer

By symmetry, each of $X_0,X_1,\dots,X_t$ is equally likely to be the largest (and there are no ties, since the $X$'s are continuous random variables), so $\mathbb{P}(Y_t=1)=\frac1{t+1}$. Hence, if $S=\sum_{t=1}^{T}Y_t$, then $\mathbb{E}[S]=\frac12+\frac13+\dots+\frac1{T+1}=\ln(T)+\gamma-1+o(1)$.

Also, the $Y_t$'s are in fact independent: for $j>i$, the event that $X_j$ is the largest among $X_0,X_1,\dots,X_j$ carries no information about whether $X_i$ is the largest among $X_0,X_1,\dots,X_i$, because conditional on that event the relative order of $X_0,X_1,\dots,X_{j-1}$ is still uniform; in particular, the second largest among $X_0,X_1,\dots,X_j$ is equally likely to be any of $X_0,X_1,\dots,X_{j-1}$.
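A quick Monte Carlo sanity check of this independence claim, as a sketch (NumPy assumed; the indices $i=3$, $j=7$ and the trial count are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
i, j, n_trials = 3, 7, 200_000                 # arbitrary indices with j > i
x = rng.standard_normal((n_trials, j + 1))
y_i = x[:, i] > x[:, :i].max(axis=1)           # Y_i: X_i is a record among X_0..X_i
y_j = x[:, j] > x[:, :j].max(axis=1)           # Y_j: X_j is a record among X_0..X_j
# Independence predicts P(Y_i = 1, Y_j = 1) = 1/((i+1)(j+1)) = 1/32
print((y_i & y_j).mean(), 1 / ((i + 1) * (j + 1)))
```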

Hence $\operatorname{Var}(S)=\sum_{k=2}^{T+1} \frac1k\left(1-\frac1k\right)=\mathbb{E}[S]+O(1)$, since by independence the variances of the Bernoulli terms add.

Now you can use Chebyshev's inequality.
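Concretely, since $\operatorname{Var}(S)\le\mathbb{E}[S]$ (each summand $\frac1k(1-\frac1k)$ is at most $\frac1k$), Chebyshev gives
$$\Pr\big(S \geq (1+\delta)\,\mathbb{E}[S]\big) \leq \frac{\operatorname{Var}(S)}{\delta^2\,\mathbb{E}[S]^2} \leq \frac{1}{\delta^2\,\mathbb{E}[S]} = O\!\left(\frac{1}{\delta^2 \ln T}\right),$$
which for $\delta=1$ is consistent with the experiments above. (And since the $Y_t$'s turn out to be independent, a standard multiplicative Chernoff bound for sums of independent Bernoulli variables applies directly and yields a bound exponentially small in $\mathbb{E}[S]$.)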
