Hoeffding’s Inequality – Proof of a Corollary


I need to prove a corollary of Hoeffding's inequality, and since I'm not used to doing proofs, I really don't know where to begin.

Hoeffding's inequality:
Let $X_1,\ldots,X_n$ be independent real-valued random variables such that for each $i \in \{1,\ldots,n\}$ there exist $a_i \leq b_i$ with $X_i \in [a_i, b_i]$. Then for every $\varepsilon > 0$:
\begin{equation}
P\left(\sum_{i=1}^n X_i - E\left[\sum_{i=1}^n X_i\right] \geq \varepsilon\right) \leq e^{-2\varepsilon^2 / \sum_{i=1}^n (b_i - a_i)^2}
\end{equation}
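
To get some intuition for the statement, here is a small Monte Carlo sanity check (a sketch of my own using NumPy; the uniform distributions, interval endpoints, and parameters are arbitrary choices, not part of the theorem):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary choice: X_i uniform on [a_i, b_i] with intervals of varying width.
n = 50
a = np.zeros(n)
b = rng.uniform(0.5, 2.0, size=n)     # a_i = 0 <= b_i
eps = 5.0
trials = 200_000

# Each row is one realization of (X_1, ..., X_n).
X = rng.uniform(a, b, size=(trials, n))
S = X.sum(axis=1)
ES = ((a + b) / 2).sum()              # E[sum_i X_i] for uniform X_i

empirical = np.mean(S - ES >= eps)                    # empirical tail probability
bound = np.exp(-2 * eps**2 / ((b - a) ** 2).sum())    # Hoeffding's bound

print(f"empirical tail: {empirical:.5f}")  # comes out well below the bound
print(f"Hoeffding bound: {bound:.5f}")
```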

If we further assume that the $X_i$ take values in $[0,1]$ and share a common mean $\mu$ (as is the case when they are identically distributed), we obtain the following corollary.

Corollary: Let $X_1,\ldots,X_n$ be independent random variables such that $X_i \in [0,1]$ and $E[X_i] = \mu$ for all $i$. Then for every $\varepsilon > 0$:

\begin{equation}
P\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu \geq \varepsilon\right) \leq e^{-2n\varepsilon^2}
\end{equation}
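
To get a feel for the bound, here is a quick Monte Carlo check I wrote (a minimal sketch using NumPy; the Bernoulli($\mu$) distribution, seed, and parameters are arbitrary choices of mine, not part of the corollary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary parameters: Bernoulli(mu) variables, so X_i in [0, 1] and E[X_i] = mu.
n, mu, eps = 100, 0.5, 0.1
trials = 100_000

samples = rng.binomial(1, mu, size=(trials, n))
deviations = samples.mean(axis=1) - mu        # (1/n) * sum_i X_i - mu

empirical = np.mean(deviations >= eps)        # empirical tail probability
bound = np.exp(-2 * n * eps**2)               # corollary's bound e^{-2 n eps^2}

print(f"empirical tail:  {empirical:.4f}")    # lands well below the bound
print(f"corollary bound: {bound:.4f}")        # exp(-2) ≈ 0.1353
```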

So I need to prove that the corollary follows from Hoeffding's inequality. Could anyone please point me to a reference for the proof, or prove it here?

Best Answer

With $$ \begin{align} & \mathbb{E}\left(\sum_{i=1}^nX_i\right) = n \cdot \mu, \\ & a_i=0, \\ & b_i=1, \\ & \tilde{\varepsilon} \mathrel{:=} n\cdot\varepsilon, \end{align} $$ applying Hoeffding's inequality with $\tilde{\varepsilon}$ in place of $\varepsilon$ gives $$ \mathbb{P}\left(\sum_{i=1}^nX_i-\mathbb{E}\left(\sum_{i=1}^nX_i\right)\geq \tilde{\varepsilon}\right) \leq \exp\left(-\frac{2n^2\varepsilon^2}{\sum_{i=1}^n1^2}\right) = \exp\left(-2n\varepsilon^2\right), $$ and dividing by $n$ inside the probability shows $$ \mathbb{P}\left(\sum_{i=1}^nX_i-\mathbb{E}\left(\sum_{i=1}^nX_i\right)\geq \tilde{\varepsilon}\right) = \mathbb{P}\left(\frac{1}{n} \cdot \sum_{i=1}^nX_i - \mu \geq \varepsilon\right). $$ Combining the two displays yields the corollary.
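
As a concrete instance (numbers chosen purely for illustration): with $n = 100$ and $\varepsilon = 0.1$, the corollary gives $$ \mathbb{P}\left(\frac{1}{100}\sum_{i=1}^{100} X_i - \mu \geq 0.1\right) \leq e^{-2 \cdot 100 \cdot 0.1^2} = e^{-2} \approx 0.135. $$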
