Derivation of alternative form of Chernoff bound

Tags: probability, random-variables, upper-lower-bounds

I have encountered an alternative form for the Chernoff bound for the sum of $n$ coins which I have not been able to derive.
Specifically, let $X_1,…,X_n$ be independent Poisson trials, let $X = \sum_{i=1}^{n} X_i$ and let $\mu = \mathbb{E}(X)$. Then
$$
\forall t > 0.\quad \mathbb{P}(X \geq \mu + t) \leq \exp\left(-\frac{2t^2}{n}\right)
$$

I am familiar with how to arrive at the more common variant of this Chernoff bound, that is
$$
\forall \delta > 0. \mathbb{P}(X \geq (1+\delta)\mu) \leq \left(\frac{e^\delta}{(1+\delta)^{(1+\delta)}}\right)^\mu,
$$

but I have not been able to derive the former from it. Any help would be greatly appreciated.

EDIT: parsiad's answer makes use of Hoeffding's inequality, which the lecture notes I was reading introduce only well after this variant of the bound. So if there is a way to derive this bound without using Hoeffding's inequality/lemma, I would be grateful to see it.

UPDATE:
It is possible to show that the bound given by the formula I was trying to derive is actually tighter than the other bound, so the former cannot follow from the latter. Hence I don't think it is possible to derive it without Hoeffding's inequality.
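As a quick illustration of the tightness claim, the two bounds can be compared numerically. The sketch below uses $n$ fair coins (so $\mu = n/2$) and sets $t = \delta\mu$ so both bounds refer to the same tail event; the parameter values are illustrative choices, not taken from the notes.

```python
import math

# Compare the two upper bounds on P(X >= mu + t) for n fair coins.
# Here mu = n/2 and t = delta * mu, so both bounds cover the same event.
n = 100
mu = n / 2

for delta in (0.1, 0.2, 0.4):
    t = delta * mu
    hoeffding_form = math.exp(-2 * t**2 / n)  # exp(-2 t^2 / n)
    multiplicative = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu
    print(f"delta={delta}: exp(-2t^2/n) = {hoeffding_form:.4g}, "
          f"multiplicative Chernoff = {multiplicative:.4g}")
```

For these parameters the Hoeffding-form bound is the smaller of the two at every $\delta$ tried, consistent with the claim above (at least in this $p = 1/2$ regime).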

Best Answer

For each $i\in\{1,\dots,n\}$, let $X_{i}$ be a random variable bounded between $a_{i}$ and $b_{i}$. Let $X\equiv X_{1}+\cdots+X_{n}$ and $\mu \equiv \mathbb{E}X$. Hoeffding's inequality states that $$ \mathbb{P}(X \geq \mu + t)\leq\exp\left(-\frac{2t^{2}}{\sum_{i=1}^{n}\left(b_{i}-a_{i}\right)^{2}}\right). $$ In your case, $a_{i}=0$ and $b_{i}=1$, so $\sum_{i=1}^{n}(b_{i}-a_{i})^{2}=n$ and the right hand side above becomes $\exp(-2t^{2}/n)$, as desired.
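The specialization to Poisson trials can be sanity-checked by simulation. This is a minimal Monte Carlo sketch, assuming heterogeneous success probabilities $p_i$ drawn at random (an illustrative choice, not part of the answer); it checks that the empirical tail probability stays below $\exp(-2t^2/n)$.

```python
import math
import random

# Monte Carlo check of P(X >= mu + t) <= exp(-2 t^2 / n) for a sum of
# independent Bernoulli(p_i) trials with heterogeneous p_i.
random.seed(0)
n, trials = 50, 20000
ps = [random.random() for _ in range(n)]  # illustrative success probabilities
mu = sum(ps)                              # E[X] = sum of the p_i

t = 5.0
hits = sum(
    sum(random.random() < p for p in ps) >= mu + t
    for _ in range(trials)
)
empirical = hits / trials
bound = math.exp(-2 * t**2 / n)
print(f"empirical tail {empirical:.4f} <= bound {bound:.4f}")
```

The bound here is $\exp(-2\cdot 25/50) = e^{-1} \approx 0.368$, and the simulated frequency of the event $X \geq \mu + t$ should come in well below it.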

A proof of Hoeffding's inequality is available on the Wikipedia page. There is also a good one in the Appendix of Chapter 4 of All of Statistics by Wasserman.