Using the Chernoff bound you suggested for some $s\le 1/(2\sigma^2)$ that will be specified later,
\[
P[X>t] \le \exp(-st) \exp\Big(-(N/2) \log(1-\sigma^4s^2) \Big) \le \exp(-st + \sigma^4s^2 N)
\]
where the second inequality holds thanks to $-\log(1-x)\le 2x$ for any $x\in(0,1/2)$. Now take $t=\epsilon \sigma^2 N$ and $s=t/(2\sigma^4N)$; the right-hand side becomes $\exp(-t^2/(4\sigma^4N))=\exp(-\epsilon^2 N/4)$, which yields
\[
P[X>\epsilon \sigma^2 N] \le \exp(-\epsilon^2 N/4)
\]
for any $\epsilon\in(0,1)$.
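As a numerical sanity check, here is a small Monte Carlo sketch of this bound, under the assumption (consistent with the moment generating function $(1-\sigma^4 s^2)^{-N/2}$ used above) that $X=\sum_{i=1}^N v_i w_i$ with $v_i, w_i$ i.i.d. $N(0,\sigma^2)$; the values of $N$, $\sigma$, and $\epsilon$ are illustrative choices, not from the text.

```python
# Monte Carlo check of P[X > eps*sigma^2*N] <= exp(-eps^2*N/4),
# assuming X = sum_i v_i*w_i with v_i, w_i i.i.d. N(0, sigma^2)
# (the interpretation consistent with the MGF used in the derivation).
import numpy as np

rng = np.random.default_rng(0)
N, sigma, eps, trials = 100, 1.0, 0.2, 100_000  # illustrative values

v = rng.normal(0.0, sigma, size=(trials, N))
w = rng.normal(0.0, sigma, size=(trials, N))
X = (v * w).sum(axis=1)  # one sample of X per trial

empirical = (X > eps * sigma**2 * N).mean()   # empirical tail probability
chernoff = np.exp(-eps**2 * N / 4)            # the Chernoff bound above
print(f"empirical tail: {empirical:.4f}, Chernoff bound: {chernoff:.4f}")
```

With these parameters the empirical tail sits well below the Chernoff bound, as expected, since Chernoff bounds are typically loose by a polynomial factor.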
Another avenue is to directly apply concentration inequalities such as the Hanson-Wright inequality, or concentration inequalities for Gaussian chaos of order 2, which encompass the random variable you are interested in.
Simpler approach without using the moment generating function
Take $\sigma=1$ for simplicity (otherwise, one may rescale by dividing by $\sigma^2$).
Write $v=(v_1,...,v_n)^T$ and $w=(w_1,...,w_n)^T$, with $n=N$ in the notation above. You are asking for upper bounds
on $P(v^Tw>\epsilon n)$.
Let $Z= w^T v/\|v\|$. Then $Z\sim N(0,1)$ by independence of $v,w$
and $\|v\|^2$ is independent of $Z$ and follows the $\chi^2$ distribution with $n$ degrees of freedom.
By standard bounds on standard normal and $\chi^2$ random variables,
$$P(|Z|>\epsilon\sqrt{n/2})\le 2\exp(-\epsilon^2 n/4),
\qquad\qquad
P(\|v\|>\sqrt{2n}) \le \exp(-n(\sqrt 2 -1)^2/2).
$$
Since $v^Tw = Z\|v\|$, on the event $\{|Z|\le\epsilon\sqrt{n/2}\}\cap\{\|v\|\le\sqrt{2n}\}$ we have $v^Tw\le\epsilon n$, so the union bound gives an upper bound on $P(v^Tw>\epsilon n)$
of the form $ 2\exp(-\epsilon^2 n/4) + \exp(-n(\sqrt 2 -1)^2/2)$.
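The resulting union bound can also be checked numerically; below is a minimal sketch with $\sigma=1$ and illustrative choices of $n$ and $\epsilon$ (not from the text).

```python
# Check that P(v^T w > eps*n) is dominated by the union bound
# 2*exp(-eps^2*n/4) + exp(-n*(sqrt(2)-1)^2/2), with sigma = 1.
# n and eps are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n, eps, trials = 100, 0.3, 100_000

v = rng.normal(size=(trials, n))
w = rng.normal(size=(trials, n))
inner = (v * w).sum(axis=1)  # v^T w, one sample per trial

empirical = (inner > eps * n).mean()
union_bound = 2 * np.exp(-eps**2 * n / 4) + np.exp(-n * (np.sqrt(2) - 1) ** 2 / 2)
print(f"empirical: {empirical:.2e}, union bound: {union_bound:.2e}")
```

The gap between the empirical tail and the union bound reflects the slack introduced by splitting the event across $Z$ and $\|v\|$ and by the crude $\sqrt{2n}$ truncation of $\|v\|$.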
Best Answer
A trivial example would be if $X_i$ is deterministic (say, always equal to 0). The right-hand side would then be the Dirac mass at 0 (as seen in the proof of Hoeffding's inequality).
There can't be any other example, as that would contradict the hypothesis that $\bar{X}$ is bounded, since
$$ 0 < C \exp\left( -\frac{2n^2 t^2}{\sum_{i=1}^n (b_i-a_i)^2}\right) \leq P( \bar{X} \geq E[\bar{X}] + t) \qquad \forall t \geq 0, $$
whereas boundedness forces this probability to vanish for all $t$ large enough.