Notice that since $y_{i} = \pm 1$, you can rewrite,
$$
\alpha_i = \frac{1}{y_i} \left[ \frac{1}{y_i} - \frac{1}{N} \sum_{n=1}^N \frac{1}{y_n}\right] = y_i \left[y_i - \frac{1}{N} \sum_{n=1}^N y_n\right] = 1 - y_{i}\frac{N^{+}-N^{-}}{N}
$$
where $N^{+}$ and $N^{-}$ are the number of samples in each of the two classes. You can check that $\sum_{n}\alpha_{n}y_{n} = 0$. Also $\alpha_{n} > 0$ for every $n$ (as long as both classes are present), that is, every training point is a support vector.
As for the margin,
$$
||w||^{2} = \sum_{n}\alpha_{n}^{2} = N\left[1-\left(\frac{N^{+}-N^{-}}{N}\right)^{2}\right]
$$
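These identities are easy to sanity-check numerically. The sketch below uses a made-up class split and takes the feature vectors to be the standard basis, since the step $||w||^2=\sum_n\alpha_n^2$ presumes the $x_n$ are orthonormal:

```python
import numpy as np

# Hypothetical example: N points whose feature vectors are the standard
# basis vectors, so <x_i, x_j> = delta_ij (orthonormal case).
N_pos, N_neg = 4, 6
N = N_pos + N_neg
y = np.array([+1] * N_pos + [-1] * N_neg, dtype=float)
X = np.eye(N)  # x_n = n-th standard basis vector

# alpha_n = 1 - y_n * (N+ - N-) / N
alpha = 1.0 - y * (N_pos - N_neg) / N

# Constraint sum_n alpha_n y_n = 0 holds exactly
assert abs(np.sum(alpha * y)) < 1e-12

# All multipliers are strictly positive: every point is a support vector
assert np.all(alpha > 0)

# ||w||^2 = sum_n alpha_n^2 = N * (1 - ((N+ - N-)/N)^2)
w = (alpha * y) @ X
lhs = w @ w
rhs = N * (1.0 - ((N_pos - N_neg) / N) ** 2)
assert abs(lhs - np.sum(alpha**2)) < 1e-12
assert abs(lhs - rhs) < 1e-12
```

With $N^{+}=4$, $N^{-}=6$ the multipliers come out as $1.2$ for the positive class and $0.8$ for the negative class, and both sides of the norm identity equal $9.6$.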
First, let's calculate the norm $||w||^2$.
$$||w||^2 = \sum_i \alpha_iy_i\big(\sum_j\alpha_jy_j\langle x_i,x_j\rangle\big)$$
which evidently can be rearranged to $\sum_i\sum_j\alpha_i\alpha_jy_iy_j\langle x_i,x_j\rangle$.
The $\langle x_i, x_j\rangle$ construct is present because it's assumed that the norm is defined in terms of the inner product (every inner product induces a norm via $||z||^2 = \langle z,z \rangle$), so when we calculate $||w||^2$ (making the desired substitution from above) we use $\langle x_i,x_j \rangle$. The reason we don't write something like
$$\sum_i \sum_j \langle \alpha_i y_i x_i, \alpha_j y_j x_j \rangle$$
is that the inner product acts on the $x$'s, and everything else is just a scalar multiplier, so, by bilinearity of the inner product, the scalars can be moved outside the $\langle \cdot,\cdot \rangle$; the two expressions are in fact equal.
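This rearrangement is easy to check numerically. The sketch below uses arbitrary made-up $\alpha$, $y$, and $x$ values, since the identity is purely algebraic and holds for any choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random instance: any alpha, y, x will do, since
# ||w||^2 = sum_i sum_j alpha_i alpha_j y_i y_j <x_i, x_j> is an identity.
N, d = 5, 3
alpha = rng.random(N)
y = rng.choice([-1.0, 1.0], size=N)
X = rng.standard_normal((N, d))

w = (alpha * y) @ X            # w = sum_i alpha_i y_i x_i
G = X @ X.T                    # Gram matrix G[i, j] = <x_i, x_j>
coef = alpha * y
double_sum = coef @ G @ coef   # sum_i sum_j alpha_i y_i alpha_j y_j <x_i, x_j>

assert abs(w @ w - double_sum) < 1e-10
```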
Now, substituting into $\sum_i\alpha_i[y_i(\langle w, x_i\rangle+b)-1]$ can be done in parts:
$$\sum_i\alpha_i[y_i(\langle w, x_i\rangle+b)-1] = \sum_i\alpha_iy_i\langle w, x_i\rangle + b\sum_i\alpha_iy_i - \sum_i\alpha_i$$
The last term on the r.h.s. is simply $-\sum_i\alpha_i$, and the middle term equals $0$, since the second constraint is $\sum_i\alpha_iy_i = 0$. Substituting in for the first term gives:
$$\sum_i\alpha_iy_i\langle w, x_i\rangle =\sum_i\alpha_iy_i\langle \sum_j\alpha_jy_jx_j, x_i\rangle = \sum_i\sum_j\alpha_iy_i\alpha_jy_j\langle x_i, x_j \rangle$$
where the step to the last term is by basic properties of inner products.
Having gotten this far, we need to remember to a) multiply $||w||^2$ by $1/2$, b) multiply the long second term by $-1$, and c) combine them:
$${1\over 2}\sum_i \sum_j \alpha_iy_i\alpha_jy_j\langle x_i,x_j\rangle - \sum_i \sum_j \alpha_iy_i\alpha_jy_j\langle x_i,x_j\rangle - 0 + \sum_i\alpha_i $$
which evidently reduces to the desired result
$$-{1\over 2}\sum_i \sum_j \alpha_iy_i\alpha_jy_j\langle x_i,x_j\rangle + \sum_i\alpha_i $$
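As a final sanity check, the whole reduction can be verified numerically on made-up data: enforce the constraint $\sum_i\alpha_iy_i=0$ and the stationarity condition $w=\sum_j\alpha_jy_jx_j$ by construction, then compare the primal Lagrangian against the dual expression:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical random instance; adjust alpha so the constraint
# sum_i alpha_i y_i = 0 holds, then check the substitution numerically.
N, d = 6, 4
y = rng.choice([-1.0, 1.0], size=N)
X = rng.standard_normal((N, d))
b = rng.standard_normal()

alpha = rng.random(N)
alpha -= y * (alpha @ y) / N          # enforce sum_i alpha_i y_i = 0 (uses y_i^2 = 1)
assert abs(alpha @ y) < 1e-12

w = (alpha * y) @ X                   # stationarity: w = sum_j alpha_j y_j x_j

# Primal Lagrangian L = (1/2)||w||^2 - sum_i alpha_i [y_i(<w, x_i> + b) - 1]
primal = 0.5 * (w @ w) - np.sum(alpha * (y * (X @ w + b) - 1.0))

# Dual objective: -(1/2) sum_ij alpha_i alpha_j y_i y_j <x_i, x_j> + sum_i alpha_i
coef = alpha * y
dual = -0.5 * (coef @ (X @ X.T) @ coef) + np.sum(alpha)

assert abs(primal - dual) < 1e-10
```

Note the projection step may make some $\alpha_i$ negative, which is fine here: the reduction above is pure algebra and does not use $\alpha_i \ge 0$.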
Best Answer
Tom Minka gives the derivation in his excellent paper "A comparison of numerical optimizers for logistic regression" (PDF), section 9.