I have a couple of proofs at http://www.se16.info/hgb/cheb.htm#OTProof and http://www.se16.info/hgb/cheb2.htm
One of these, loosely based on Probability and Random Processes by Grimmett and Stirzaker, would give a proof like this:
With $E[X]=0$, $\operatorname{Var}(X)=\sigma^2$ and $a>0$: for any $b\ge 0$, the event $X\ge a$ implies $(X+b)^2\ge(a+b)^2$, so by Markov's inequality $$P(X\ge a) = P(X+b \ge a+b) \le E\left[\dfrac{(X+b)^2}{(a+b)^2}\right] = \dfrac{\sigma^2+b^2}{(a+b)^2},$$ where the last equality uses $E[(X+b)^2]=E[X^2]+2bE[X]+b^2=\sigma^2+b^2$.
Treating $\dfrac{\sigma^2+b^2}{(a+b)^2}$ as a function of $b$, one finds that its minimum over $b\ge 0$ occurs at $b = \sigma^2 / a$,
so $$P(X\ge a) \le \dfrac{\sigma^2+(\sigma^2/a)^2}{(a+\sigma^2/a)^2} =\dfrac{\sigma^2(a^2+\sigma^2)}{(a^2+\sigma^2)^2} = \dfrac{\sigma^2}{ \sigma^2+a^2}.$$
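As a quick numerical sanity check of this optimisation (with arbitrary illustrative values for $a$ and $\sigma^2$; the names `bound` and `b_star` are just for this sketch), a grid search over $b$ should never beat the closed form $\sigma^2/(\sigma^2+a^2)$, which is attained at $b=\sigma^2/a$:

```python
def bound(a, b, sigma2):
    """The bound (sigma^2 + b^2) / (a + b)^2 from the Markov-style step."""
    return (sigma2 + b**2) / (a + b)**2

a, sigma2 = 2.0, 3.0                # arbitrary test values with a > 0
b_star = sigma2 / a                 # claimed minimiser, b = sigma^2 / a
closed_form = sigma2 / (sigma2 + a**2)

# grid search over b in [0, 100] confirms no b does better than b_star
grid_min = min(bound(a, k / 1000, sigma2) for k in range(0, 100001))

assert abs(bound(a, b_star, sigma2) - closed_form) < 1e-12
assert grid_min >= closed_form - 1e-9
print(closed_form)  # 3 / (3 + 4) ≈ 0.42857
```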
You're sort of on the right track, but your inequality in $1-P\left(|X-\frac{1}{6}n|\geq \sqrt{n}\right)\leq1-\frac{\operatorname{Var}{X}}{\epsilon^2}$ points the wrong way (it should be $1-P\left(|X-\frac{1}{6}n|\geq \sqrt{n}\right)\geq1-\frac{\operatorname{Var}{X}}{\epsilon^2}$), you haven't identified what $\epsilon$ is, and a square is missing from your integral for the variance. The latter should be $\int \left( X - \frac{1}{6} n \right)^2 dP$.
I presume the form of Chebyshev's inequality you're using is $P(|X-\frac{1}{6}n|\geq \epsilon)\leq\frac{\operatorname{Var}{X}}{\epsilon^2}$, in which case your $\epsilon$ is just $\sqrt{n}$, and your inequality becomes $P(|X-\frac{1}{6}n|\geq \sqrt{n})\leq\frac{\operatorname{Var}{X}}{n}$.
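This bound can be checked exactly, using the fact (noted later in the answer) that $X$ is binomial with parameters $n$ and $\frac16$, so $\operatorname{Var}(X)/n = 5/36$. The choice $n=36$ below is just an illustration:

```python
from fractions import Fraction
from math import comb, sqrt

# Exact check that P(|X - n/6| >= sqrt(n)) <= Var(X)/n = 5/36
# when X ~ Binomial(n, 1/6).
n, p = 36, Fraction(1, 6)

def pmf(j):
    """P(X = j) for the binomial(n, 1/6) distribution."""
    return comb(n, j) * p**j * (1 - p)**(n - j)

tail = sum(pmf(j) for j in range(n + 1) if abs(j - n * p) >= sqrt(n))
assert float(tail) <= 5 / 36   # Chebyshev's bound holds (comfortably)
print(float(tail))
```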
You could evaluate the integral for the variance by working out the distribution $F_X$ of $X$ (Hint: $F_X\left(j\right) = P\left(X=j\right)$ is the probability of getting $j$ sixes in $n$ independent throws of a fair die), but there's also a simpler way of calculating it.
If $X_i$ is the number of sixes you get on the $i^{\text{th}}$ throw, then $P\left( X_i = 1 \right) = \frac{1}{6}$, $P\left( X_i = 0 \right) = \frac{5}{6}$, and $X_1, X_2, \dots , X_n$ are independent, identically distributed random variables with $X = X_1 + X_2 + \dots + X_n$. Now there's a theorem which tells us that the variance of a sum of $n$ independent, identically distributed random variables is just $n$ times the common variance of the summands. That is, $\operatorname{Var}\left(X\right) = n \operatorname{Var}\left(X_1\right)$, so you can prove your result just by calculating the variance of the simple two-valued random variable $X_1$.
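The two-valued calculation is short enough to do in exact arithmetic. A sketch (the variable names are mine):

```python
from fractions import Fraction

# Variance of a single throw X_1 (values 0 and 1, P(X_1 = 1) = 1/6):
p = Fraction(1, 6)
e_x1 = p * 1 + (1 - p) * 0                               # E[X_1] = 1/6
var_x1 = p * (1 - e_x1)**2 + (1 - p) * (0 - e_x1)**2     # E[(X_1 - E[X_1])^2]
assert var_x1 == Fraction(5, 36)

# By the theorem, Var(X) = n * Var(X_1) = 5n/36, so the Chebyshev
# bound Var(X)/n equals 5/36 for every n.
n = 120                                                  # any example n works
var_x = n * var_x1
print(var_x / n)  # 5/36
```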
Elaboration of hint about $F_X$:
Since $X$ can only take on one of the values $0, 1, \dots , n$, the sample space (call it $\Omega$) can be partitioned into a union of the disjoint events $E_j = \left\{ \omega \in \Omega \mid X\left(\omega\right) = j \right\}$ for $j=0, 1, \dots , n$. The integral $\int \left( X - \frac{1}{6} n \right)^2 dP$ can then be written as $\int_{\bigcup_{j=0}^n E_j}\left( X - \frac{1}{6} n \right)^2 dP = \sum_{j=0}^n \int_{E_j}\left( X - \frac{1}{6} n \right)^2 dP$. Since $X$ has the fixed value $j$ everywhere in $E_j$, $\int_{E_j}\left( X - \frac{1}{6} n \right)^2 dP = \left( j- \frac{1}{6} n \right)^2 \int_{E_j} dP = \left( j- \frac{1}{6} n \right)^2 P\left(E_j\right) = \left( j- \frac{1}{6} n \right)^2 F_X\left(j\right)$. So $\operatorname{Var}\left(X\right) = \sum_{j=0}^n \left( j- \frac{1}{6} n \right)^2 F_X\left(j\right)$.
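Evaluating that sum directly, with the binomial expression for $F_X(j)$, reproduces the shortcut answer $n \cdot \frac{5}{36}$. A sketch in exact arithmetic (the small $n$ is just an example):

```python
from fractions import Fraction
from math import comb

# Direct evaluation of Var(X) = sum_j (j - n/6)^2 * F_X(j)
# for the binomial(n, 1/6) distribution, compared with n * 5/36.
n, p = 10, Fraction(1, 6)
mean = n * p

def pmf(j):
    """F_X(j) = P(X = j) = C(n, j) p^j (1-p)^(n-j)."""
    return comb(n, j) * p**j * (1 - p)**(n - j)

var_direct = sum((j - mean)**2 * pmf(j) for j in range(n + 1))
assert var_direct == n * Fraction(5, 36)
print(var_direct)  # 25/18
```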
As callculus noted in his answer, $X$ is binomially distributed with parameters $n$ and $\frac{1}{6}$, which gives you the expression for $F_X\left(j\right)$ as a function of $n$ and $j$. If you don't know this expression, you will find it (as well as the variance of the binomial distribution!) in any good text on elementary probability theory (such as Volume 1 of William Feller's classic, An Introduction to Probability Theory and Its Applications, 3rd edition, where you will find the material on pp. 147–148 and p. 230).
Best Answer
Well, we have $$\Pr(S \geq 380) \leq \Pr(S \geq 380) + \Pr(S \leq 320) = \Pr(|S - 350| \geq 30) \leq \operatorname{Var}(S) / 30^2 \approx 0.324$$ so this is probably the bound your book is looking for.
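A hedged check of those numbers, assuming (consistently with the mean of 350) that $S$ is the sum of 100 independent throws of a fair die, so that $\operatorname{Var}(S) = 100 \cdot \frac{35}{12}$:

```python
from fractions import Fraction

# Variance of one fair die throw D: E[D^2] - E[D]^2 = 91/6 - (7/2)^2 = 35/12.
var_one_throw = Fraction(91, 6) - Fraction(7, 2)**2
assert var_one_throw == Fraction(35, 12)

# Assuming S is the sum of 100 independent throws (mean 100 * 7/2 = 350):
var_s = 100 * var_one_throw          # 875/3 ≈ 291.67
chebyshev_bound = var_s / 30**2      # Var(S) / 30^2
print(float(chebyshev_bound))        # ≈ 0.324, matching the answer above
```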