Yes, it is possible to translate this bound to your particular problem:
The expected value of your binomial random variables is $\mu=\frac{n}{2}$, not $0$, and their standard deviation is $\sigma=\sqrt{\frac{n}{4}}$, so the corresponding upper bound for the maximum is $$ \mathbb{E}[Z] \leq \frac{n}{2}+ \sqrt{\frac{n}{2} \log_e n} .$$ This uses the Gaussian approximation to the binomial, which becomes accurate faster than the maximum approaches the upper bound.
You cannot get rid of the $\sqrt{\log n}$ term. As an illustration of how good this bound is, if $n=10^6$ then the upper bound gives about $502628.3$, while the true expectation of the maximum is about $502431.4$, which is roughly $\frac{n}{2}+ \sqrt{\frac{n}{2.337} \log_e n}$.
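To see this bound in action at a smaller scale, one can compute $\mathbb E[\max]$ of $n$ i.i.d. $B(n,\frac 12)$ variables exactly from $\mathbb E[\max]=\sum_{t=0}^{n-1}\mathbb P[\max>t]$ with $\mathbb P[\max\le t]=F(t)^n$. This is only a sketch; the value $n=2000$ below is an illustrative choice, not from the post.

```python
import math

# Exact E[max of k i.i.d. Binomial(n, 1/2)] via
# E[max] = sum_{t=0}^{n-1} P[max > t], where P[max <= t] = F(t)^k
# and F is the Binomial(n, 1/2) CDF.
def expected_max_binomial(n, k):
    log_half_n = n * math.log(0.5)  # log of 2^{-n}
    F = 0.0                          # running CDF value
    e_max = 0.0
    for t in range(n):
        # log-pmf via lgamma to avoid huge factorials
        log_pmf = (math.lgamma(n + 1) - math.lgamma(t + 1)
                   - math.lgamma(n - t + 1) + log_half_n)
        F = min(1.0, F + math.exp(log_pmf))
        e_max += 1.0 - F ** k        # P[max > t]
    return e_max

n = 2000                             # illustrative, smaller than 10^6
bound = n / 2 + math.sqrt((n / 2) * math.log(n))
e_max = expected_max_binomial(n, n)  # k = n variables, as in the text
print(e_max, bound)
```

The exact value lands between $\frac n2$ and the $\frac n2+\sqrt{\frac n2\log_e n}$ bound, with the same modest gap seen in the $n=10^6$ example.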
Observe that, since the $X_i$ are i.i.d.:
$$
\mathbb P[\max_i X_i \leq (1-\epsilon) n]=\mathbb P[X_1 \leq (1-\epsilon) n]^k=\big(1-
\mathbb P[X_1 > (1-\epsilon) n]\big)^k\leq \exp\big(-k\,\mathbb P[X_1 > (1-\epsilon) n]\big).
$$
But then, writing $m=\lceil \epsilon n \rceil-1$ (assume $n$ is large enough that $m\geq 1$) and using the symmetry of $B(n,\frac 12)$:
$$
\mathbb P[X_1 > (1-\epsilon) n]=2^{-n}\sum_{i=\lfloor (1-\epsilon) n \rfloor+1}^n\binom{n}{i}=2^{-n}\sum_{i=0}^{m}\binom{n}{i}\geq 2^{-n}\sum_{i=0}^{m}\Big(\frac {n}{m}\Big)^i,
$$
where the last inequality follows from $\binom{n}{i}\geq (\frac ni)^i$ together with $i\leq m$; note the $2^{-n}$ normalization of the $B(n,\frac 12)$ distribution. Define:
$$
f(n,\epsilon)=2^{-n}\sum_{i=0}^{m}\Big(\frac {n}{m}\Big)^i.
$$
Since $\mathbb P[X_1 > (1-\epsilon) n]\geq f(n,\epsilon)$, it is enough to find $k$ such that:
$$
\exp(-kf(n,\epsilon))\leq \frac 1{\sqrt n},
$$
which is:
$$
k\geq \frac{\log n}{2f(n,\epsilon)}.
$$
Keeping only the $i=m$ term of the sum and using $\epsilon n-1\leq m\leq \epsilon n$ gives the lower bound:
$$
f(n,\epsilon)\geq 2^{-n}\Big(\frac {n}{m}\Big)^{m}\geq 2^{-n}\Big(\frac 1{\epsilon}\Big)^{\epsilon n-1}=\epsilon\, c^n \quad\text{for } c=\frac{(1/\epsilon)^{\epsilon}}{2}.
$$
Therefore if $k$ is bigger than $\displaystyle\frac{\log n}{2\epsilon c^n}$, then the inequality is satisfied. Since $\epsilon \log_e\frac 1\epsilon\leq \frac 1e<\log_e 2$ for all $\epsilon\in(0,1)$, we have $c<1$, so this sufficient value of $k$ grows exponentially in $n$.
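The key step above, $\mathbb P[\max_i X_i\leq t]=(1-\mathbb P[X_1>t])^k\leq \exp(-k\,\mathbb P[X_1>t])$, can be checked numerically with exact binomial tails. The parameter values below are illustrative choices, not from the post.

```python
import math

# Exact tail of X ~ Binomial(n, 1/2) and the resulting bound on the maximum
# of k i.i.d. copies. Illustrative parameters (not from the original post).
n, k, eps = 200, 1000, 0.4
t = math.floor((1 - eps) * n)          # threshold (1 - eps) * n

# P[X_1 > t], computed exactly with integer binomial coefficients
p_tail = sum(math.comb(n, i) for i in range(t + 1, n + 1)) / 2 ** n

p_max_le = (1 - p_tail) ** k           # P[max_i X_i <= t] for i.i.d. X_i
bound = math.exp(-k * p_tail)          # the exp(-k * P[X_1 > t]) bound

print(p_tail, p_max_le, bound)
```

The inequality here is just $1-x\leq e^{-x}$ raised to the $k$-th power, so the exact probability always sits below the bound.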
Let's explore another approach to the problem.
The problem involves bounding the sum of the first $m=n(1-\epsilon)$ binomial coefficients. There are nice bounds when $m\leq \frac n2$, i.e., when $\epsilon\geq\frac 12$; the condition $m\leq \frac n2$ appears in most of the existing bounds in the literature.
Intuitively, by the law of large numbers, $\frac{X_i}{n}\to\frac 12$ almost surely for each $i$, and hence (for fixed $k$) so does $\max_i\frac{X_i}{n}$. Therefore for $\epsilon<\frac 12$ the probability $\mathbb P[\max_i X_i>(1-\epsilon)n]$ goes to zero, which is the regime where $n\to\infty$ first.
Below I give one proof for the case $\epsilon>\frac 12$.
Suppose that $Y_1,\ldots,Y_n$ are i.i.d. Bernoulli random variables with $p=\frac 12$, taking values in $\{0,1\}$. Then the binomial random variable $B(n,\frac 12)$ can be written as $\sum _{i=1}^n Y_i$. Define $Z_i=Y_i-\frac 12$, a zero-mean random variable taking the values $\pm\frac 12$. We have:
$$
\mathbb P[X_1 \leq (1-\epsilon) n]=\mathbb P\Big[\sum_{i=1}^n Y_i\leq (1-\epsilon)n\Big]= \mathbb P\Big[\sum_{i=1}^n Z_i\leq \Big(\frac 12-\epsilon\Big)n\Big].
$$
Using Hoeffding's inequality, we get:
$$
\mathbb P\big[\sum_{i=1}^n Z_i\leq (\frac 12-\epsilon)n\big]\leq \exp(-2(\frac 12-\epsilon)^2n),
$$
which, since the $X_i$ are independent, shows:
$$
\mathbb P[\max_i X_i > (1-\epsilon) n]\geq 1-\exp(-2k(\frac 12-\epsilon)^2n).
$$
For $k\geq \frac 1{4(\frac 12-\epsilon)^2}\frac{\log n}{n}$, we have:
$$
\mathbb P[\max_i X_i > (1-\epsilon) n]\geq 1-\frac 1{\sqrt n}.
$$
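This final claim can be verified exactly for concrete parameters. The values $n=100$, $\epsilon=0.6$ below are illustrative choices (any $\epsilon>\frac 12$ works), with the binomial CDF computed exactly from integer binomial coefficients.

```python
import math

# Check: for eps > 1/2 and k >= log(n) / (4 (1/2 - eps)^2 n),
# P[max_i X_i > (1 - eps) n] >= 1 - 1/sqrt(n).  Illustrative parameters.
n, eps = 100, 0.6
k = math.ceil(math.log(n) / (4 * (0.5 - eps) ** 2 * n))

t = math.floor((1 - eps) * n)          # threshold (1 - eps) * n
# F = P[X_1 <= t] for X_1 ~ Binomial(n, 1/2), computed exactly
F = sum(math.comb(n, i) for i in range(t + 1)) / 2 ** n

p_max_gt = 1 - F ** k                  # P[max_i X_i > (1 - eps) n]
print(k, p_max_gt, 1 - 1 / math.sqrt(n))
```

Note how small the required $k$ is here: the Hoeffding tail decays exponentially in $n$, so $k$ of order $\frac{\log n}{n}$ (in practice a small constant) already suffices.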
Best Answer
$\newcommand\P{\mathbb{P}}\newcommand\E{\mathbb{E}}\newcommand\ol{\overline}$Write $\ol X_{\max} = \max\{\ol X_i\} = \max\{\tfrac{1}{n} X_i\}$. We can compute $$ \P(\ol X_{\max} > p + t) = \P(\ol X_i > p + t \text{ for some } i=1,\ldots,k) \leq k\P(\ol X_1 > p + t) \leq k\,e^{-2nt^2} $$ by the union bound and Hoeffding's inequality. Denoting $(x)_+ = \max\{x,0\}$, this implies $$ \E \ol X_{\max} \leq p + \E(\ol X_{\max} - p)_+ = p + \int_0^\infty \P(\ol X_{\max} > p + t)\,dt \leq p + k\sqrt{\frac{\pi}{8n}}. $$ In terms of the original variables $X_{\max} = n\ol X_{\max}$, we have $$ \E X_{\max} \leq np + k\sqrt{\frac{n\pi}{8}} \leq np + \tfrac{2}{3}k\sqrt{n}. $$ Note that this is much weaker in terms of $k$ than the asymptotically correct bound $$ \E X_{\max} \asymp np + \sqrt{2p(1-p)} \sqrt{n\log k}, $$ based on a normal approximation and the fact that if $Z_i\sim\mathcal{N}(0,1)$, then $\E Z_{\max} \asymp \sqrt{2\log k}$; but it does give the right dependence on $n$.
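The bound $\E X_{\max}\leq np+\tfrac 23 k\sqrt n$ can be checked by exact computation at small parameters; the values $n=50$, $k=5$, $p=0.4$ below are illustrative choices, not from the answer.

```python
import math

# Exact E[X_max] for k i.i.d. Binomial(n, p), via
# E[X_max] = sum_{t=0}^{n-1} P[X_max > t], with P[X_max <= t] = F(t)^k.
# Parameters are illustrative.
n, k, p = 50, 5, 0.4

pmf = [math.comb(n, t) * p ** t * (1 - p) ** (n - t) for t in range(n + 1)]
e_max, F = 0.0, 0.0
for t in range(n):
    F += pmf[t]                        # running CDF F(t)
    e_max += 1.0 - min(1.0, F) ** k    # P[X_max > t]

bound = n * p + (2 / 3) * k * math.sqrt(n)
print(e_max, bound)
```

As the answer notes, the linear-in-$k$ term makes this loose for large $k$, but the exact value sits comfortably between $np$ and the bound at these sizes.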