In my edition of the book it reads
$$\lim_{n \to \infty} \sup_{\color{red}{u \leq u_0} } \left| \frac{Y(nu)}{n}-u \right|=0 \quad \text{a.s.}$$
So let's prove this. Fix $\varepsilon>0$. For any $n \in \mathbb{N}$ we have by Etemadi's inequality
$$p_n := \mathbb{P} \left( \sup_{u \leq u_0} \left| \frac{Y(nu)}{n}-u\right| > 3\varepsilon \right) \leq 3\sup_{u \leq u_0} \mathbb{P} \left( \left| \frac{Y(nu)}{n}-u\right|>\varepsilon \right). \tag{1}$$
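For reference, the version of Etemadi's inequality used here states that for independent random variables with partial sums $S_k$,
$$\mathbb{P} \left( \max_{1 \leq k \leq n} |S_k| \geq 3\alpha \right) \leq 3 \max_{1 \leq k \leq n} \mathbb{P} \left( |S_k| \geq \alpha \right), \qquad \alpha>0.$$
Applied to the independent increments of the Poisson process (together with a monotonicity argument to pass from a maximum over grid points to the supremum over $u \leq u_0$), this gives $(1)$.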
The idea is to show that $$\sum_{n \in \mathbb{N}} p_n<\infty; \tag{2}$$ the claim then follows from the Borel-Cantelli lemma. In order to prove $(2)$ we note that we can choose a constant $C>0$ such that for any $|\lambda| \leq 1$
$$\mathbb{E}e^{\lambda \tilde{Y}_t} \leq e^{Ct \lambda^2}, \tag{3}$$
where $\tilde{Y}_t :=Y_t-t$ denotes the compensated Poisson process; see the lemma below. The (exponential) Markov inequality and $(1)$ then show, for any $\lambda>0$,
$$\begin{align*} p_n &\leq 3\sup_{u \leq u_0} \mathbb{P} \bigg( Y(nu)-nu>\varepsilon n\bigg)+3\sup_{u \leq u_0} \mathbb{P} \bigg( -(Y(nu)-nu)>\varepsilon n \bigg) \\ &\leq 3 \sup_{u \leq u_0} \bigg[ e^{-\varepsilon n \lambda}\, \mathbb{E}e^{\lambda \tilde{Y}(nu)}+ e^{-\varepsilon n \lambda}\, \mathbb{E}e^{-\lambda \tilde{Y}(nu)} \bigg]. \end{align*}$$
If we choose $\lambda=\frac{1}{\sqrt{n}}$ (so that $|\lambda| \leq 1$) and apply $(3)$, using $nu\lambda^2 = u \leq u_0$, then we get
$$p_n \leq 6 \exp \left( C u_0-\varepsilon \sqrt{n} \right).$$
Obviously, this entails $(2)$.
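Indeed, the summability is quick to check: for every $\varepsilon>0$ we have $\varepsilon \sqrt{n} \geq 2 \log n$ for all sufficiently large $n$, hence $e^{-\varepsilon \sqrt{n}} \leq n^{-2}$ eventually, and so
$$\sum_{n \in \mathbb{N}} 6 \exp\left( Cu_0 - \varepsilon \sqrt{n} \right) = 6 e^{Cu_0} \sum_{n \in \mathbb{N}} e^{-\varepsilon \sqrt{n}} < \infty.$$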
Lemma Let $(Y_t)_{t \geq 0}$ be a Poisson process (with rate $1$) and $\tilde{Y}_t:=Y_t-t$ the compensated Poisson process. Then $(3)$ holds.
Proof: Since $Y_t \sim \text{Poi}(t)$, the exponential moments can be calculated explicitly: $$\mathbb{E}e^{\lambda Y_t} = e^{t \cdot (e^{\lambda}-1)}.$$ Hence, $$\mathbb{E}e^{\lambda \tilde{Y}_t} = e^{t \cdot (e^{\lambda}-1-\lambda)}.$$ For $\lambda \in [-1,1]$, we have $$|e^{\lambda}-1-\lambda| \leq C \cdot \lambda^2$$ and this proves $(3)$.
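A concrete constant can be read off from the power series: for $|\lambda| \leq 1$,
$$|e^{\lambda}-1-\lambda| = \left| \sum_{k \geq 2} \frac{\lambda^k}{k!} \right| \leq \lambda^2 \sum_{k \geq 2} \frac{|\lambda|^{k-2}}{k!} \leq \lambda^2 \sum_{k \geq 2} \frac{1}{k!} = (e-2)\,\lambda^2,$$
so $C = e-2$ works in $(3)$.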
Remark The claim holds for any Lévy process $(Y_t)_{t \geq 0}$ with finite exponential moments:
$$\lim_{n \to \infty} \sup_{u \leq u_0} \left| \frac{Y(nu)}{n}-\mathbb{E}Y_1 \cdot u \right|=0 \quad \text{a.s.}$$
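As a quick numerical illustration of the Poisson case (a sketch, not part of the proof; `sup_deviation` is an ad-hoc helper, and it evaluates the supremum only on a finite grid of $u$-values):

```python
import numpy as np

rng = np.random.default_rng(0)

def sup_deviation(n, u0=1.0, grid=1000):
    # Build Y(n*u) for u on a grid in [0, u0] from independent
    # Poisson increments with mean n*du (rate-1 Poisson process).
    du = u0 / grid
    increments = rng.poisson(n * du, size=grid)
    Y = np.concatenate(([0], np.cumsum(increments)))  # Y(n*u) at grid points
    u = np.linspace(0.0, u0, grid + 1)
    # sup_{u <= u0} |Y(nu)/n - u|, approximated on the grid
    return np.max(np.abs(Y / n - u))

# The deviation should shrink as n grows, in line with the a.s. limit above.
print(sup_deviation(100), sup_deviation(10000))
```

The grid approximation is harmless here because $Y$ is nondecreasing, so the error between grid points is of order $1/\text{grid}$.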
Here $X_1, X_2, \dots$ are independent, $S_n := X_1 + \dots + X_n$, and each $X_n$ takes the values $\pm(n+1)$ with probability $\frac{1}{2(n+1)\log(n+1)}$ each (and the value $0$ otherwise). To show convergence in probability, note that
$$ \mathbb E[X_n] = \frac{n+1}{2(n+1)\log(n+1)} - \frac{n+1}{2(n+1)\log(n+1)} = 0$$
and $$
\mathrm{Var}(X_n)=\mathbb E[X_n^2]
= \frac{2(n+1)^2}{2(n+1)\log(n+1)} = \frac{n+1}{\log(n+1)}.
$$
Hence $\mathbb E[S_n]=0$, and
$$\begin{align*}
\frac1{\varepsilon^2}\mathbb E\left[\left(\frac{S_n}n\right)^2\right] &= \frac1{n^2\varepsilon^2} \mathbb E[S_n^2]\\
&= \frac1{n^2\varepsilon^2} \mathrm{Var}\left(\sum_{i=1}^n X_i\right)\\
&= \frac1{n^2\varepsilon^2} \sum_{i=1}^n \mathrm{Var}(X_i)\\
&= \frac1{n^2\varepsilon^2} \sum_{i=1}^n \frac{i+1}{\log(i+1)}\\
&\leqslant \frac1{n^2\varepsilon^2}\left(\frac{n(n+1)}{\log(n+1)}\right)\\
&= \frac{n+1}{n\log(n+1)\varepsilon^2}\\
&= \frac1{\log(n+1)\varepsilon^2} + \frac1{n\log(n+1)\varepsilon^2}\stackrel{n\to\infty}{\longrightarrow}0.
\end{align*}$$
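Numerically one can watch the second moment and the closed-form bound $\frac{n+1}{n\log(n+1)}$ decay together (a quick check; the helper names are ad hoc, and the factor $\frac1{\varepsilon^2}$ is dropped):

```python
import math

def second_moment_over_n2(n):
    # (1/n^2) * sum_{i=1}^n Var(X_i), using Var(X_i) = (i+1)/log(i+1)
    return sum((i + 1) / math.log(i + 1) for i in range(1, n + 1)) / n**2

def closed_form_bound(n):
    # the bound (n+1)/(n*log(n+1)) from the estimate above
    return (n + 1) / (n * math.log(n + 1))

for n in (10, 1_000, 100_000):
    print(n, second_moment_over_n2(n), closed_form_bound(n))
```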
By Markov's inequality (applied to $(S_n/n)^2$),
$$ \mathbb P\left(\left|\frac{S_n}n\right| \geqslant\varepsilon \right)\leqslant \frac{\mathbb E\left[\left(\frac{S_n}n\right)^2 \right]}{\varepsilon^2}\stackrel{n\to\infty}{\longrightarrow}0,$$
so $S_n/n \to 0$ in probability.
To show that the convergence is not almost sure, note that for each $n$, as pointed out by @Frank,
$$ \{X_n=n+1\} \subset \left\{|S_n|\geqslant \frac n2\right\} \cup \left\{|S_{n-1}|\geqslant \frac n2\right\}.$$
Indeed, on $\{X_n=n+1\}$ we have $|S_n-S_{n-1}|=n+1>n$, so $|S_n|$ and $|S_{n-1}|$ cannot both be smaller than $\frac n2$.
Since $$\sum_{n=1}^\infty \mathbb P(X_n=n+1) = \sum_{n=1}^\infty\frac1{2(n+1)\log(n+1)}=+\infty,$$
the second Borel–Cantelli lemma (applicable because the $X_n$ are independent) gives $\mathbb P(X_n=n+1 \text{ infinitely often})=1$, and therefore $$\mathbb P\left(\limsup_{n\to\infty}\left\{\frac{|S_n|}n\geqslant \frac12\right\}\right)=1.$$
In particular, $$\mathbb P\left(\lim_{n\to\infty} \frac{S_n}n = 0\right)<1,$$ so $S_n/n$ does not converge to $0$ almost surely.
$\log x$ is a concave function, so Jensen's inequality shows that $\mathbb E[\log Y_k] \leq \log \mathbb E[Y_k]=0$.