The difference comes from the sharper constant for the sub-Gaussian norm/parameter of bounded variables. You can still use Azuma's technique to get the sharper bound. Here are some notes I wrote a while ago:
Sub-Gaussian tail bounds
Recall that, for a zero-mean random variable $X$, the MGF bound $\mathbb{E} e^{\lambda X} \le e^{\sigma^2 \lambda^2/2}$ for all $\lambda \ge 0$ implies, via Chernoff's method, the tail bound
\begin{align*}
\mathbb{P}( X \ge t) \le \exp\Big( {- \frac{t^2}{2\sigma^2}}\Big)
\end{align*}
Such a random variable is called sub-Gaussian with parameter $\sigma$.
A bounded variable $X \in [a,b]$ (centered, i.e., the bound applied to $X - \mathbb{E}X$) has squared sub-Gaussian parameter $\sigma^2 = (b-a)^2/4$. This sharp bound is Hoeffding's lemma and requires some work to show (left as an exercise).
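As a quick sanity check of the constant (not a proof), here is a small Python sketch comparing the exact centered MGF of a Bernoulli variable on $\{0,1\}$ against $e^{\sigma^2 \lambda^2/2}$ with $\sigma^2 = 1/4$; the choice $p = 0.3$ and the $\lambda$ grid are arbitrary:

```python
import numpy as np

# Numerical sanity check (not a proof) of Hoeffding's lemma for a
# Bernoulli(p) variable on {0, 1}, so b - a = 1 and sigma^2 = 1/4.
# Compare the exact centered MGF against exp(sigma^2 * lambda^2 / 2).
p = 0.3
sigma2 = 1.0 / 4.0  # (b - a)^2 / 4 with a, b = 0, 1

for lam in np.linspace(-5, 5, 11):
    # Exact MGF of X - E[X]: X - p is 1-p w.p. p and -p w.p. 1-p.
    mgf = p * np.exp(lam * (1 - p)) + (1 - p) * np.exp(-lam * p)
    bound = np.exp(sigma2 * lam**2 / 2)
    assert mgf <= bound + 1e-12, (lam, mgf, bound)
print("MGF bound holds on the tested grid")
```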
Azuma-Hoeffding approach
We want to provide concentration bounds for $Z = f(X) = f(X_1,\dots,X_n)$.
Let $\mathbb{E}_i[Z] := \mathbb{E}[Z \mid X_1,\dots,X_i]$ and $\Delta_i := \mathbb{E}_i[Z] - \mathbb{E}_{i-1}[Z]$. Then $\{\Delta_i\}$ is a martingale difference sequence: $\mathbb{E}_{i-1}[\Delta_i] = 0$. Let $S_j := \sum_{i=1}^j \Delta_i$, which is a function of $X_1,\dots,X_j$ only. Since $\mathbb{E}_n[Z] = Z$ and $\mathbb{E}_0[Z] = \mathbb{E} Z$, the sum telescopes:
\begin{align*}
S_n = \sum_{i=1}^n \Delta_i = Z - \mathbb{E} Z
\end{align*}
Let us assume that (*) $\mathbb{E}_{i-1} [e^{\lambda \Delta_i}] \le e^{\sigma_i^2 \lambda^2/2}$ for all $\lambda \ge 0$, and all $i$, almost surely. Then,
\begin{align*}
\mathbb{E}_{n-1} [ e^{\lambda S_n}] = e^{\lambda S_{n-1}}\mathbb{E}_{n-1} [ e^{\lambda \Delta_n}] \le e^{\lambda S_{n-1}} e^{\sigma_n^2 \lambda^2/2}
\end{align*}
Taking $\mathbb{E}_{n-2}$ of both sides:
\begin{align*}
\mathbb{E}_{n-2} [ e^{\lambda S_n}] \le e^{\sigma_n^2 \lambda^2/2} \mathbb{E}_{n-2}[ e^{\lambda S_{n-1}} ] \le
e^{\lambda S_{n-2}} e^{(\sigma_n^2 + \sigma_{n-1}^2)\lambda^2/2}
\end{align*}
Repeating the process, we get $\mathbb{E}_{0} [e^{\lambda S_n}] \le \exp \big( (\sum_{i=1}^n \sigma_i^2)\lambda^2/2 \big)$, showing that $S_n = Z - \mathbb{E} Z$ is sub-Gaussian with squared parameter $\sigma^2 = \sum_{i=1}^n \sigma_i^2$.
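As a Monte Carlo illustration (not a proof) of the resulting tail bound, here is a Python sketch with i.i.d. Rademacher differences $\Delta_i \in \{-1,+1\}$, for which the bounded-variable bound gives $\sigma_i^2 = 1$, so $\mathbb{P}(S_n \ge t) \le e^{-t^2/(2n)}$; the sample sizes are arbitrary:

```python
import numpy as np

# Monte Carlo illustration (not a proof): for i.i.d. Rademacher steps,
# Delta_i in {-1, +1}, the sharp bounded bound gives sigma_i^2 = (2)^2/4 = 1,
# so S_n should satisfy P(S_n >= t) <= exp(-t^2 / (2n)).
rng = np.random.default_rng(0)
n, trials = 100, 200_000
S = rng.choice([-1, 1], size=(trials, n)).sum(axis=1)

for t in [10, 20, 30]:
    empirical = (S >= t).mean()
    bound = np.exp(-t**2 / (2 * n))
    print(f"t={t}: empirical {empirical:.4f} <= bound {bound:.4f}")
```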
Bounded difference
The conditional sub-Gaussian assumption (*) holds under the bounded difference condition
\begin{align*}
|f(x_1,\dots,x_{i-1},x_i,x_{i+1},\dots,x_n) - f(x_1,\dots,x_{i-1},x_i',x_{i+1},\dots,x_n)| \le L_i
\end{align*}
Let $g_i(x_1,\dots,x_i) := \mathbb{E}[f(x_1,\dots,x_{i-1},x_i,X_{i+1},\dots,X_n) ]$, so that $\mathbb{E}_i[Z] = g_i(X_1,\dots,X_i)$. This uses independence: the distribution of $X_{i+1},\dots,X_n$ does not change under conditioning. It is easy to see that $g_i$ satisfies the bounded difference condition with constants $(L_1,\dots,L_i)$, by an application of Jensen's inequality.
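Spelled out, the Jensen step is just $|\mathbb{E}[\,\cdot\,]| \le \mathbb{E}|\cdot|$ applied coordinate-wise: for $j \le i$,
\begin{align*}
\big| g_i(\dots,x_j,\dots,x_i) - g_i(\dots,x_j',\dots,x_i) \big|
&= \big| \mathbb{E}\big[ f(\dots,x_j,\dots,x_i,X_{i+1},\dots,X_n) - f(\dots,x_j',\dots,x_i,X_{i+1},\dots,X_n) \big] \big| \\
&\le \mathbb{E}\, \big| f(\dots,x_j,\dots,x_i,X_{i+1},\dots,X_n) - f(\dots,x_j',\dots,x_i,X_{i+1},\dots,X_n) \big| \le L_j.
\end{align*}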
Naive bound: We have
\begin{align*}
\Delta_i = \mathbb{E}_i[Z] - \mathbb{E}_{i-1}[\mathbb{E}_i[Z]] = g_i(X_1,\dots,X_i) - \mathbb{E}_{i-1}[g_i(X_1,\dots,X_i)]
\end{align*}
Conditioned on $X_1,\dots,X_{i-1}$, we are effectively looking at ($X'_i$ is an independent copy of $X_i$)
\begin{align*}
g_i(x_1,\dots,x_{i-1},X_i)- \mathbb{E}[g_i(x_1,\dots,x_{i-1},X'_i)]
\end{align*}
due to the independence of $\{X_i\}$. Thus, $|\Delta_i| \le L_i$ conditioned on $X_1,\dots,X_{i-1}$; that is, $\mathbb{E}_{i-1} [e^{\lambda \Delta_i}] \le e^{\sigma_i^2 \lambda^2/2}$ with $\sigma_i^2 = (2L_i)^2/4 = L_i^2$.
Better bound: We can show that $\Delta_i \in I_i$ where $|I_i| \le L_i$, improving the constant by a factor of $4$. Conditioned on $X_1,\dots, X_{i-1}$, we are effectively looking at
\begin{align*}
\Delta_i = g_i(x_1,\dots,x_{i-1},X_i) - \mu_i
\end{align*}
where $\mu_i$ is a constant (only a function of $x_1,\dots,x_{i-1}$). Then, $\Delta_i + \mu_i \in [a_i,b_i]$ where $a_i = \inf_x g_i(x_1,\dots,x_{i-1},x)$ and $b_i = \sup_x g_i(x_1,\dots,x_{i-1},x)$. We have
\begin{align*}
b_i - a_i = \sup_{x,y} \big[ g_i(x_1,\dots,x_{i-1},x) - g_i(x_1,\dots,x_{i-1},y) \big] \le L_i
\end{align*}
Thus, we have $\mathbb{E}_{i-1}[e^{\lambda \Delta_i}] \le e^{\sigma_i^2 \lambda^2/2}$ where $\sigma_i^2 = (b_i-a_i)^2/4 \le L_i^2/4$. Plugging this into the Azuma argument above gives $\mathbb{P}(Z - \mathbb{E} Z \ge t) \le \exp\big( -2t^2 / \sum_{i=1}^n L_i^2 \big)$, which is McDiarmid's bounded difference inequality.
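As a concrete Monte Carlo check (not a proof): take $f$ to be the sample mean of $n$ i.i.d. Uniform$[0,1]$ variables, so $L_i = 1/n$, $\sum_i L_i^2 = 1/n$, and the one-sided bound reads $\mathbb{P}(Z - \mathbb{E} Z \ge t) \le e^{-2nt^2}$ (Hoeffding's inequality as a special case); the sample sizes below are arbitrary:

```python
import numpy as np

# Monte Carlo illustration (not a proof) of the bounded-difference bound for
# the sample mean f(x) = mean(x) of n i.i.d. Uniform[0,1] variables.
# Here L_i = 1/n, so sum_i L_i^2 = 1/n and the one-sided tail bound is
# P(Z - E[Z] >= t) <= exp(-2 n t^2).
rng = np.random.default_rng(1)
n, trials = 50, 200_000
Z = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)  # E[Z] = 1/2

for t in [0.05, 0.10, 0.15]:
    empirical = (Z - 0.5 >= t).mean()
    bound = np.exp(-2 * n * t**2)
    print(f"t={t:.2f}: empirical {empirical:.4f} <= bound {bound:.4f}")
```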
Wikipedia states a different version of Azuma's inequality:
Suppose $(X_n)_n$ is a martingale and $|X_k - X_{k-1}| \le c_k$ almost surely. Then for all $N \in \mathbb{N}$ and all $\varepsilon > 0$,
$$P(|X_N - X_0| \ge \varepsilon) \le 2 \exp \left( \frac{-\varepsilon^2}{2 \sum_{k=1}^N c_k^2 }\right).$$
Are you sure you don't have access to a result like that? Anyway, I'll show how to move forward assuming Wikipedia's version, and you can consider whether it helps you find a way forward given what you're allowed to assume.
In this particular martingale, suppose $X_{k-1} = \frac{r}{k+1}$ for some $1 \le r \le k$. Then $X_{k}$ could be $\frac{r+1}{k+2}$ or $\frac{r}{k+2}$ depending on what color ball we pull on turn $k$. Thus we have
$$\begin{align}
|X_{k} - X_{k-1}| &\le \max\left \{ \frac{r+1}{k+2} - \frac{r}{k+1}, \frac{r}{k+1} - \frac{r}{k+2} \right\} \\
&= \max \left \{ \frac{k+1 - r}{(k+1)(k+2)}, \frac{r}{(k+1)(k+2)} \right \} \\
&\le \frac{k+1}{(k+1)(k+2)} \\
&= \frac{1}{k+2}.
\end{align}$$
Thus we can choose $c_k = \frac{1}{k+2}$, which gives
$$\begin{align}
\sum_{k=1}^\infty c_k^2 &= \sum_{j=3}^\infty \frac{1}{j^2} = \zeta(2) - 1 - \frac 1 4 = \frac{\pi^2}{6} - \frac{5}{4}
\end{align}$$
and so, since $\sum_{k=1}^N c_k^2 \le \sum_{k=1}^\infty c_k^2$, we have, for any $N$, the bound
$$\begin{align}
P\left(\left|X_N - \frac 1 2\right| \ge \varepsilon \right)
&\le 2 \exp \left( \frac{-\varepsilon^2}{2 \sum_{k=1}^N c_k^2 }\right) \\
&\le 2 \exp \left( \frac{-\varepsilon^2}{\frac{\pi^2}{3} - \frac{5}{2} }\right) \\
&= 2 \exp \left( \frac{-6 \varepsilon^2}{2 \pi^2 - 15 }\right).
\end{align}$$
The final bound works out so neatly that I kinda suspect you're meant to have access to something more like the result I'm quoting. But anyway I'll leave the problem of fully understanding what assumptions are legal up to you.
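If it helps, here is a small Python simulation (a sanity check, not a proof) of the increment bound $|X_k - X_{k-1}| \le \frac{1}{k+2}$ derived above, assuming the urn starts with one ball of each color (so $X_0 = \frac12$) and each draw returns the ball together with a new ball of the same color:

```python
import numpy as np

# Simulation sanity check (not a proof) of the increment bound
# |X_k - X_{k-1}| <= 1/(k+2), assuming the urn starts with one ball of each
# color (X_0 = 1/2) and each draw returns the ball plus one of its color.
rng = np.random.default_rng(2)
N, trials = 100, 10_000
red = np.ones(trials)   # red balls in each simulated urn
total = 2.0             # total balls (the same in every urn)
X_prev = red / total    # X_0 = 1/2

for k in range(1, N + 1):
    draw_red = rng.random(trials) < red / total
    red += draw_red     # add a ball of the drawn color
    total += 1.0
    X = red / total     # X_k = fraction of red balls after k draws
    assert np.all(np.abs(X - X_prev) <= 1.0 / (k + 2) + 1e-12), k
    X_prev = X
print("increment bound |X_k - X_{k-1}| <= 1/(k+2) holds in all simulations")
```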
Best Answer
\begin{align*}
\mathbb E[e^{t S_n}] &= \mathbb E[\mathbb E[e^{t S_n} \mid X_1, \dots, X_{n-1}]] & \text{(tower property of conditional expectation)} \\
&= \mathbb E[\mathbb E[e^{t S_{n-1}}e^{t V_n} \mid X_1, \dots, X_{n-1}]] & \text{($S_n = S_{n-1} + V_n$)} \\
&= \mathbb E[e^{tS_{n-1}} \mathbb E[e^{t V_n} \mid X_1, \dots, X_{n-1}]] & \text{($S_{n-1}$ is measurable w.r.t. $\sigma(X_1, \dots, X_{n-1})$)}
\end{align*}
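To finish the sketch under one extra (hedged) assumption: if, as in the notes above, $\mathbb E[e^{t V_n} \mid X_1,\dots,X_{n-1}] \le e^{t^2 c_n^2/2}$ almost surely (which follows from $|V_n| \le c_n$ via Hoeffding's lemma), the display closes into a recursion:
\begin{align*}
\mathbb E[e^{t S_n}] \le e^{t^2 c_n^2/2}\, \mathbb E[e^{t S_{n-1}}] \le \cdots \le \exp\Big( \frac{t^2}{2} \sum_{k=1}^n c_k^2 \Big),
\end{align*}
and Markov's inequality applied to $e^{t S_n}$, optimized at $t = \varepsilon / \sum_{k=1}^n c_k^2$, gives the one-sided Azuma bound $\mathbb P(S_n \ge \varepsilon) \le \exp\big( -\varepsilon^2 / (2 \sum_{k=1}^n c_k^2) \big)$.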