There are several common notions of tightness for bounds; below is perhaps the simplest one.
Denote the Chernoff bound as $B(x) \equiv \frac{ \lambda }{ \lambda - r} e^{- rx}$ for the exponential distribution, whose tail probability (complementary CDF) is $P(X > x) = 1 - F_X(x) = e^{-\lambda x}$.
The error $B(x) - e^{-\lambda x}$ is readily shown to be positive for all $x \in \mathbb{R}^{+}$ and all $0 < r < \lambda$.
Consider the total error over $\mathbb{R}^{+}$, which is a routine integral:
\begin{align}
\Delta(r) \equiv \int_{ x = 0}^{\infty} \left( \frac{ \lambda }{ \lambda - r} e^{- r x} - e^{-\lambda x} \right)\,\mathrm{d}x = \frac{ \lambda}{ r\, (\lambda - r)} - \frac1{\lambda}
\end{align}
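As a sanity check, the closed form above can be verified numerically. A minimal Python sketch (the values `lam = 1` and `r = 0.3` are purely illustrative):

```python
import math

lam, r = 1.0, 0.3          # illustrative values with 0 < r < lam

def pointwise_error(x):
    """B(x) - P(X > x): Chernoff bound minus the true exponential tail."""
    return lam / (lam - r) * math.exp(-r * x) - math.exp(-lam * x)

# Trapezoidal approximation of the integral on [0, 80]; the tail beyond is negligible.
n, upper = 200_000, 80.0
h = upper / n
numeric = h * (sum(pointwise_error(i * h) for i in range(1, n))
               + 0.5 * (pointwise_error(0.0) + pointwise_error(upper)))

closed_form = lam / (r * (lam - r)) - 1.0 / lam
print(abs(numeric - closed_form) < 1e-4)  # True
```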
Note that the relative error $(B(x) - e^{-\lambda x})/e^{-\lambda x}$ doesn't converge as is: it grows like $e^{(\lambda - r)x}$ as $x \to \infty$. You may consider various transformations to make it work if the notion of relative error is really important to you.
Moving on. One can already guess that the minimum of $\Delta(r)$ occurs at $r_0 = \lambda / 2$, in the middle, because $\Delta$ is mirror symmetric under the exchange $r \longleftrightarrow \lambda-r$. In any case, checking this with the first and second derivatives of $\Delta(r)$ is routine.
As $r$ varies, the curve $B(x)$ may get closer to or further from $P(X > x)$ pointwise at any given $x$, depending on the relation between $r$ and $\lambda$. Taken as a whole, one may regard $r = r_0 = \lambda / 2$, which minimizes $\Delta(r)$ as defined above, as producing the "tightest" bound.
Sometimes, depending on the application, one might care only about a specific interval $a < x < b$. One then optimizes $r$ by minimizing the corresponding error over that interval only; this yields a different $r_0$ for each situation.
BTW, here the error over $\mathbb{R}^{+}$ is minimized to $\min \{\Delta(r)\} = \Delta(\frac{\lambda}2) = 3/\lambda$.
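A quick grid scan confirms both the minimizer and the minimum value stated above (the rate `lam = 2` is an arbitrary illustration):

```python
lam = 2.0  # illustrative rate

def delta(r):
    """Total error of the Chernoff bound over R+, closed form from above."""
    return lam / (r * (lam - r)) - 1.0 / lam

# Scan r over (0, lam) on a fine grid and locate the minimum.
rs = [lam * k / 10_000 for k in range(1, 10_000)]
r_best = min(rs, key=delta)

print(abs(r_best - lam / 2) < 1e-3)           # True: minimizer is lam/2
print(abs(delta(lam / 2) - 3 / lam) < 1e-12)  # True: minimum equals 3/lam
```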
If your so-called Markov bound refers to the common $P( X > x) \le \frac{ E[X] }x$, then it is just $B_M(x) = \frac1{\lambda x}$.
Note that the Markov inequality is very loose, and the corresponding total error doesn't converge: it is essentially $\int \frac1{x}\, dx$, which diverges.
The Markov bound as such is a simple function, and one can certainly compare the functional forms of the Chernoff bound $B(x)$ and $B_M(x)$, either in general or for a specific value such as $\lambda = 1$, and discuss the numerical details at particular points $x$.
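For instance, a short numerical comparison for $\lambda = 1$ with $r = \lambda/2$ (the evaluation points are illustrative):

```python
import math

lam = 1.0      # the lambda = 1 case discussed above
r = lam / 2    # Chernoff rate minimizing the total error

def chernoff(x):
    """Chernoff bound B(x) = lam/(lam - r) * exp(-r x)."""
    return lam / (lam - r) * math.exp(-r * x)

def markov(x):
    """Markov bound B_M(x) = 1/(lam x)."""
    return 1.0 / (lam * x)

def true_tail(x):
    """Exact tail P(X > x) = exp(-lam x)."""
    return math.exp(-lam * x)

for x in (0.5, 1.0, 2.0, 5.0):
    print(f"x={x}: tail={true_tail(x):.4f}  "
          f"chernoff={chernoff(x):.4f}  markov={markov(x):.4f}")
```

Note that with this fixed $r$ neither bound dominates pointwise: the Markov bound is actually smaller at moderate $x$ (e.g. $x = 2$), while the Chernoff bound is tighter both near $0$ and in the far tail.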
I'm afraid not much (or too much) can be said without further context.
Best Answer
We observe that $$M_S(t) = \operatorname{E}[e^{tS}] = \operatorname{E}[\operatorname{E}[e^{tS} \mid N]],$$ where the outer expectation is taken with respect to $N$ and the inner with respect to $S$ conditioned on $N$. Since $$e^{tS} \mid N = \prod_{i=1}^N e^{t X_i},$$ we have $$\operatorname{E}[e^{tS} \mid N] \overset{\text{ind}}{=} \prod_{i=1}^N \operatorname{E}[e^{tX_i}] = \left(M_X(t)\right)^N = \left(\frac{\lambda}{\lambda-t}\right)^N,$$ where $X$ is an exponential random variable parametrized by rate $\lambda$.

Consequently, $$M_S(t) = \operatorname{E}\left[e^{N \log (\lambda/(\lambda - t))}\right] = M_N\left(\log \frac{\lambda}{\lambda - t}\right),$$ where $N \sim \operatorname{Geometric}(p)$ with parametrization $$\Pr[N = n] = (1-p)^{n-1} p, \quad n = 1, 2, \ldots.$$ We require $N$ to have support beginning at $1$, since it is not possible to correctly answer a question in $N = 0$ responses.

This parametrization has MGF $$M_N(t) = \frac{pe^t}{1 - (1-p)e^t},$$ from which we obtain $$M_S(t) = \frac{\frac{p\lambda}{\lambda - t}}{1 - \frac{(1-p)\lambda}{\lambda - t}} = \frac{p\lambda}{p\lambda - t},$$ which is the MGF of an exponential distribution with rate $p\lambda$, as desired.
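The conclusion can also be checked by simulation: a geometric number of exponential summands should itself be exponential with rate $p\lambda$, so its mean should be $1/(p\lambda)$. A minimal sketch (the parameters `p = 0.25` and `lam = 2` are arbitrary):

```python
import random

random.seed(0)
p, lam = 0.25, 2.0    # illustrative parameters
n_trials = 200_000

def geometric_sum():
    """S = X_1 + ... + X_N with N ~ Geometric(p) on {1, 2, ...}, X_i ~ Exp(lam)."""
    s = random.expovariate(lam)          # N >= 1, so at least one summand
    while random.random() > p:           # continue with probability 1 - p
        s += random.expovariate(lam)
    return s

samples = [geometric_sum() for _ in range(n_trials)]
mean = sum(samples) / n_trials
print(abs(mean - 1 / (p * lam)) < 0.05)  # True: E[S] = 1/(p*lam)
```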