Yes, you can apply Hoeffding's lemma directly to $X:=S_n-E(S_n)$. Since $X$ has mean zero and $\sum a_i-E(S_n)\le X\le\sum b_i-E(S_n)$, an interval of length $\sum(b_i-a_i)$, Hoeffding's lemma gives:
$$
E\left(\exp\big(s[S_n-E(S_n)]\big)\right)\le \exp\Big(\frac18 s^2\Big[\sum \left(b_i-a_i\right)\Big]^2\Big).\tag1
$$
But note the subtle difference between (1) and what you get when you apply Hoeffding to each $X_i$:
$$
E\left(\exp\big(s[S_n-E(S_n)]\big)\right)=\prod_{i=1}^n E\left(\exp\big(s[X_i-E(X_i)]\big)\right)\le \exp\Big(\frac18 s^2\sum \left(b_i-a_i\right)^2\Big).\tag2
$$
That is, writing $c_i:=b_i-a_i$, (1) has the quantity $\left(\sum c_i\right)^2$ where (2) has $\sum c_i^2$. Since the $c_i$ are all nonnegative, we have $\sum c_i^2\le \left(\sum c_i\right)^2$, so version (2) gives the sharper upper bound.
This makes a difference, as you can see in the special case where the $a$'s are all equal and the $b$'s are all equal, so that $c_i=c$ for every $i$: compared to $nc^2$ in version (2), version (1) has $n^2c^2$. If you follow this through to the end of the derivation of Hoeffding's inequality, that extra factor of $n$ is coarse enough to wipe out the power of the resulting inequality.
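A quick numerical sketch of how fast the gap grows (the values $s=c=1$ are arbitrary, chosen only for illustration):

```python
import math

# Compare the exponents in the two bounds when a_i = 0 and b_i = c for all i:
# version (2) has s^2 * n * c^2 / 8, version (1) has s^2 * (n*c)^2 / 8.
s, c = 1.0, 1.0
for n in (10, 100, 1000):
    exp2 = s**2 * n * c**2 / 8        # sum of squares: n * c^2
    exp1 = s**2 * (n * c)**2 / 8      # square of the sum: (n*c)^2
    print(n, exp2, exp1)              # exponent (1) is larger by a factor of n
```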
It is not necessary to use any kind of bounds. The average number of fish you catch per day is $10$, so $\lambda=10$. The number of salmon you catch per day then has distribution $S\sim\text{Poisson}(10\times .3)=\text{Poisson}(3)$. The probability of catching at least five salmon is $P(S\ge5)=1-P(S\le 4)=1-(.0498+.1494+.2240+.2240+.1680)=.1848$.
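A quick sanity check of this arithmetic (the exact value is $\approx .18474$; the $.1848$ above reflects summing the individually rounded terms):

```python
import math

# Exact tail probability for S ~ Poisson(3).
lam = 3.0  # salmon rate: 10 fish/day * 0.3

def pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

p_ge_5 = 1 - sum(pmf(k) for k in range(5))
print(round(p_ge_5, 4))  # 0.1847
```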
For part a, we could try $P(S=s\,\&\,T=t)$ (standing for salmon and trout caught), which equals $\sum_{f=0}^\infty P(S=s\,\&\, T=t\mid F=f)P(F=f)=\sum_{f=0}^\infty P(S=s\mid T=t,F=f)P(T=t\mid F=f)P(F=f)$.
Trying $P(S=0,T=0)\stackrel{?}{=} P(S=0)P(T=0)$, we get the rhs to be $e^{-.3\lambda}e^{-.4\lambda}=e^{-.7\lambda}$.
The lhs is found by the law of total probability to be $\sum_{f=0}^\infty P(S=0\mid T=0,F=f)P(T=0\mid F=f)P(F=f)=\sum_{f=0}^\infty (.5)^f(.6)^f\frac{e^{-\lambda}\lambda^f}{f!}=e^{-\lambda}\sum_{f=0}^\infty \frac{(.3\lambda)^f}{f!}$
Using the identity $e^\lambda=\sum_{x=0}^\infty \frac{\lambda^x}{x!}$, the preceding expression becomes $e^{-\lambda}e^{.3\lambda}=e^{-.7\lambda}$, which, surprisingly or not, matches the marginals multiplied together. At first I didn't know whether $P(S=1)P(T=1)\stackrel{?}{=}P(S=1,T=1)$ would also hold. Edit: it holds as well. I would now hazard that the same holds for every fish type and every quantity, and conclude independence.
Proof for salmon and trout (get ready...):
$$\begin{split}P(S=s \&T=t)&=\sum_{f=s+t}^\infty P(S=s\& T=t|F=f)P(F=f)\\
&=\sum_{f=s+t}^\infty P(S=s|T=t,F=f)P(T=t|F=f)P(F=f)\\
&=\sum_{f=s+t}^\infty \frac{e^{-\lambda}\lambda ^f}{f!}{f\choose t}.4^t.6^{f-t}{f-t\choose s}.5^{f-t}\\
&=e^{-\lambda}\sum_{f=s+t}^\infty \frac{\lambda ^f\,.4^t\,.3^{f-t}}{t!\,s!\,(f-t-s)!}\\
&=\frac{(.3\lambda)^s\lambda ^t e^{-\lambda}.4^t}{t!s!}\sum_{f=s+t}^\infty \frac{(.3\lambda)^{f-t-s}}{(f-t-s)!}\\
&=\frac{(.3\lambda)^s\lambda ^t e^{-\lambda}.4^t}{t!s!}\sum_{x=0}^\infty \frac{(.3\lambda)^{x}}{x!}\\
&=\frac{(.3\lambda)^s}{s!}e^{-.3\lambda}\frac{e^{-.4\lambda}(.4\lambda)^t}{t!}=P(S=s)P(T=t)\end{split}$$
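The algebra above can be spot-checked numerically by truncating the sum over $f$. This sketch uses the problem's rates ($\lambda=10$, salmon probability $.3$, trout probability $.4$); the truncation point $f<60$ is chosen so the Poisson$(10)$ tail is negligible:

```python
import math

lam, p_s, p_t = 10.0, 0.3, 0.4

def joint(s, t):
    """Joint pmf P(S=s, T=t), computed by conditioning on the total catch F."""
    total = 0.0
    for f in range(s + t, 60):
        pf = math.exp(-lam) * lam**f / math.factorial(f)
        pt = math.comb(f, t) * p_t**t * (1 - p_t)**(f - t)
        # given t trout among f fish, each remaining fish is salmon w.p. .3/.6 = .5
        ps = math.comb(f - t, s) * 0.5**(f - t)
        total += pf * pt * ps
    return total

def pois(k, rate):
    return math.exp(-rate) * rate**k / math.factorial(k)

# joint pmf should factor into the two Poisson marginals
for s, t in [(0, 0), (1, 1), (2, 3)]:
    assert abs(joint(s, t) - pois(s, lam * p_s) * pois(t, lam * p_t)) < 1e-9
```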
The Markov bound is $P(S\ge5)\le\frac{E(S)}{5}=\frac 35=.6$, which is consistent with, though much weaker than, the exact value $.1848$ found above.
The Chernoff bound is $P(S\ge 5)\le\min_{s>0}e^{-5s}e^{3(e^s-1)}$. Since $\log$ is increasing, this is the same as minimizing its logarithm, $-5s+3e^s-3$. Setting the derivative with respect to $s$ to zero gives $-5+3e^s=0$, with solution $s=\log \frac 53$. The second derivative $3e^s$ is positive, indicating a minimum. This gives the bound $e^{-5\log{\frac 53}+5-3}=0.574573$.
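A one-line check of the optimization, nothing here beyond the arithmetic already done above:

```python
import math

# Minimize -5s + 3(e^s - 1) over s > 0; the minimizer is s* = log(5/3).
s_star = math.log(5 / 3)
bound = math.exp(-5 * s_star + 3 * (math.exp(s_star) - 1))
print(bound)  # ≈ 0.574573: below the Markov bound .6, far above the exact .1847
```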
Best Answer
Long story short: it is not enough to look only at the dependency on $t$.
I consider the following setting: $X_1,\ldots,X_N$ are independent random variables (not necessarily identically distributed) taking values in $[0,1]$; write $S_N=\sum_{i=1}^N X_i$ and $\mu=E(S_N)$. You can check that, with a change of variable, your version of Chernoff's inequality is equivalent to
$$P(S_N-\mu > t)\le \exp\left(t-(\mu+t)\log\left(1+\frac{t}{\mu}\right)\right).$$
Plugging the elementary inequality $(1+x)\log(1+x)-x\ge \frac{x^2}{2+2x/3}$ for $x\ge 0$ into the exponent with $x=t/\mu$, this implies that
$$P(S_N-\mu > t)\le \exp\left(-\frac{t^2}{2\mu+\frac{2}{3}t}\right).$$
On the other hand, Hoeffding's inequality says that
$$P(S_N-\mu > t)\le \exp\left(-\frac{2t^2}{N}\right).$$
Now note that $\mu\le N$ and we are only interested in $0\le t\le N$ (which implies that $t\ge t^2/N$). Therefore, as long as you do not care about the constants inside the exponential, Chernoff's inequality is always at least as good as Hoeffding's. Furthermore, it can be much better in the regime $\mu\ll N$.
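A numerical illustration of that last regime, with hypothetical values of my own choosing ($N=10000$ variables in $[0,1]$ with total mean $\mu=10$, deviation $t=30$):

```python
import math

# Compare the two tail bounds when mu << N.
N, mu, t = 10_000, 10.0, 30.0
chernoff = math.exp(-t**2 / (2 * mu + 2 * t / 3))   # Bernstein-form Chernoff
hoeffding = math.exp(-2 * t**2 / N)                 # Hoeffding
print(chernoff, hoeffding)  # Chernoff is dramatically smaller here
```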