I think that your intuition is correct, but the question is a bit subtle. First of all, there are topological obstructions for the group of biholomorphisms to be a complex Lie group, as pointed out in the comment by Andrew Hwang. One simple case in which such obstructions vanish (and, to be honest, the only one I ever studied in any detail) is when the manifold is compact.
So, assume for the moment we have a compact manifold $X$, together with a complex structure $J$. I will denote by $\operatorname{Aut}(X,J)$ the group of biholomorphisms. A $1$-parameter subgroup, say $\{f_t\mid t\in\mathbb{R}\}\subset\operatorname{Aut}(X,J)$, gives you a vector field $v$ on $X$ by setting $$v_p:=\partial_{t=0}f_t(p).$$ Notice that this is just a real vector field, i.e. a section of $TX\to X$. As $f_t^*J=J$, we see however that $v$ satisfies \begin{equation}\label{realhol}
\tag{$\star$}
\mathcal{L}_vJ=0
\end{equation}
and for any $v$ that satisfies \eqref{realhol}, its flow $f_t$ will be a subgroup of $\operatorname{Aut}(X,J)$; the flow is complete because $X$ is compact. We can already see that completeness might be a problem for noncompact manifolds.
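To make the completeness issue concrete (a standard example, with the vector field chosen by me): on the noncompact manifold $X=\mathbb{C}$, take the holomorphic vector field $v^{1,0}=z^2\,\partial_z$. The flow of the associated real vector field solves
$$\dot z=z^2\quad\Longrightarrow\quad f_t(z)=\frac{z}{1-tz},$$
which escapes to infinity in finite time (for $z>0$ real, at $t=1/z$). So this vector field is not complete, and it does not generate a $1$-parameter subgroup of biholomorphisms of $\mathbb{C}$.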
Sections of $TX$ that satisfy \eqref{realhol} are usually called real-holomorphic vector fields. From this point of view, then, the Lie algebra of $\operatorname{Aut}(X,J)$ is the algebra of real-holomorphic vector fields. Notice however that:
1. if $v$ is real-holomorphic, then so is $Jv$;
2. $v$ is real-holomorphic if and only if $v^{1,0}$ is holomorphic, meaning that it is a holomorphic section of the holomorphic vector bundle $T^{1,0}X$.
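The second point is a routine local computation (coordinates and component names are mine): in local holomorphic coordinates $z^1,\dots,z^n$, write
$$v^{1,0}=\sum_{j=1}^n f^j\,\frac{\partial}{\partial z^j}.$$
Then condition \eqref{realhol} for $v=2\,\Re(v^{1,0})$ is equivalent to $\partial f^j/\partial\bar z^k=0$ for all $j,k$, i.e. to the components $f^j$ being holomorphic functions.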
So we can identify the Lie algebra of $\operatorname{Aut}(X,J)$ either with holomorphic or real-holomorphic vector fields, and it has a complex structure coming either from remark $1$ above, if you like to use real-holomorphic vector fields, or just from multiplication by $i$, if you prefer holomorphic vector fields. However, we are still missing a useful tool, as one might wonder what the complex analogue of the flow of a real-holomorphic vector field is.
So, let $v^{1,0}$ be a holomorphic vector field, and consider the real-holomorphic vector fields $v$ and $Jv$. Notice that
\begin{equation}
v=2\,\Re(v^{1,0})\quad\mbox{and}\quad Jv=-2\,\Im(v^{1,0})
\end{equation}
and notice also that \eqref{realhol} gives $[v,Jv]=\mathcal{L}_v(Jv)=(\mathcal{L}_vJ)v+J[v,v]=0$. So the flows of $v$ and $Jv$ commute; call them $f^v_t$ and $f^{Jv}_s$, and consider the composition
\begin{equation}
\Phi_{z}:=f^v_t\circ f^{Jv}_s,\ \mbox{ for }z=t+\operatorname{i}s
\end{equation}
Then $\{\Phi_z\mid z\in\mathbb{C}\}$ is a subgroup of $\operatorname{Aut}(X,J)$, and it can be checked that for any $p\in X$, $$v^{1,0}_p=\partial_{z=0}\Phi_z(p).$$
In other words, $\Phi_z$ is the "complex flow" of the holomorphic vector field $v^{1,0}$.
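As a sanity check (example mine), take $v^{1,0}=w\,\partial_w$ in the affine chart $w$ of $\mathbb{CP}^1$. With $w=x+\operatorname{i}s y$, sorry, $w=x+\operatorname{i}y$, one computes
$$v=2\,\Re(v^{1,0})=x\partial_x+y\partial_y,\qquad Jv=-2\,\Im(v^{1,0})=-y\partial_x+x\partial_y,$$
whose flows are $f^v_t(w)=e^tw$ and $f^{Jv}_s(w)=e^{\operatorname{i}s}w$. These commute, and
$$\Phi_{t+\operatorname{i}s}(w)=e^{t+\operatorname{i}s}w,\qquad \partial_{z=0}\Phi_z(w)=w,$$
recovering $v^{1,0}=w\,\partial_w$.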
Best Answer
I'll write $\mathcal{L}_X$ for the Lie derivative with respect to $X$. We will prove that, for any $k$-form $\eta$ on $M$, $$ \frac{d}{dt} f_t^*\eta = f_t^*\mathcal{L}_{X_t}\eta. $$ In particular, it holds for the symplectic $2$-form $\omega$, which is your question. To do this, we will prove that:

1. the formula holds for functions ($0$-forms);
2. if the formula holds for $\eta$, then it holds for $d\eta$;
3. if the formula holds for $\alpha$ and $\beta$, then it holds for $\alpha\wedge\beta$.
Then, since the algebra of differential forms on $M$ is (locally) generated by sums of wedge products of functions and exact $1$-forms, the result holds for an arbitrary $\eta$.
Step 1: for a function ($0$-form) $\eta$,
$$ \left.\frac{d}{dt}\right|_{t=t_0} f_t^*\eta = \left.\frac{d}{dt}\right|_{t=t_0}\eta \circ f_t = (X_{t_0}\eta) \circ f_{t_0} = f_{t_0}^*(\mathcal{L}_{X_{t_0}}\eta), $$
where in the second equality we used $X_{t_0} \circ f_{t_0} = \left.\frac{d}{dt}\right|_{t=t_0}f_t$.
Step 2: if the formula holds for $\eta$, then it holds for $d\eta$:
$$ \left.\frac{d}{dt}\right|_{t=t_0}f_t^*(d\eta) = \left.\frac{d}{dt}\right|_{t=t_0}d(f_t^*\eta) = d\left(\left.\frac{d}{dt}\right|_{t=t_0}f_t^*\eta\right) = d(f_{t_0}^*(\mathcal{L}_{X_{t_0}}\eta)) = f_{t_0}^*(\mathcal{L}_{X_{t_0}}(d\eta)), $$
where in the third equality, $d$ commutes with the time derivative by equality of mixed partials. (One can see this by writing it out in local coordinates.)
Step 3: if the formula holds for $\alpha$ and $\beta$, then it holds for $\alpha\wedge\beta$:
\begin{align*} \left.\frac{d}{dt}\right|_{t=t_0} f_t^*(\alpha\wedge\beta) &= \left.\frac{d}{dt}\right|_{t=t_0} (f_t^*\alpha)\wedge(f_t^*\beta) \\ &= \left(\left.\frac{d}{dt}\right|_{t=t_0}f_t^*\alpha\right)\wedge(f_{t_0}^*\beta) + (f_{t_0}^*\alpha)\wedge\left(\left.\frac{d}{dt}\right|_{t=t_0}f_t^*\beta\right) \\ &= (f_{t_0}^*(\mathcal{L}_{X_{t_0}}\alpha)) \wedge (f_{t_0}^*\beta) + (f_{t_0}^*\alpha)\wedge(f_{t_0}^*(\mathcal{L}_{X_{t_0}}\beta)) \\ &= f_{t_0}^*\left( (\mathcal{L}_{X_{t_0}}\alpha)\wedge\beta + \alpha\wedge(\mathcal{L}_{X_{t_0}}\beta) \right) \\ &= f_{t_0}^*(\mathcal{L}_{X_{t_0}}(\alpha\wedge\beta)). \end{align*}
The product rule in the second equality can be proven using the exact same method as the single-variable calculus product rule: write out the limit definition of the derivative, and add and subtract a suitable term.
I'd like to remark that, following the exact same outline, one gets the following formula for a time-dependent family $\eta_t$ of differential forms:
$$ \frac{d}{dt} f_t^*\eta_t = f_t^*\left(\mathcal{L}_{X_t}\eta_t + \frac{d}{dt}\eta_t\right). $$
Step 1 becomes a tiny bit trickier. Steps 2 and 3 don't really change, other than becoming a little messier in notation. This kind of formula shows up when you're using a Moser-type argument, such as in a proof of Darboux's theorem. (This is why I wanted to bring it up.)
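To sketch how the time-dependent formula enters a Moser-type argument (a standard outline, not part of the question itself): given a family of symplectic forms $\omega_t$ with $\frac{d}{dt}\omega_t=d\sigma_t$ exact, one looks for diffeomorphisms $f_t$ with $f_t^*\omega_t=\omega_0$. If $f_t$ is the flow of $X_t$, the formula above together with Cartan's identity $\mathcal{L}_{X_t}=d\,\iota_{X_t}+\iota_{X_t}d$ and $d\omega_t=0$ gives
$$\frac{d}{dt}f_t^*\omega_t=f_t^*\left(d\,\iota_{X_t}\omega_t+d\sigma_t\right),$$
so it suffices to solve $\iota_{X_t}\omega_t=-\sigma_t$ for $X_t$, which is possible pointwise because $\omega_t$ is nondegenerate.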