Note that
$$\mathbf E[\tau(t)]=\sum_{j=0}^\infty \mathbf P(\tau(t)>j)\le t+1+\sqrt{t}+\sum_{j=[t+\sqrt{t}]+1}^\infty \mathbf P(\tau(t)>j).$$
To bound the remaining sum we use the inclusion $\{\tau(t)>j\}\subset \{U_j<t\mu_X\mbox{ or } V_j<t\mu_Y\}$. Therefore,
$$
\mathbf P(\tau(t)>j)\le \mathbf P(U_j<t\mu_X)+\mathbf P(V_j<t\mu_Y).
$$
Since $X_i$ and $Y_i$ are bounded, we can apply Hoeffding's inequality (https://en.wikipedia.org/wiki/Hoeffding%27s_inequality), which gives, for $j>t$,
$$
\mathbf P(U_j<t\mu_X)=\mathbf P(U_j-j\mu_X<(t-j)\mu_X)\le
\exp\left(-\frac{2(t-j)^2\mu_X^2}{4j(\mu_XK_X)^2}\right)
=\exp\left(-\frac{(t-j)^2}{2j(K_X)^2}\right),
$$
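As a quick numerical sanity check of this bound (a sketch under an assumed distribution: $X_i$ uniform on $[0,2\mu_X]$, so that $|X_i|\le K_X\mu_X$ with $K_X=2$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed distribution for illustration: X_i ~ Uniform[0, 2*mu_x],
# so mu_x = E[X_i] and |X_i| <= K_x * mu_x with K_x = 2.
mu_x, K_x = 1.0, 2.0
t, j = 100, 110                     # a tail index j > t
n_trials = 200_000

# Empirical P(U_j < t*mu_x), where U_j = X_1 + ... + X_j
U_j = rng.uniform(0.0, 2.0 * mu_x, size=(n_trials, j)).sum(axis=1)
p_emp = (U_j < t * mu_x).mean()

# Hoeffding bound exp(-(t-j)^2 / (2 j K_x^2)) derived above
p_bound = np.exp(-((t - j) ** 2) / (2 * j * K_x**2))

print(f"empirical: {p_emp:.4f}   Hoeffding bound: {p_bound:.4f}")
```

The bound is loose here because Hoeffding's inequality only uses the range of $X_i$, not its variance, but it decays geometrically in $j$, which is all the argument needs.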
where I used the condition $|X_n|\le K_X\mu_X$. Similarly, using $|Y_n|\le K_Y\mu_Y$,
$$
\mathbf P(V_j<t\mu_Y)\le\exp\left(-\frac{(t-j)^2}{2j(K_Y)^2}\right).
$$
Then,
$$
\mathbf E[\tau(t)]\le t+1+\sqrt{t}+\sum_{j=[t+\sqrt{t}]+1}^\infty \left(e^{-\frac{(j-t)^2}{2j(K_Y)^2}}+e^{-\frac{(j-t)^2}{2j(K_X)^2}}\right).
$$
Now note that for $j\ge t+\sqrt{t}$ and $t\ge 1$ we have the estimate $j\le 2(j-t)\sqrt{t}$ (it holds at $j=t+\sqrt{t}$, where it reads $t+\sqrt{t}\le 2t$, and the right-hand side grows faster in $j$), which implies that
$$
\sum_{j=[t+\sqrt{t}]+1}^\infty \left(e^{-\frac{(j-t)^2}{2j(K_Y)^2}}+e^{-\frac{(j-t)^2}{2j(K_X)^2}}\right)\le
\sum_{j=[t+\sqrt{t}]+1}^\infty \left(e^{-\frac{(j-t)}{4(K_Y)^2\sqrt{t}}}+e^{-\frac{(j-t)}{4(K_X)^2\sqrt{t}}}\right).
$$
The latter expression is a sum of two geometric series. Hence,
$$
\mathbf E[\tau(t)]\le t+1+\sqrt{t}+\frac{1}{1-e^{-\frac{1}{4(K_X)^2\sqrt{t}}}}
+\frac{1}{1-e^{-\frac{1}{4(K_Y)^2\sqrt{t}}}},
$$
and finally, since $1-e^{-x}=x+O(x^2)$ as $x\to 0$,
$\mathbf E[\tau(t)]\le t+2(1+2K_X^2+2K_Y^2)\sqrt{t}$ for large $t$.
The constant in front of $\sqrt{t}$ is not sharp.
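For what it's worth, the $\sqrt t$ growth of the excess is easy to see numerically. A Monte Carlo sketch under assumptions (my reading of the definition, $\tau(t)=\min\{j: U_j\ge t\mu_X\mbox{ and }V_j\ge t\mu_Y\}$, with the illustrative choice $X_i,Y_i$ uniform on $[0,2]$, so $\mu_X=\mu_Y=1$):

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_tau(t, n_trials=2000):
    """Monte Carlo estimate of E[tau(t)] under the assumed model:
    X_i, Y_i ~ Uniform[0, 2] (so mu_X = mu_Y = 1), and
    tau(t) = min{ j : U_j >= t and V_j >= t }."""
    j_max = int(t + 20 * np.sqrt(t)) + 20   # generous horizon
    U = rng.uniform(0, 2, (n_trials, j_max)).cumsum(axis=1)
    V = rng.uniform(0, 2, (n_trials, j_max)).cumsum(axis=1)
    both = (U >= t) & (V >= t)              # increments are >= 0, so monotone in j
    assert both[:, -1].all()                # horizon was large enough
    tau = both.argmax(axis=1) + 1           # first j with both walks past t
    return tau.mean()

for t in [100, 400, 900]:
    ratio = (mean_tau(t) - t) / np.sqrt(t)
    print(f"t = {t:4d}:  (E[tau] - t)/sqrt(t) = {ratio:.3f}")
```

The ratio appears to stabilize as $t$ grows, consistent with $\mathbf E[\tau(t)]=t+A\sqrt t+o(\sqrt t)$ and well below the (non-sharp) constant in the bound above.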
The constant in front of $\sqrt{t}$ is not sharp.
In principle, it is possible to obtain the exact behaviour $\mathbf E[\tau(t)]=t+A\sqrt t +o(\sqrt t)$, $t\to\infty$, and to identify $A$. I will briefly sketch how this can be done.
As above, it is sufficient to obtain asymptotics for $\mathbf P(\tau(t)>j)$. For $j\ge 2t$ this probability decreases exponentially, so its contribution to $\mathbf E[\tau(t)]$ is negligible. For $j\le 2t$ we can use a strong coupling of $(U_j,V_j)$ with a 2d Brownian motion $(U(t),V(t))$ having the same drift and covariance structure as the random walk. The strong coupling ensures that, with high probability, the distance between the random walk and the Brownian motion stays below $\log t$ for all $j\le 2t$. On that event,
$$
\tau^{BM}(t-\log t)\le \tau(t)\le \tau^{BM}(t+\log t),
$$
where $\tau^{BM}(t)$ is the corresponding exit time for the Brownian motion.
Therefore, it is sufficient to show for the Brownian motion that $\mathbf E[\tau^{BM}(t)]=t+A\sqrt t +o(\sqrt t)$. This can be done as follows: (a) change the measure to remove the drift of the Brownian motion; (b) apply a linear transformation removing the correlation between the coordinates, which produces a standard 2d Brownian motion. The question then becomes one about the exit time of standard Brownian motion from a cone, and for that one can use the information about $\mathbf P(\tau>t,\,(U(t),V(t))\in dy)$ available in "Brownian motion in cones" (doi:10.1007/s004400050111).
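Step (b) can be made concrete: if $\Sigma$ is the covariance matrix of the driving Brownian motion and $\Sigma=LL^T$ is its Cholesky factorization, then $L^{-1}$ maps it to a standard 2d Brownian motion (and maps the quadrant to another cone, since the map is linear). A small numeric sketch with an assumed $\Sigma$:

```python
import numpy as np

rng = np.random.default_rng(2)

Sigma = np.array([[1.0, 0.6],          # assumed covariance of the correlated
                  [0.6, 2.0]])         # Brownian motion, for illustration
L = np.linalg.cholesky(Sigma)          # Sigma = L @ L.T
A = np.linalg.inv(L)                   # the decorrelating linear map

# Unit-time increments of the correlated (driftless) Brownian motion...
dB = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)
dW = dB @ A.T                          # ...become increments of a standard BM

emp_cov = np.cov(dW.T)
print(np.round(emp_cov, 3))            # close to the 2x2 identity
```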
The inequality you are citing should have a power 2 on the Cheeger constant (a.k.a. the bottleneck ratio), so it should read:
$$t_{\rm mix} \le C\log\left(\min_{v\in V}\dfrac{1}{\pi(v)}\right)\Phi(L)^{-2} \,.$$
This need not hold for simple random walk on a graph without loops; for example, it fails if the graph is bipartite, where the walk is periodic and the mixing time is infinite. The inequality does hold for the lazy simple random walk, i.e. when the number of loops added at each vertex equals its degree. In the most general case you also need to bound the most negative eigenvalue of the chain. For the simplest form of the inequality, combine inequality (12.10) with Theorem 13.10 (due to Jerrum–Sinclair and Lawler–Sokal) in [1].
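To make the quantities concrete, here is a small brute-force sketch (not tied to your graph: lazy simple random walk on the $8$-cycle, which without laziness is bipartite). It uses the bound in one concrete form, $t_{\rm mix}(1/4)\le \log(4/\pi_{\min})\cdot 2\,\Phi(L)^{-2}$, obtained by chaining the spectral-gap bound $\gamma\ge \Phi^2/2$ with $t_{\rm mix}(\varepsilon)\le t_{\rm rel}\log(1/(\varepsilon\pi_{\min}))$:

```python
import itertools
import numpy as np

# Lazy simple random walk on the n-cycle; for even n the cycle is bipartite,
# which is exactly the failure mode above, and laziness fixes it.
n = 8
P = np.zeros((n, n))
for v in range(n):
    P[v, (v - 1) % n] += 0.25
    P[v, (v + 1) % n] += 0.25
    P[v, v] += 0.5                       # hold with probability 1/2
pi = np.full(n, 1.0 / n)                 # stationary distribution (regular graph)

# Bottleneck ratio Phi(L) = min over S with pi(S) <= 1/2 of Q(S, S^c)/pi(S),
# where Q(x, y) = pi(x) P(x, y); brute force over all subsets.
Q = pi[:, None] * P
Phi = min(
    Q[np.ix_(list(S), [v for v in range(n) if v not in S])].sum()
    / pi[list(S)].sum()
    for r in range(1, n // 2 + 1)
    for S in itertools.combinations(range(n), r)
)

# Exact t_mix(1/4): smallest t with max_x ||P^t(x, .) - pi||_TV <= 1/4.
t, Pt = 0, np.eye(n)
while 0.5 * np.abs(Pt - pi).sum(axis=1).max() > 0.25:
    Pt, t = Pt @ P, t + 1

bound = np.log(4.0 / pi.min()) * 2.0 / Phi**2
print(f"Phi(L) = {Phi:.4f}, t_mix = {t}, bound = {bound:.0f}")
```

As expected for a cycle, the bottleneck set is a half-arc ($\Phi(L)=1/8$ here), and the actual mixing time sits comfortably under the Cheeger bound.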
[1] D. A. Levin and Y. Peres, Markov Chains and Mixing Times, 2nd edition, https://pages.uoregon.edu/dlevin/MARKOV/mcmt2e.pdf
It's $\dfrac{1}{qs} - G(s)$, so you could call it $E\left[ \dfrac{1}{qs} - s^\tau \right]$. Of course it is not a probability generating function: its coefficients are negative except for the $1/(qs)$ term, and it does not equal $1$ at $s=1$.
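Both failures are easy to see in a concrete case. Purely as an assumed illustration (the thread's setup isn't restated here), take $\tau\sim\mathrm{Geometric}(p)$ on $\{1,2,\dots\}$ with $q=1-p$, so $G(s)=ps/(1-qs)$:

```python
# Assumed example: tau ~ Geometric(p) on {1, 2, ...}, q = 1 - p,
# so G(s) = E[s^tau] = p*s / (1 - q*s).
p = 1 / 3
q = 1 - p

G = lambda s: p * s / (1 - q * s)          # valid for |q*s| < 1
f = lambda s: 1 / (q * s) - G(s)           # the expression in question

# Laurent coefficients of f: +1/q at s^(-1), and -P(tau = k) < 0 at s^k.
coeff = lambda k: 1 / q if k == -1 else -p * q ** (k - 1)

print(f(1.0))                              # equals 1/q - 1, not 1, so not a PGF
print([round(coeff(k), 4) for k in range(1, 5)])   # all negative
```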