Based on your most recent comment, I think you should consider
a 2-state Markov chain to produce a sequence of random variables
$X_i,$ taking values in $\{0, 1\},$ roughly as follows:
Start with
a deterministic or random $X_1.$ Then
(i) $P\{X_{i+1} = 1|X_i = 0\} = \alpha,$ and
(ii) $P\{X_{i+1} = 0|X_i = 1\} = \beta.$
The parameters $\alpha$ and $\beta$ are the respective
probabilities of 'changing state' from one $X_i$ to the next. To avoid certain kinds of deterministic sequences, you may want to use $0 < \alpha, \beta < 1.$ If $\alpha = 1 - \beta,$ then the probability of moving to state 1 is $\alpha$ regardless of the current state, so the sequence is
independent.
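If you want to experiment numerically, here is a minimal Python sketch of the scheme above (the function name `simulate_chain` and the use of the standard `random` module are my own choices, purely for illustration):

```python
import random

def simulate_chain(alpha, beta, n, x1=0):
    """Simulate n steps of the 2-state Markov chain.

    alpha = P(next state is 1 | current state is 0)
    beta  = P(next state is 0 | current state is 1)
    """
    x = x1
    states = [x]
    for _ in range(n - 1):
        if x == 0:
            x = 1 if random.random() < alpha else 0
        else:
            x = 0 if random.random() < beta else 1
        states.append(x)
    return states
```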
By induction, one can show that
$$P\{X_{1+r} = 0|X_1 = 0\} = \frac{\beta}{\alpha+\beta}
+ \frac{\alpha(1-\alpha - \beta)^r}{\alpha+\beta}.$$
If $|1-\alpha - \beta| < 1$, then in the long run
$P\{X_n = 0\} \approx \beta/(\alpha+\beta),$ regardless of the
value of $X_1.$
Moreover, there are similar formulas for the '$r$-step transitions'
from 0 to 1, 1 to 0, and 1 to 1. Of course, I am skipping over
a lot of detail here.
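To see these $r$-step formulas concretely, you can raise the one-step transition matrix to the $r$-th power and compare the $(0,0)$ entry with the closed form above. A quick sketch, assuming `numpy` and arbitrary example values of $\alpha,$ $\beta,$ and $r$:

```python
import numpy as np

alpha, beta, r = 0.3, 0.5, 10   # example values, chosen arbitrarily

# One-step transition matrix; row i gives P(X_{k+1} = j | X_k = i).
P = np.array([[1 - alpha, alpha],
              [beta,      1 - beta]])

Pr = np.linalg.matrix_power(P, r)   # r-step transition probabilities

closed_form = beta/(alpha + beta) + alpha*(1 - alpha - beta)**r/(alpha + beta)
print(Pr[0, 0], closed_form)   # the two numbers should agree
print(Pr)   # both rows approach (beta, alpha)/(alpha + beta) as r grows
```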
Perhaps
there is a rich enough variety of models here to satisfy your
curiosity as to what happens when independence fails in this way.
Later chapters in many probability books have a complete
development of the theory of 2-state Markov chains. Also there
are several good elementary books just on Markov chains.
[Google '2-state Markov Chain'. One reference among many is Chapter 6 of Suess and Trumbo (2010), Springer, in which I have a personal interest.]
I think you made a mistake in the calculation. First, note that a geometric random variable is at least $1$; since $Z = X + Y$ is the sum of two of them, we need $z \ge 2$.
\begin{align*}
P[Z=z] &= \sum_{k=1}^{z-1} P[X=k]P[Y=z-k]\\
&=\sum_{k=1}^{z-1} p(1-p)^{k-1} \cdot q(1-q)^{z-k-1}\\
&=\frac{pq}{1-p} (1-q)^{z-1}\sum_{k=1}^{z-1} \left(\frac{1-p}{1-q}\right)^k \\
&=\frac{pq}{1-p} (1-q)^{z-1} \cdot \frac{1-p}{1-q}\cdot\frac{1-\left(\frac{1-p}{1-q}\right)^{z-1}}{1-\frac{1-p}{1-q}}\\
&=\frac{pq}{p-q} (1-q)^{z-1} \cdot \left[1 -\left(\frac{1-p}{1-q}\right)^{z-1}\right]\\
&=\frac{pq}{p-q}[(1-q)^{z-1} - (1-p)^{z-1}],
\end{align*}
which is the desired result. Note that the derivation assumes $p \neq q$; when $p = q,$ the sum gives $P[Z=z] = (z-1)\,p^2(1-p)^{z-2}$ directly.
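As a quick numerical sanity check (a sketch of my own, with arbitrary example values of $p \neq q$), the convolution sum and the closed form can be compared directly in Python:

```python
p, q = 0.3, 0.6   # example parameters with p != q

for z in range(2, 8):
    # Convolution sum over the ways to split z between X and Y.
    conv = sum(p*(1-p)**(k-1) * q*(1-q)**(z-k-1) for k in range(1, z))
    # Closed form derived above.
    closed = p*q/(p - q) * ((1-q)**(z-1) - (1-p)**(z-1))
    print(z, conv, closed)   # the two columns should match
```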
Best Answer
For a geometric random variable $Y$ with parameter $y$,
$$\begin{align}
P\{Y > n\} &= \sum_{i=n+1}^\infty P\{Y = i\} = \sum_{i=n+1}^\infty y(1-y)^{i-1}\\
&= y(1-y)^n\left[1 + (1-y) + (1-y)^2 + \cdots \right]\\
&= y(1-y)^n \times \frac{1}{1-(1-y)}\\
&=(1-y)^n.
\end{align}$$
Actually, the end result is more easily derived by arguing that the event $\{Y > n\}$ occurs if and only if the first $n$ trials ended in failure, and the probability of this is just $(1-y)^n$.

Regardless of which way you computed $P\{Y > n\}$, you can compute
$$P\{X = n, Y> n\} = P\{X = n\}P\{Y > n\} = x(1-x)^{n-1}(1-y)^n$$
and hence
$$\begin{align}
P\{X < Y\} &= \sum_{n=1}^\infty P\{X = n, Y > n\}\\
&= \sum_{n=1}^\infty x(1-x)^{n-1}(1-y)^n\\
&= x(1-y) \left[1 + (1-x)(1-y) + \bigl((1-x)(1-y)\bigr)^2 + \cdots\right]\\
&= x(1-y)\times \frac{1}{1 - (1-x)(1-y)}\\
&= \frac{x(1-y)}{x+y-xy}.
\end{align}$$
This answer too has a nice interpretation. Let us carry out the two sequences of trials simultaneously, in pairs. We can ignore every pair in which both trials ended in failure (the next pair of trials must be conducted) and concentrate on the very first pair in which success occurred on at least one of the trials, possibly on both. Conditioned on at least one success, an event of probability $x+y-xy$ (remember that $P(A\cup B) = P(A)+P(B)-P(A)P(B)$ for independent events), the event $\{X < Y\}$ is just the event that the $X$-trial succeeded and the $Y$-trial did not, which has unconditional probability $x(1-y)$. Thus we have
$$P\{X < Y\} = \frac{x(1-y)}{x+y-xy}, \qquad P\{X > Y\} = \frac{(1-x)y}{x+y-xy}, \qquad P\{X = Y\} = \frac{xy}{x+y-xy}$$
without any need for summing series, or indeed without even writing down the joint pmf.
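If you would like to confirm all three probabilities by simulation, here is a short Monte Carlo sketch of my own (the helper `geometric` and the parameter values are illustrative, not part of the answer above):

```python
import random

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

x_param, y_param, trials = 0.3, 0.5, 100_000
lt = gt = eq = 0
for _ in range(trials):
    X, Y = geometric(x_param), geometric(y_param)
    if X < Y:
        lt += 1
    elif X > Y:
        gt += 1
    else:
        eq += 1

d = x_param + y_param - x_param*y_param
print(lt/trials, x_param*(1 - y_param)/d)   # estimate vs x(1-y)/(x+y-xy)
print(gt/trials, (1 - x_param)*y_param/d)   # estimate vs (1-x)y/(x+y-xy)
print(eq/trials, x_param*y_param/d)         # estimate vs xy/(x+y-xy)
```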