Based on your most recent comment, I think you should consider
a 2-state Markov chain to produce a sequence of random variables
$X_i,$ taking values in $\{0, 1\},$ roughly as follows:
Start with
a deterministic or random $X_1.$ Then
(i) $P\{X_{i+1} = 1|X_i = 0\} = \alpha,$ and
(ii) $P\{X_{i+1} = 0|X_i = 1\} = \beta.$
The parameters $\alpha$ and $\beta$ are the respective
probabilities of 'changing state' from one $X_i$ to the next.
To avoid certain kinds of degenerate deterministic sequences, you may want to require $0 < \alpha, \beta < 1.$
If $\alpha = 1 - \beta,$ then the $X_i$ are independent.
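As a quick illustration, here is a small simulation of such a chain (the function name and the parameter values $\alpha = 0.3$, $\beta = 0.1$ are my own arbitrary choices for demonstration):

```python
import random

def simulate_chain(alpha, beta, n, x1=0, seed=None):
    """Simulate n steps of the 2-state Markov chain.

    alpha = P(next state is 1 | current state is 0)
    beta  = P(next state is 0 | current state is 1)
    """
    rng = random.Random(seed)
    x = x1
    path = [x]
    for _ in range(n - 1):
        if x == 0:
            x = 1 if rng.random() < alpha else 0
        else:
            x = 0 if rng.random() < beta else 1
        path.append(x)
    return path

path = simulate_chain(alpha=0.3, beta=0.1, n=100_000, x1=0, seed=42)
frac_zero = path.count(0) / len(path)
print(frac_zero)  # should be close to beta/(alpha+beta) = 0.25
```

The long-run fraction of time spent in state 0 settles near $\beta/(\alpha+\beta)$, anticipating the limit result below.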
By induction, one can show that
$$P\{X_{1+r} = 0|X_1 = 0\} = \frac{\beta}{\alpha+\beta}
+ \frac{\alpha(1-\alpha - \beta)^r}{\alpha+\beta}.$$
If $|1-\alpha - \beta| < 1$, then in the long run
$P\{X_n = 0\} \approx \beta/(\alpha+\beta),$ regardless of the
value of $X_1.$
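If you want to convince yourself of the $r$-step formula without doing the induction, you can check it against direct powers of the transition matrix. A minimal sketch (parameter values arbitrary; rows/columns ordered state 0, state 1):

```python
# Verify the r-step transition formula against direct matrix powering.
# P[i][j] = probability of moving from state i to state j in one step.
alpha, beta = 0.3, 0.1

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1 - alpha, alpha],
     [beta, 1 - beta]]

Pr = [[1.0, 0.0], [0.0, 1.0]]  # identity = P^0
for r in range(1, 11):
    Pr = matmul(Pr, P)  # now Pr = P^r
    closed_form = (beta / (alpha + beta)
                   + alpha * (1 - alpha - beta) ** r / (alpha + beta))
    # P^r[0][0] is exactly P(X_{1+r} = 0 | X_1 = 0)
    assert abs(Pr[0][0] - closed_form) < 1e-12
print("formula matches P^r for r = 1..10")
```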
Moreover, there are similar formulas for the '$r$-step transitions'
from 0 to 1, 1 to 0, and 1 to 1. Of course, I am skipping over
a lot of detail here.
Perhaps
there is a rich enough variety of models here to satisfy your
curiosity as to what happens when independence fails in this way.
Later chapters in many probability books have a complete
development of the theory of 2-state Markov chains. Also there
are several good elementary books just on Markov chains.
[Google '2-state Markov Chain'. One reference among many is Chapter 6 of Suess and Trumbo (2010), Springer, in which I have a personal interest.]
Your definition of a geometric random variable is not quite consistent with the usual definition; usually one says that $X$ is the trial on which the first success occurs (in a sequence of independent Bernoulli($p_1$) trials).
So that means
$$\mathbf{P}(X = k) = (1-p_1)^{k-1} p_1, \qquad k = 1,2,\ldots$$
In particular, the distribution is defined only for integers greater than or equal to $1$. In your definition (which I will denote $\widehat X$), you allow $\mathbf{P}(\widehat X = 0)$ to be non-zero; that is, you assume the probability mass function is
$$\mathbf{P}\left( \widehat X = k\right) = (1-p_1)^k p_1, \qquad k = 0,1,\ldots$$
This also has an interpretation: you are counting the number of failures before the first success, so your definition is equivalent to
$$ \widehat X = X-1.$$
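This shift relation between the two conventions is easy to check numerically. A small sketch (the value of $p$ and the function names are my own, for illustration):

```python
# Check that the two parametrizations differ by a shift of 1:
# P(X = k)    = (1-p)^(k-1) p  for k >= 1  (trial of first success)
# P(Xhat = k) = (1-p)^k p      for k >= 0  (failures before first success)
p = 0.37  # arbitrary success probability for illustration

def pmf_X(k, p):
    return (1 - p) ** (k - 1) * p if k >= 1 else 0.0

def pmf_Xhat(k, p):
    return (1 - p) ** k * p if k >= 0 else 0.0

for k in range(0, 20):
    # Xhat = X - 1, so P(Xhat = k) should equal P(X = k + 1)
    assert abs(pmf_Xhat(k, p) - pmf_X(k + 1, p)) < 1e-15
print("P(Xhat = k) == P(X = k+1) for k = 0..19")
```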
From here we can determine the distribution of $Z = X + Y$ using the convolution method you describe (the terms with $k = 0$ and $k = n$ vanish because $\mathbf{P}(X = 0) = \mathbf{P}(Y = 0) = 0$):
\begin{align*}
\mathbf{P}(Z = n) & = \sum_{k=0}^n \mathbf{P}(X = k) \mathbf{P}(Y = n-k) \\
& = \sum_{k=1}^{n-1} \mathbf{P}(X = k) \mathbf{P}(Y = n-k) \\
& = p_1 p_2 \sum_{k=1}^{n-1} (1-p_1)^{k-1} (1-p_2)^{n-k-1} \\
& = p_1 p_2\frac{(1-p_2)^{n-1}}{(1-p_1)} \sum_{k=1}^{n-1} \left( \frac{1-p_1}{1-p_2} \right)^{k}.
\end{align*}
From here you can sum the finite geometric series (much as you do above), using $p_1 \neq p_2$, to derive
$$\mathbf{P}(Z = n) = \frac{p_1p_2}{p_1 - p_2}
\big( (1-p_2)^{n-1} - (1-p_1)^{n-1} \big), \qquad n = 2,3,\ldots
$$
Note that $Z$ can only take values greater than or equal to $2$.
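You can sanity-check this closed form against the direct convolution sum. A minimal sketch (the values of $p_1$, $p_2$ are arbitrary, but must be unequal for the formula to apply):

```python
p1, p2 = 0.3, 0.5  # arbitrary, but must satisfy p1 != p2

def pmf_geom(k, p):
    """P(X = k) = (1-p)^(k-1) p for k >= 1 (trial-of-first-success convention)."""
    return (1 - p) ** (k - 1) * p if k >= 1 else 0.0

def pmf_Z_direct(n):
    """Direct convolution sum over k = 1, ..., n-1."""
    return sum(pmf_geom(k, p1) * pmf_geom(n - k, p2) for k in range(1, n))

def pmf_Z_closed(n):
    """Closed form derived above, valid for n >= 2."""
    return p1 * p2 / (p1 - p2) * ((1 - p2) ** (n - 1) - (1 - p1) ** (n - 1))

for n in range(2, 30):
    assert abs(pmf_Z_direct(n) - pmf_Z_closed(n)) < 1e-12
total = sum(pmf_Z_closed(n) for n in range(2, 2000))
print(total)  # should be very close to 1
```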
Best Answer
Suppose $\Pr[X = n] = p_1(1-p_1)^n$ and $\Pr[Y = n] = p_2(1-p_2)^n$ for $n = 0,1,2,\ldots$, where $p_1 \neq p_2$. Then, for any $n = 0,1,2,\ldots$, we have:
$\Pr[Z = n]$ $= \Pr[X+Y = n]$ $= \displaystyle\sum_{k = 0}^{n}\Pr[X = k \ \text{AND} \ Y = n-k]$
$\overset{(1)}{=} \displaystyle\sum_{k = 0}^{n}\Pr[X = k]\Pr[Y = n-k]$ $= \displaystyle\sum_{k = 0}^{n}p_1(1-p_1)^kp_2(1-p_2)^{n-k}$
$= p_1p_2(1-p_2)^n\displaystyle\sum_{k = 0}^{n}\left(\dfrac{1-p_1}{1-p_2}\right)^k$ $\overset{(2)}{=} p_1p_2(1-p_2)^n\dfrac{1-\left(\tfrac{1-p_1}{1-p_2}\right)^{n+1}}{1-\tfrac{1-p_1}{1-p_2}}$
$= p_1p_2\dfrac{(1-p_2)^{n+1}}{1-p_2}\dfrac{1-\left(\tfrac{1-p_1}{1-p_2}\right)^{n+1}}{1-\tfrac{1-p_1}{1-p_2}}$ $= \dfrac{p_1p_2\left[(1-p_2)^{n+1}-(1-p_1)^{n+1}\right]}{(1-p_2)-(1-p_1)}$
$= \dfrac{p_1p_2\left[(1-p_2)^{n+1}-(1-p_1)^{n+1}\right]}{p_1-p_2}$.
$(1)$ This follows since $X$ and $Y$ are independent.
$(2)$ Here, we have used the formula for the sum of a geometric series, and the fact that $p_1 \neq p_2$.
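The same kind of numerical check confirms this zero-indexed version of the formula (the values of $p_1$, $p_2$ are arbitrary unequal choices for illustration):

```python
p1, p2 = 0.3, 0.5  # arbitrary, but must satisfy p1 != p2

def pmf(k, p):
    """P(X = k) = p (1-p)^k for k >= 0 (failures-before-success convention)."""
    return p * (1 - p) ** k if k >= 0 else 0.0

def pmf_Z_direct(n):
    """Direct convolution sum over k = 0, ..., n."""
    return sum(pmf(k, p1) * pmf(n - k, p2) for k in range(n + 1))

def pmf_Z_closed(n):
    """Closed form derived above, valid for n >= 0."""
    return p1 * p2 * ((1 - p2) ** (n + 1) - (1 - p1) ** (n + 1)) / (p1 - p2)

for n in range(0, 30):
    assert abs(pmf_Z_direct(n) - pmf_Z_closed(n)) < 1e-12
print("closed form matches direct convolution for n = 0..29")
```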
I'm not sure if this distribution has a name.