Assume without loss of generality that $0 < p_1 \leq p_2 \leq \cdots \leq p_k$ and $X = X_1 + \cdots + X_k$ where $X_i$ are independent geometric random variables with parameter $p_i$. Let $\mu = \mathbb E X = \sum_{m=1}^k \frac{1}{p_m}$.
Below, we show a considerably stronger result than that claimed in the question. In particular:
Claim: For any $c \geq 2$ and any $k$, we have
$$\renewcommand{\Pr}{\mathbb P}\newcommand{\E}{\mathbb E}
\Pr(X > c (\mu + t) ) \leq (1-p_1)^t \exp(-(2c-3)k/4) \>.
$$
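Before the proof, here is a quick Monte Carlo sanity check of one simple instance of the claim (not the general statement); the parameter choices $p_m = p = 0.5$, $k = 8$, $c = 2$, $t = 0$ are illustrative assumptions, not from the claim itself:

```python
import math
import random

def geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while rng.random() >= p:
        n += 1
    return n

def tail_estimate(p, k, c, trials, seed=0):
    """Empirical P(X > c*mu) for X a sum of k i.i.d. Geometric(p) variables."""
    rng = random.Random(seed)
    mu = k / p
    hits = sum(
        1
        for _ in range(trials)
        if sum(geometric(p, rng) for _ in range(k)) > c * mu
    )
    return hits / trials

p, k, c = 0.5, 8, 2                     # illustrative parameters
bound = math.exp(-(2 * c - 3) * k / 4)  # claimed bound with t = 0
est = tail_estimate(p, k, c, trials=20000)
print(est, bound)                       # empirical tail vs. the bound
```

With these parameters the empirical tail probability comes out far below the bound $e^{-k/4} \approx 0.135$.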
Proof: By Markov's inequality applied to $e^{s X}$, for any $s > 0$,
$$
\Pr(X > c (\mu + t) ) \leq e^{-s c t} e^{-s c \mu} \prod_{m=1}^k \E e^{s X_m} \>.
$$
It is an easy exercise to check that, for $0 < s < \log\frac{1}{1-p_m}$,
$$
\E e^{s X_m} = \frac{p_m e^s}{1-(1-p_m) e^s} = \Big(1-\frac{1-e^{-s}}{p_m}\Big)^{-1} \>.
$$
Let $s = -\frac{1}{c}\log(1-p_1)$, so that $e^{-sc} = 1-p_1$; since $c \geq 1$ and $p_1 \leq p_m$, this choice satisfies $s < \log\frac{1}{1-p_m}$ for every $m$. Then, we get that
$$
\Pr(X > c (\mu + t) ) \leq (1-p_1)^t \exp( a \mu - {\textstyle\sum_{m=1}^k} \log(1-b/p_m)) \>,
$$
where $a = \log(1-p_1)$ and $b = 1-(1-p_1)^{1/c}$.
Recalling the definition of $\mu$ and concentrating on the exponent of the last term, we need to find some $c$ such that
$$
\sum_{m=1}^k \frac{a}{b}\frac{b}{p_m} - \log(1 - b/p_m) < 0 \>.
$$
Letting $c \geq 2$ and using Bernoulli's inequality, we have $b = 1-(1-p_1)^{1/c} \leq p_1/c \leq p_1 / 2$ and so, in particular $b/p_m \leq 1/2$ for all $m$.
Since, for $0 < z \leq 1/2$, the inequality $\log(1-z) \geq -z - z^2$ holds, we get that
$$
\sum_{m=1}^k \frac{a}{b}\frac{b}{p_m} - \log(1 - b/p_m) \leq \frac{k}{2} (\frac{a}{b} + \frac{3}{2}) \>.
$$
But, $a = \log(1-p_1) = c \log(1-b)$ and $\log(1-b) \leq -b$, so
$$
\frac{a}{b} = \frac{c\log(1-b)}{b} \leq -c \>.
$$
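The inequality $a/b \leq -c$ can also be confirmed numerically over a grid of parameters (the grid below is an arbitrary choice):

```python
import math

# Check a/b <= -c for a = log(1 - p1) and b = 1 - (1 - p1)^(1/c):
# since a = c*log(1 - b), this reduces to log(1 - x) <= -x.
for c in (2, 3, 5):
    for i in range(1, 100):
        p1 = i / 100
        a = math.log(1 - p1)
        b = 1 - (1 - p1) ** (1 / c)
        assert a / b <= -c + 1e-9
print("a/b <= -c on the whole grid")
```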
Putting everything together, we have shown that
$$
\Pr(X > c (\mu + t) ) \leq (1-p_1)^t \exp(-(2c-3)k/4)
$$
as claimed.
Notes: We observe the following:
- We can take $c = 2$ even in the i.i.d. case, where $p_m = p > 0$ for all $m$.
- The special case $p_m = m/k$ corresponds to the coupon-collector problem, for which sharper asymptotics, as well as upper and lower bounds, are known.
- Coupon collecting is also closely related to the birthday problem.
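To illustrate the coupon-collector connection: the collection time is a sum of independent geometric variables, so its mean should be close to $k H_k = \sum_{m=1}^{k} k/m$. Here is a small simulation (the values $k = 20$ and 5000 repetitions are arbitrary choices):

```python
import random

def coupon_collector_time(k, rng):
    """Draws until all k coupon types have been seen at least once."""
    seen, draws = set(), 0
    while len(seen) < k:
        draws += 1
        seen.add(rng.randrange(k))
    return draws

rng = random.Random(1)
k, trials = 20, 5000
avg = sum(coupon_collector_time(k, rng) for _ in range(trials)) / trials
harmonic = sum(1 / m for m in range(1, k + 1))
print(avg, k * harmonic)  # empirical mean vs. k * H_k
```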
You can handle this with some simple bookkeeping. If $p$ is the probability of success, then you get
$$
P(X=2)=p^2+(1-p)^2=1-2p+2p^2,
$$
because either you have two successes or two failures, and these two events are disjoint.
Next,
$$
P(X=3)=(1-p)p^2+p(1-p)^2=(1-p)p.
$$
Two more:
$$
P(X=4)=p(1-p)p^2+(1-p)p(1-p)^2=(1-p)p(1-2p+2p^2).
$$
$$
P(X=5)=(1-p)p(1-p)p^2+p(1-p)p(1-p)^2=(1-p)^2p^2.
$$
Do you see a pattern here? If not, try a couple more values of $n$. Finally, you get
$$
P(X=2k)=(p(1-p))^{k-1}(1-2p+2p^2),\quad k=1,2,\ldots
$$
$$
P(X=2k+1)=(p(1-p))^{k},\quad k=1,2,\ldots
$$
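A brute-force check of these formulas, assuming (as the computations above suggest) that $X$ is the index of the first toss whose outcome matches the previous one:

```python
from itertools import product

def exact_pmf(p, n):
    """P(X = n) by summing over all 2^n outcome sequences, where X is the
    first toss whose outcome equals that of the previous toss."""
    q = 1 - p
    total = 0.0
    for seq in product((0, 1), repeat=n):
        # the first repeat must occur exactly at position n (1-indexed)
        if any(seq[i] == seq[i - 1] for i in range(1, n - 1)):
            continue
        if seq[n - 1] != seq[n - 2]:
            continue
        prob = 1.0
        for x in seq:
            prob *= p if x == 1 else q
        total += prob
    return total

p, q = 0.3, 0.7
for k in (1, 2, 3):
    even = (p * q) ** (k - 1) * (1 - 2 * p + 2 * p * p)
    odd = (p * q) ** k
    assert abs(exact_pmf(p, 2 * k) - even) < 1e-12
    assert abs(exact_pmf(p, 2 * k + 1) - odd) < 1e-12
print("pmf formulas verified for n = 2..7")
```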
The sanity check: writing $q = 1-p$, so that $pq = p(1-p)$ and $p^2+q^2 = 1-2p+2p^2$,
$$
\sum_{k\geq 1} \big[P(X=2k)+P(X=2k+1)\big] = \frac{1-2pq}{1-pq} + \frac{pq}{1-pq} = 1
$$
as it should be.
Using the distribution just found, it is possible to compute $E[X]$ and $\operatorname{Var}[X]$ if you know how to evaluate sums of the form
$$
\sum_k kx^k,\quad \sum_k k^2x^k.
$$
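For instance, the mean computed from this distribution can be compared against a direct simulation; the interpretation of $X$ as the first toss that repeats the previous outcome, and the value $p = 0.3$, are assumptions of this sketch:

```python
import random

def mean_from_pmf(p, n_max=500):
    """E[X] from the derived pmf, truncating the convergent series."""
    q = 1 - p
    r = p * q
    total = 0.0
    for k in range(1, n_max):
        total += 2 * k * r ** (k - 1) * (p * p + q * q)  # P(X = 2k) terms
        total += (2 * k + 1) * r ** k                    # P(X = 2k+1) terms
    return total

def mean_by_simulation(p, trials=20_000, seed=7):
    """Average toss count until an outcome repeats the previous one."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        prev = rng.random() < p
        n = 1
        while True:
            n += 1
            cur = rng.random() < p
            if cur == prev:
                break
            prev = cur
        total += n
    return total / trials

p = 0.3
print(mean_from_pmf(p), mean_by_simulation(p))  # the two should nearly agree
```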
Best Answer
One can think of $\text{NB}(r,q)$ as the number of failures that occur before the experiment is stopped at the $r$th success, with $q$ being the probability of failure. (Note that I have reversed the roles of success and failure.)
Thus, if $T_k$ denotes the number of trials needed to obtain $k$ successes, then $T_k - k$ follows the $\text{NB}(k, 1-p)$ distribution, so $$E[T_k] = k + \frac{(1-p) k}{p} = \frac{k}{p}.$$
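A quick simulation check of this identity (the values $k = 5$ and $p = 0.4$ are arbitrary):

```python
import random

def trials_until_kth_success(k, p, rng):
    """T_k: number of Bernoulli(p) trials needed to see the k-th success."""
    n = successes = 0
    while successes < k:
        n += 1
        if rng.random() < p:
            successes += 1
    return n

rng = random.Random(42)
k, p, trials = 5, 0.4, 20_000
avg = sum(trials_until_kth_success(k, p, rng) for _ in range(trials)) / trials
print(avg, k / p)  # empirical mean vs. k/p = 12.5
```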