You have two sequences, $p_1,p_2,p_3,\ldots$ and $\lambda_1,\lambda_2,\lambda_3,\ldots$, where the first is non-negative and sums to $1$.
You also have a method of calculating the second sequence from the first.
For (a) you need to show that the algorithm for sampling has the desired effect.
So, using the algorithm with your example,
$$P(X=2) = P(U_1 \ge \lambda_1)P(U_2 \lt \lambda_2) =(1-\lambda_1) \lambda_2 $$ $$= \left(1-\dfrac{p_1}{1}\right)\left(\dfrac{p_2}{1-p_1}\right) = p_2$$
which is the result you want.
You need to extend this to all other possible values of $X$.
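As a sketch (my own illustration, not from the original question), the hazards and the sampling step might look like this in Python, assuming a finite pmf and $\lambda_i = p_i/(1 - p_1 - \cdots - p_{i-1})$:

```python
import math
import random

def hazards_from_pmf(p):
    """Discrete hazards lambda_i = p_i / (1 - p_1 - ... - p_{i-1})."""
    lam, tail = [], 1.0
    for p_i in p:
        lam.append(p_i / tail)
        tail -= p_i
    return lam

def sample(lam, u=random.random):
    """Return the first index i (1-based) with U_i < lambda_i."""
    for i, lam_i in enumerate(lam, start=1):
        if u() < lam_i:
            return i
    return len(lam)  # safety net: the last hazard should already be 1

# Check the algebra for P(X = 2): (1 - lambda_1) * lambda_2 == p_2
p = [0.5, 0.3, 0.2]
lam = hazards_from_pmf(p)
assert math.isclose((1 - lam[0]) * lam[1], p[1])
```

The same telescoping product shows $P(X=k)=(1-\lambda_1)\cdots(1-\lambda_{k-1})\lambda_k = p_k$ for every $k$.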
My question is about the possibility of showing equivalence between
the hazard rate, the conditional probability (of failure) and a
likelihood function.
TL;DR: there is no such equivalence.
Likelihood is defined as
$$ \mathcal{L}(\theta \mid x_1,\dots,x_n) = \prod_{i=1}^n f_\theta(x_i) $$
so it is a product of probability density functions evaluated at $x_i$ points, given some fixed value of parameter $\theta$.
So it has nothing to do with the hazard rate, since the hazard rate is the probability density function evaluated at the point $x_i$ and parametrized by $\theta$, divided by the survival function evaluated at $x_i$ and parametrized by $\theta$:
$$ h(x_i) = \frac{f_\theta(x_i)}{1-F_\theta(x_i)} $$
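To make the distinction concrete, here is a small sketch (my own example, not from the answer) using the exponential distribution with rate $\theta$, for which the hazard is constant and equal to $\theta$, while the likelihood is a product of densities:

```python
import math

# For Exponential(theta): f_theta(x) = theta * exp(-theta * x),
# F_theta(x) = 1 - exp(-theta * x).
def density(x, theta):
    return theta * math.exp(-theta * x)

def survival(x, theta):
    return math.exp(-theta * x)

def hazard(x, theta):
    # h(x) = f_theta(x) / (1 - F_theta(x)); constant = theta here
    return density(x, theta) / survival(x, theta)

def likelihood(xs, theta):
    # L(theta | x_1, ..., x_n) = product of densities at the x_i
    prod = 1.0
    for x in xs:
        prod *= density(x, theta)
    return prod
```

The hazard is a function of a single point $x_i$, while the likelihood is a function of $\theta$ built from the whole sample: different objects with different arguments.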
Moreover, likelihood is not a probability, and it is not a conditional probability. It becomes a conditional probability only under the Bayesian understanding of likelihood, i.e. if you assume that $\theta$ is a random variable.
Your understanding of conditional probability also seems to be wrong:
> Intuitively, all conditional probabilities are purely multiplicative processes. [...] Is it generally true, that if all conditional probabilities are purely multiplicative processes of numbers between one and zero, they all decrease with some exponential rate $\lambda$?
The answer is no. Conditional probability is a relation between joint and individual probabilities
$$ P(A \mid B) = \frac{P(A \cap B)}{P(B)} $$
So it is not a process, and it is not multiplicative. Moreover, the multiplicative relation is the definition of independence
$$ P(A \cap B) = P(A)\,P(B) $$
or equivalently
$$ P(A \mid B) = P(A) $$
Even if you are talking about a random process, this is not true in general. To give an example, imagine a series of coin tosses: if they are independent, then the probability of tossing heads given that the previous toss resulted in tails is simply the (unconditional) probability of tossing heads.
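The coin-toss example can be checked by direct enumeration (a sketch of mine, not part of the answer):

```python
from itertools import product

# Two independent fair coin tosses: 4 equally likely outcomes.
outcomes = list(product("HT", repeat=2))
p = 1 / len(outcomes)

# B: first toss is tails; A: second toss is heads.
p_B = sum(p for o in outcomes if o[0] == "T")
p_A_and_B = sum(p for o in outcomes if o == ("T", "H"))

# P(A | B) = P(A and B) / P(B) equals the unconditional P(A) = 0.5.
print(p_A_and_B / p_B)  # 0.5
```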
Best Answer
The hazard is indeed a rate: the expected number of events per unit of time, conditional on being at risk, i.e. not having died before. Say we are studying the time until you get the flu (influenza), we measure time in months, and we get a hazard rate of 0.10; that is, a person is expected to get the flu 0.10 times per month, assuming the hazard remains constant during that month. We could just as well measure time in decades (120 months), and we would get a hazard rate of 12, i.e. a person is expected to get the flu 12 times per decade. These are just different ways of saying the exact same thing.
This is easy to see with something like the flu, which you can easily get multiple times. It is a bit harder to see when we talk about dying, which typically happens only once. But that is a substantive problem: from a statistical point of view the expectation could still be larger than 1, which means you are unlikely to survive a unit of time.
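The unit conversion in the flu example is just a rescaling of a constant hazard; a minimal sketch (the function name is mine):

```python
def rescale_hazard(rate, from_months, to_months):
    """Rescale a constant hazard rate from one time unit to another,
    with both units expressed in months."""
    return rate * (to_months / from_months)

# 0.10 events per month is 12 events per decade (120 months).
```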