Solved – Joint distribution of dependent exponential variables

Tags: exponential distribution, joint distribution, multivariate analysis

I am trying to model a joint probability (chance) constraint on a union of events involving correlated or uncorrelated exponential random variables.

Formally, assume we have random variables $A_i \sim \mathrm{Exp}(\theta_i)$ for all $i \in \{1, \dots, n\}$. The rate of each exponential distribution is itself a random variable; suppose it follows some general distribution, so $\theta_i \sim G_i(\cdot)$. The objective is to obtain a closed-form expression for the following chance constraint:

$$\Pr\left( \bigcup_{i=1}^{m} (A_i > s_i) \right) \leq \epsilon, \quad \text{for constants } s_i.$$

  1. If we assume the random variables $A_i$ are independent, then the joint constraint becomes $\sum_{i=1}^{m} \Pr(A_i \geq s_i) \leq \epsilon$. In that case each $\Pr(A_i \geq s_i)$ can be modeled as a mixture of exponential distributions. If $G_i(\cdot)$ is a discrete probability distribution, we could then easily obtain a closed-form expression for the constraint. (Is this correct?)

  2. If the $\theta_i$'s are correlated, I am stuck on what is possible. If I ignore the dependence and assume independence, I guess I could get an upper bound on the joint constraint. I also wonder whether assuming linear dependence between the $\theta_i$'s would make sense: I could pose $\theta_i = a_i \lambda + b_i$, with $a_i$ and $b_i$ different linear coefficients and $\lambda$ a common random variable. Again, I am unsure whether this makes sense. I have never used copulas, but I guess they could be useful here. (Any comments appreciated.)
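To make point 1 concrete, here is a small numerical sanity check of the mixture claim: if $\theta$ takes the value $\lambda_k$ with probability $p_k$ (a discrete mixing distribution $G$) and $A \mid \theta \sim \mathrm{Exp}(\theta)$, then $\Pr(A > s) = \sum_k p_k e^{-\lambda_k s}$, i.e. a mixture of exponential survival functions. The particular rates, weights, and threshold below are hypothetical values chosen only for illustration.

```python
import math
import random

random.seed(0)

lam = [0.5, 1.0, 2.0]   # possible rates lambda_k (hypothetical values)
p   = [0.2, 0.5, 0.3]   # mixing weights P(theta = lambda_k)
s   = 1.5               # threshold

# Closed-form mixture survival probability: sum_k p_k * exp(-lambda_k * s).
exact = sum(pk * math.exp(-lk * s) for pk, lk in zip(p, lam))

# Monte Carlo check: draw theta from G, then A ~ Exp(theta), count A > s.
n = 200_000
hits = 0
for _ in range(n):
    theta = random.choices(lam, weights=p)[0]
    if random.expovariate(theta) > s:
        hits += 1
mc = hits / n

print(exact, mc)  # the two values should agree to roughly two decimals
```

The same check extends to any discrete $G$; with a continuous $G$ the sum becomes an integral (the survival function is then the Laplace transform of $G$ evaluated at $s$).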

I would appreciate your comments.

Best Answer

The union bound is true whether the variables are independent or not. The inequality

$$P(\bigcup_i (A_i \gt s_i)) \le \sum_i P(A_i \gt s_i)$$

does not require the $A_i$ to be independent, so if that's good enough then you have what you need already. If you assume independence, you get an equality,

$$P(\bigcup_i (A_i \gt s_i)) = 1 - \prod_i (1-P(A_i \gt s_i)) $$

but in many situations where you are trying to bound the probability by something small, you might use the union bound anyway even if you know the variables are independent: $\sum_i P(A_i \gt s_i)$ is simpler than $1 - \prod_i (1-P(A_i \gt s_i))$, and when the probabilities are small the two hardly differ.
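The closing remark is easy to verify numerically. The sketch below, with fixed (assumed) rates and thresholds chosen so each tail probability is small, compares the union bound $\sum_i P(A_i > s_i)$ with the exact value $1 - \prod_i (1 - P(A_i > s_i))$ for independent exponentials:

```python
import math

theta = [1.0, 2.0, 0.5]    # fixed rates (assumed values for illustration)
s     = [5.0, 3.0, 10.0]   # thresholds chosen so each tail is small

# Exponential tail probabilities P(A_i > s_i) = exp(-theta_i * s_i).
tails = [math.exp(-t * si) for t, si in zip(theta, s)]

union_bound = sum(tails)                             # holds with or without independence
exact = 1.0 - math.prod(1.0 - q for q in tails)      # exact only under independence

print(union_bound, exact)  # nearly equal; the bound is slightly larger
```

Here each tail is on the order of $e^{-5}$ to $e^{-6}$, so the bound and the exact value differ only in the fourth decimal place; for larger tail probabilities the gap widens, and the product formula becomes worth the extra effort.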