Density of a sum of independent discrete and continuous random vectors

probability, probability distributions, probability theory

It is a well-known fact that the sum of a discrete random variable and an independent continuous random variable is itself continuous. Is the same statement true for random vectors?

Specifically, I wanted to calculate the density of the vector
$$
(X_1, \ldots, X_n) + (U_1, \ldots, U_n)
$$

where $(X_1, \ldots, X_n)$ is a random vector with values in $\mathbb{Z}_+^{n}$ and $U_1, \ldots, U_n$ are i.i.d. with the uniform distribution on the interval $(0, 1)$, independent of the vector $(X_1, \ldots, X_n)$. I managed to compute the cumulative distribution function of the sum and differentiate it in all the variables, which gives a candidate for the density:
$$
\mathbb{R}_+^{n} \ni (x_1, \ldots, x_n) \mapsto \mathbb{P}(X_i = \lfloor x_i \rfloor, 1 \leq i \leq n)
$$

but I'm not sure how to prove that the integral of that function over $\mathbb{R}^n_+$ is equal to 1.
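(As a sanity check, not a proof: since $U \in (0,1)^n$, the sum $X + U$ lands in the unit cube $\prod_i [k_i, k_i + 1)$ exactly when $X = k$, so the mass in each cube should equal $\mathbb{P}(X = k)$. A quick Monte Carlo sketch, taking for illustration the coordinates of $X$ to be i.i.d. on $\{0, 1, 2\}$, is consistent with this:)

```python
import random
import math

random.seed(0)

# Illustrative discrete law: X_i i.i.d. on {0, 1, 2} with probs 0.5, 0.3, 0.2.
values, probs = [0, 1, 2], [0.5, 0.3, 0.2]
n = 2          # dimension of the vectors
N = 200_000    # Monte Carlo sample size

counts = {}
for _ in range(N):
    x = random.choices(values, probs, k=n)        # discrete vector X
    u = [random.random() for _ in range(n)]       # U ~ Uniform(0,1)^n, independent of X
    s = [xi + ui for xi, ui in zip(x, u)]         # the sum X + U
    k = tuple(math.floor(si) for si in s)         # unit cube containing the sum
    counts[k] = counts.get(k, 0) + 1

# The candidate density is constant on each unit cube, so the empirical
# frequency of cube k should match P(X = k) = product of coordinate probs.
k = (0, 1)
empirical = counts[k] / N
theoretical = probs[0] * probs[1]                 # P(X_1 = 0) * P(X_2 = 1) = 0.15
print(abs(empirical - theoretical) < 0.01)        # True
```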

Best Answer

If $X$ and $Y$ are independent random vectors and $Y$ has a density then so does $X+Y$. This is an easy consequence of Fubini's Theorem.
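Concretely (a sketch of the Fubini step): for any Borel set $A$,
$$
\mathbb{P}(X + Y \in A)
= \int \int \mathbf{1}_A(x + y)\, f_Y(y)\, dy\, d\mu_X(x)
= \int_A \left( \int f_Y(z - x)\, d\mu_X(x) \right) dz,
$$
substituting $z = x + y$ in the inner integral; hence $X + Y$ has density $f_{X+Y}(z) = \mathbb{E}\, f_Y(z - X)$.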

In measure-theoretic language: if $\nu \ll m$ then $\mu * \nu \ll m$ for any probability measure $\mu$, where $m$ is Lebesgue measure.
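Applied to the question's setting, where $U$ has density $f_U = \mathbf{1}_{(0,1)^n}$, the mixture formula $f_{X+U}(z) = \mathbb{E}\, f_U(z - X)$ recovers the asker's candidate and settles the normalization:
$$
f_{X+U}(z)
= \sum_{k \in \mathbb{Z}_+^{n}} \mathbb{P}(X = k)\, \mathbf{1}_{(0,1)^n}(z - k)
= \mathbb{P}\big(X_i = \lfloor z_i \rfloor,\ 1 \leq i \leq n\big)
\quad \text{for a.e. } z,
$$
and integrating cube by cube gives $\int_{\mathbb{R}_+^n} f_{X+U}\, dz = \sum_{k} \mathbb{P}(X = k) = 1$.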
