Prove that if at least one of two independent random variables has a continuous distribution, then their sum also has a continuous distribution

measure-theory, probability, probability-distributions, probability-theory, random-variables

So, let's say that I am given two independent random variables $\xi$ and $\eta$ and am told that $\xi$ has a continuous distribution (that is, $P(\xi = c) = 0\;\;\forall\;c\in \mathbb{R}$).

How can I prove that in that case their sum also has a continuous distribution? The problem I have here is this: I know that if two variables are independent, then $P_{\xi + \eta} = P_\xi * P_\eta$, where $*$ stands for the convolution of measures, but I am not really sure how to compute such a convolution. Can you provide some explanation or intuition on how to calculate such integrals? In my case I need to evaluate an integral like this:

$$ \int_{\mathbb{R}^2} \mathbb{1}_{\{c\}}(x + y) \; dP_\xi(x) \; dP_\eta(y)$$

And I have no rigorous way of showing that it is actually zero.

Best Answer

By the Fubini–Tonelli theorem, $$ \int_{\mathbb{R}^2} \mathbb{1}_{\{c\}}(x + y) \; dP_\xi(x) \; dP_\eta(y)=\int_{\mathbb R}\left(\int_{\mathbb R} \mathbb{1}_{\{c-y\}}(x)\; dP_\xi(x)\right) dP_\eta(y)=\int_{\mathbb R} P_{\xi}(\{c-y\})\; dP_\eta(y)= 0,$$ where we used that $\mathbb{1}_{\{c\}}(x+y)=\mathbb{1}_{\{c-y\}}(x)$, and the last equality holds because $P_{\xi}(\{c-y\})=0$ for every $y$, since the distribution of $\xi$ is continuous.
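As a quick numerical sanity check (not a proof, and with hypothetical choices of distributions): if $\xi$ is continuous, then among $n$ independent draws of $\xi + \eta$ no value should repeat almost surely, i.e. the sum has no atoms, whereas a sum of two discrete variables repeats constantly.

```python
import random

random.seed(0)
n = 100_000

# xi is continuous (uniform on [0, 1]); eta is discrete (a coin flip).
xi = [random.random() for _ in range(n)]
eta = [random.choice([0, 1]) for _ in range(n)]
sums = [x + e for x, e in zip(xi, eta)]

# A continuous distribution puts zero mass on every single point, so
# among n independent draws we expect (almost surely) no repeated values.
print(len(set(sums)) == n)   # no atoms in the sum

# Contrast: the sum of two discrete variables has atoms, so repeats abound.
d = [random.choice([0, 1]) + random.choice([0, 1]) for _ in range(n)]
print(sorted(set(d)))        # only the atoms 0, 1, 2 occur
```

Of course floating-point samples are only a finite approximation, but the contrast between the two cases mirrors exactly the computation above: the atom $P_{\xi+\eta}(\{c\})$ vanishes whenever one summand is continuous.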