[Math] Given the joint probability distribution of $X$ and $Y$ for $Y = R\,X+C$, find the probability distributions of $R$ and $C$

linear-regression, pr.probability, probability-distributions

Let $R$, $C$, and $X$ be independent random variables defined on $(0,\infty)$ and
$$Y=\underbrace{R\, X}_{Z}+C.$$
We are given the joint probability distribution of $X$ and $Y$, $P_{XY}(x,y)$, and are asked to find the probability distributions of $R$ and $C$.

This is somewhat like a regression problem, except that I want the full probability distributions of the slope and intercept, not just their means.

Here is what I have so far:
$$
\begin{align}
P_{XY}(x,y) &= P_X(x)P_Y(y|x)\\
&= P_X(x)\int_0^\infty P_C(c)P_Z(y-c|x)dc\\
&= P_X(x)\int_0^\infty P_C(c)\frac1xP_R\left(\frac{y-c}{x}\right)dc\\
&= \frac{P_X(x)}{x}\int_0^\infty P_C(c)P_R\left(\frac{y-c}{x}\right)dc
\end{align}$$
Therefore,
$$ \frac{x\, P_{XY}(x,y)}{P_X(x)} = \int_0^\infty P_C(c)\,P_R\left(\frac{y-c}{x}\right)dc.$$
The right-hand side is something like a convolution (but not quite), and its value is known for every pair of $x$ and $y$. How do I find $P_C$ and $P_R$? Any hints toward an analytical or numerical solution would be appreciated.
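For concreteness, here is a small numerical sanity check of this identity in the forward direction only. It takes $P_R$ and $P_C$ to both be Exponential(1), which is purely an assumed example, and compares the integral on the right to a Monte Carlo estimate of the left-hand side at a fixed $x$:

```python
# Sanity check of  x * P_XY(x, y) / P_X(x) = \int_0^\infty P_C(c) P_R((y - c)/x) dc
# for one assumed choice of P_R and P_C (both Exponential(1) here, purely as an example).
import numpy as np
from scipy import stats, integrate

rng = np.random.default_rng(0)
x = 2.0                                  # fix a value of X
y_grid = np.linspace(0.05, 12.0, 60)

P_R = stats.expon(scale=1.0).pdf         # assumed density of the slope R
P_C = stats.expon(scale=1.0).pdf         # assumed density of the intercept C

# Right-hand side: integrate over c for each y (the integrand vanishes for c > y).
rhs = np.array([
    integrate.quad(lambda c: P_C(c) * P_R((y - c) / x), 0.0, y)[0]
    for y in y_grid
])

# Left-hand side: x * P_{Y|X}(y|x), estimated by Monte Carlo with X held at x.
R = rng.exponential(1.0, size=500_000)
C = rng.exponential(1.0, size=500_000)
lhs = x * stats.gaussian_kde(R * x + C)(y_grid)

print(np.max(np.abs(lhs - rhs)))         # small, up to KDE and quadrature error
```

Of course this only evaluates the known left-hand side in two ways; the actual question is how to invert the map from $(P_R, P_C)$ to the right-hand side.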

I am reposting this from StackExchange: https://math.stackexchange.com/q/2541446/491395

Edit: As Bjørn's answer below shows, we need more assumptions for this to work. Here is what I'm trying to do: I have measured the joint probability distribution of $X$ and $Y$ and it looks like this

[figure: measured joint probability distribution of $X$ and $Y$]

Assuming that a linear model with a random slope and intercept describes this data, I want to find the distributions of the slope and intercept. I am not sure exactly what the necessary and sufficient conditions are for this to be possible.

Edit 2: Here is a very inefficient way to do this:

Consider the conditional expected value of $Y$ given $X$
$$
\newcommand\mean[1]{\left\langle{#1}\right\rangle}
\mean{Y|X=x} = \mean{RX+C|X=x} = \mean R x +\mean C.
$$
Using two values of $x$, we can find the mean values of $R$ and $C$. Now consider the second conditional moment:
$$
\mean{Y^2|X=x} = \mean{(RX+C)^2|X=x} = \mean{R^2} x^2 +\mean{C^2}+2\mean R\mean C x.
$$
Again using two values of $x$, the previously found $\mean C$ and $\mean R$, and the independence of $R$ and $C$ (which gives $\mean{RC}=\mean R\mean C$), we can find the second moments of $C$ and $R$. Continuing inductively, we can find all the moments of $C$ and $R$.
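To make the recursion concrete, here is a minimal sketch in Python. It assumes the conditional moments of $Y$ can be evaluated at two slices $X = x_1$ and $X = x_2$; here they come from simulating made-up Gamma distributions for $R$ and $C$, just to check that the linear systems recover the correct moments, whereas in practice they would come from binning the measured $P_{XY}$:

```python
# Sketch of the moment-matching recursion on synthetic data. R and C are given
# Gamma distributions here purely for illustration; they are unknown in the real problem.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
x1, x2 = 1.0, 3.0

def cond_moment(x, k):
    """Monte Carlo estimate of <Y^k | X = x> under the assumed R and C."""
    R = rng.gamma(shape=2.0, scale=1.5, size=n)   # assumed slope distribution
    C = rng.gamma(shape=3.0, scale=0.5, size=n)   # assumed intercept distribution
    return np.mean((R * x + C) ** k)

# First moments:  <Y|x> = <R> x + <C>  at two values of x.
A1 = np.array([[x1, 1.0], [x2, 1.0]])
mR, mC = np.linalg.solve(A1, [cond_moment(x1, 1), cond_moment(x2, 1)])

# Second moments: <Y^2|x> = <R^2> x^2 + <C^2> + 2 <R><C> x, with <R>, <C> now known.
b = np.array([cond_moment(x1, 2) - 2 * mR * mC * x1,
              cond_moment(x2, 2) - 2 * mR * mC * x2])
A2 = np.array([[x1**2, 1.0], [x2**2, 1.0]])
mR2, mC2 = np.linalg.solve(A2, b)

print(mR, mC)     # ~ 3.0, 1.5   (true Gamma means)
print(mR2, mC2)   # ~ 13.5, 3.0  (true second moments)
```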

Now, is there a cleaner, more efficient way to do this?

Best Answer

It's not possible.

Let $X$ be constant and equal to $1$.

Let $B_1,B_2,B_3$ be independent Bernoullis.

Let $R_1=B_1+B_2$, $C_1=B_3$.

Let $R_2=B_1$, $C_2=B_2+B_3$.

Then $R_1X+C_1=R_2X+C_2$ identically, so $(X,Y)$ has the same joint distribution in both cases, yet the distributions of $R$ and $C$ differ. Knowing the distribution of $Y$ (and of $X$) therefore does not determine the distributions of $R$ and $C$.
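A quick numerical illustration of the counterexample (taking the Bernoullis to be fair, $p = 1/2$, which is just one concrete choice): the two parameterisations produce exactly the same $Y$ sample by sample, yet the slope distributions differ.

```python
# With X = 1, both parameterisations give the same Y, but R1 ~ Binomial(2, 1/2)
# while R2 ~ Bernoulli(1/2); p = 1/2 is just a concrete choice.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
B1, B2, B3 = rng.integers(0, 2, size=(3, n))   # three independent fair Bernoullis
X = 1

Y_a = (B1 + B2) * X + B3        # R1 = B1 + B2,  C1 = B3
Y_b = B1 * X + (B2 + B3)        # R2 = B1,       C2 = B2 + B3

assert np.array_equal(Y_a, Y_b)                # identical realisations of Y
print(np.bincount(B1 + B2) / n)                # distribution of R1: ~[0.25, 0.5, 0.25]
print(np.bincount(B1) / n)                     # distribution of R2: ~[0.5, 0.5]
```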
