[Math] Conditional probability distribution with Gaussian noise

machine learning, normal distribution, statistics

If I have a relationship as follows:

$$Y = a X + G(0,\sigma^2),\text{ i.e. }Y = a X + \text{some Gaussian noise}.$$

The conditional probability distribution of $Y$ given $X = x$, i.e. $P(y \mid x)$, is a Gaussian with mean $ax$ and variance $\sigma^2$.

I understand this intuitively: the expected value of $Y$ should be $aX$, and $Y$ varies around it with the same variance as the noise. Is there a formal proof of this?

Thanks,
Aly

Best Answer

Instead of a 'formal' proof, I suggest you make sure you understand these simple facts:

  1. If $Z = k + Y$, where $k$ is a constant, then the probability density of $Z$ is the same as that of $Y$ except for a shift: $f_Z(z) = f_Y(z-k)$. As a corollary, $E(Z) = k + E(Y)$ and $Var(Z) = Var(Y)$.

  2. If $Z = X + Y$ and we condition on $X$ (meaning we are given the value of $X$), then $X$ can be regarded as a constant, and the above applies. More formally (but, I insist, this should not be necessary to understand the previous point), $f_{Z \mid X}(z \mid x) = f_Y(z-x)$.
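Putting the two facts together gives the formal proof asked for. Write the noise as $N \sim G(0,\sigma^2)$. Fact 1 follows from the CDF: $F_Z(z) = P(k + Y \le z) = P(Y \le z - k) = F_Y(z-k)$, so differentiating gives $f_Z(z) = f_Y(z-k)$. Conditioning on $X = x$ turns $ax$ into such a constant $k$, so

$$f_{Y \mid X}(y \mid x) = f_N(y - ax) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{(y-ax)^2}{2\sigma^2}\right),$$

which is exactly a Gaussian density with mean $ax$ and variance $\sigma^2$. A quick empirical check in Python (a sketch; the values of `a`, `x`, and `sigma` are illustrative choices, not from the question):

```python
import numpy as np

# With Y = a*x + noise for a fixed x, and noise ~ N(0, sigma^2),
# samples of Y should match a Gaussian with mean a*x and std sigma.
rng = np.random.default_rng(0)
a, x, sigma = 2.0, 3.0, 0.5
n = 200_000

y = a * x + rng.normal(0.0, sigma, size=n)

print(y.mean())  # close to a*x = 6.0
print(y.std())   # close to sigma = 0.5
```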
