[Math] Does a conditional normal distribution imply an unconditional normal distribution

normal-distribution, probability

I have often seen it claimed that for scalar random variables $y$ and $x$, the conditional normal distribution
$$
y\mid x\sim N(0,x^2)
$$
also implies the unconditional normal distribution
$$
y\sim N(0,x^2).
$$
Is this true? The result seems quite surprising, but maybe I'm just missing something.

The explanations I have seen usually state that the conditional normal distribution implies
$$
x^{-1}(y\mid x)\sim N(0,1)
$$
(provided $x\neq 0$), in which the right-hand side does not depend on $x$. Because the right-hand side does not depend on $x$, we can drop the conditioning to obtain
$$
x^{-1}y\sim N(0,1)
$$
and then rearrange to get
$$
y\sim N(0,x^2).
$$
Is this argument valid? It seems a bit informal, although I haven't been able to find a counterexample.

The context is to conduct inference on a deterministic parameter $\mu$ when
$$
y\mid x\sim N(\mu, x^2)
$$
based on an observation of $(y,x)$. One could consider the interval
$$
(y- 1.96|x|, y+ 1.96|x|)
$$
which contains $\mu$ with 95% confidence conditional on $x$. Does this interval also contain $\mu$ with 95% confidence unconditional on $x$?
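The coverage question can be checked by simulation. Below is a minimal sketch; the chi-squared distribution for $x$ and the value $\mu = 2$ are arbitrary choices for illustration. Since the interval covers $\mu$ with probability 0.95 for every realized value of $x$, averaging over the distribution of $x$ (the law of total probability) preserves 95% coverage unconditionally.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0          # hypothetical true parameter (illustrative choice)
n = 100_000       # number of simulated (y, x) pairs

# Draw x from an arbitrary distribution (chi-squared here, purely for
# illustration), then draw y | x ~ N(mu, x^2).
x = rng.chisquare(df=3, size=n)
y = rng.normal(loc=mu, scale=np.abs(x))

# Fraction of intervals (y - 1.96|x|, y + 1.96|x|) that contain mu:
covered = (y - 1.96 * np.abs(x) < mu) & (mu < y + 1.96 * np.abs(x))
print(covered.mean())  # close to 0.95
```

Note that this unconditional coverage holds regardless of which distribution generates $x$, precisely because the conditional coverage is the same for every $x$.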

An example appears on page 35 of Hayashi, F. 2000. Econometrics:
[Figure: excerpt from Hayashi (2000), p. 35, omitted]

Best Answer

I don't know where you could have seen that claimed, but it doesn't make sense unless $x$ is a fixed constant. If $x$ is a random variable, it's not clear what $y\sim N(0,x^2)$ would mean: a marginal distribution cannot be parameterized by another random variable.

If $x$ is constant, then one can say that if $x^{-1}y\sim N(0,1)$ then $y\sim N(0,x^2)$. It is also true that if the conditional distribution of one random variable given another does not depend on the other, then the marginal (or "unconditional") distribution of the first is the same as the conditional distribution.

But going from $x^{-1}y\sim N(0,1)$ to $y\sim N(0,x^2)$ is wrong unless $x$ is equal to some constant with probability $1$.
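This can be seen numerically. With $x$ genuinely random, write $y = xz$ where $z\sim N(0,1)$ is independent of $x$; then $y\mid x\sim N(0,x^2)$ holds, and $x^{-1}y = z$ is exactly standard normal, but $y$ itself is a scale mixture of normals and is heavier-tailed than any normal. A minimal sketch (the half-normal choice for $x$ is an arbitrary assumption), using excess kurtosis as a simple departure-from-normality check:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def excess_kurtosis(v):
    """Sample excess kurtosis; approximately 0 for a normal sample."""
    c = v - v.mean()
    return np.mean(c**4) / np.var(v)**2 - 3.0

# x random (half-normal, an arbitrary choice) and z standard normal,
# independent of x, so that y = x * z satisfies y | x ~ N(0, x^2).
x = np.abs(rng.normal(size=n))
z = rng.normal(size=n)
y = x * z

print(excess_kurtosis(y / x))  # about 0: x^{-1} y is standard normal
print(excess_kurtosis(y))      # well above 0: y is not marginally normal
```

For this particular choice of $x$ one can compute $E[y^4] = E[x^4]E[z^4] = 9$ while $E[y^2] = 1$, so the excess kurtosis of $y$ is about 6, far from the normal value of 0.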
